What you call 'composability' is nothing more than the 'software
tools' approach.

I recall reading an interview with Dennis Ritchie (which I can
no longer find) in which he talks about this approach versus
the monolithic program model, as exemplified by 'perl', which
does everything itself. Dennis remarked that the 'tools'
approach required careful thought (which programs to use, how
to connect them with pipes) and so was more difficult to use;
hence the popularity of 'perl'.

I'd be delighted if someone could point me to where this
interview might be found.

P.S.

A most beautiful example of this approach is from Doug McIlroy
in his critique of Donald Knuth's solution to a problem posed
by Jon Bentley in his 'Programming Pearls' column:

    Read a file of text, determine the n most frequently used
    words, and print out a sorted list of those words along with
    their frequencies.

Donald Knuth wrote a long, beautifully crafted literate
program in WEB, his literate-programming system.  Doug McIlroy
provided an alternative solution:

tr -cs A-Za-z '\n' |   # squeeze runs of non-letters into newlines: one word per line
tr A-Z a-z |           # fold upper case to lower case
sort |                 # bring identical words together
uniq -c |              # collapse each run of duplicates into a count and the word
sort -rn |             # sort in reverse numeric order: most frequent first
sed ${1}q              # quit after n lines (n is the script's first argument)

This is real genius.
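
For anyone who wants to try it, the ${1} shows that McIlroy
wrote this as a shell script taking n as its first argument.
Saved under a name of your choosing ('wf.sh' below is my
label, not McIlroy's), it reads text on standard input:

#!/bin/sh
# wf.sh: print the n most frequent words; usage: wf.sh n < textfile
tr -cs A-Za-z '\n' | tr A-Z a-z | sort | uniq -c | sort -rn | sed ${1}q

For example: sh wf.sh 10 < some-text-file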


On Sat, May 6, 2017 at 1:37 AM, Bakul Shah <bakul@bitblocks.com> wrote:
I think the key issue is not so much minimalism as
composability. BSD often prioritized convenience over
composability. Each command doing one thing well, and being
line oriented, makes commands more composable. You can always
package up convenient combinations in a script. Plan 9 has lc,
which prints like Unix ls -C, but it is an rc script. Striving
for composability can result in leaner systems.
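
To illustrate Bakul's point about packaging convenient
combinations in a script: an lc-style columnar listing can be
had in a couple of lines over standard tools. This is a sketch
of the idea only, not Plan 9's actual rc source for lc:

#!/bin/sh
# lc: columnar listing as a composition of ls and pr
# (a sketch, not the real Plan 9 rc script; pr lays out columns)
ls "$@" | pr -t -6 -w 80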