On Sun, Nov 18, 2018 at 10:11 PM Jon Steinhart <jon@fourwinds.com> wrote:
Sort of like Americans expecting others to speak to them in English when they travel instead of understanding that they're in a different environment and it makes more sense to learn the culture as it's unlikely that everybody is gonna change just for you.
Amen, brother Jon, can we get another Amen..


 
This is not a unique problem with man vs info.  I see it in the large number of different make utilities, package managers, and so on that really don't provide new functionality but do make it much harder to be a practitioner since one has a lot more stuff to learn for no real benefit.
Exactly!!
 

So were it me, I would have looked at the current culture in the UNIX environment and figured out how to gracefully extend it for new functionality.  To me, that's a mark of good engineering instead of being a bull in a china shop.
I referred to this previously as the principle of 'least astonishment.'

Again, the argument for doing what he (and his followers) did was 'GNU's Not Unix' -- but my reply is that they created UNIX when they were done.  They rode the research train, then BSD rode the same UNIX train to start, and now the UNIX look/work-alike, Linux, rides it still.  And because it was incremental on the past, more people got behind it.

As much as I'm live and let live, and to each her/his own -- if GNU had been a new system, then I might be a lot more willing to accept that argument.  But what was built was (and is) not.  GNU is just the current and expanded UNIX implementation.  And so having the man page be useless and expecting people to use info is just wrong, even if you are used to it (ok, so you found English speakers when you travelled).

And Ted, it's not that I don't use the UNIX documents (the full papers) -- hey, I do.  That is how I learned to use 'make' when it appeared (or C for that matter), from the documents in /usr/doc.

What started this whole thread was Doug's comment about how succinct and to the point man was.  It was a fine interface for >>UNIX<<.  Man (using roff) was what people expected.  It's not about better or worse -- it worked and worked well.

As I said, if man had been maintained as the primary >>manual<< style interface and /usr/doc/<PROG>/foo.ms as the scheme for the full papers (which >>IS<< what BSD did), then you don't fail the rule of least astonishment.  Then create a *roff -Tinfo | info_create back end that produces the info files; those that want info get it and love it.  Those that >>expect<< man to work because it's UNIX get what they expect.  No one is 'astonished.'
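To make that concrete, here is a rough sketch of what such a pipeline might have looked like.  This is purely hypothetical: the -Tinfo device and the info_create post-processor are the made-up names from the paragraph above, not tools that ever shipped.

    # man keeps working exactly as it always has
    man 1 ls

    # hypothetical extra back end: format the same troff source for an
    # 'info' device and post-process the result into an info file
    troff -man -Tinfo /usr/man/man1/ls.1 | info_create -o ls.info

The point being that the man-page source stays the single source of truth, and info becomes one more output format rather than a replacement interface.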

A good example of that in a different field is the way in which FM stereo was finessed in such a
way as to not break existing mono receivers.  Would have been easy to just toss it
and make everybody buy new gear, but I prefer the more elegant solution.
Yep.  Metcalfe's Law -- adding to and improving on the past makes more people happy.  Yes, it is sometimes 'harder' for the developer, and some compromises do result.  But the result is a bigger pie and a happier group in total.  Think of the systems contemporary to Linux (including Plan 9 for that matter) -- which were 'better' and which are we still using?  It's not that there were not good ideas.

The 'better than' argument fails when the difference ('betterness') is shallow and not something that really is remarkable ('I like it and use it' is not good enough).  Respecting the past and ensuring the 'old ways' still work is good business.  And that is the problem.  When you are the creator of the alternate scheme, it's hard to be objective about how much better it really is.  Being different and better >>sometimes<< can pay off (check out Bret Victor's 'The Future of Programming': https://www.youtube.com/watch?v=8pTEmbeENF4 -- a talk from 2013, but set and presented as if he were speaking in the 1970s); but if you look, each of the things he talks about is remarkably different -- which I'm not sure man vs. info (or ed/vi vs. teco/emacs, for that matter) really is or was.

Clem