[TUHS] Zombified SCO comes back from the dead, brings trial back to life against IBM

Kevin Bowling kevin.bowling at kev009.com
Tue Apr 6 06:44:26 AEST 2021


On Sun, Apr 4, 2021 at 4:34 PM Clem Cole <clemc at ccc.com> wrote:
>
>
>
> On Sun, Apr 4, 2021 at 7:01 PM Bakul Shah <bakul at iitbombay.org> wrote:
>>
>>
>>
>> On Apr 4, 2021, at 3:25 PM, David Arnold <davida at pobox.com> wrote:
>>
>>  For us UNIX historians, we need to be careful and learn from our own history here -- the Cell Phone/Mobile target is the engine for the next Christensen-style disruption.  It is by far the #1 target for people writing new programs (which I find a little sad personally -- but I understand and accept -- time has marched on).  In the end, a small mobile target will be the tech on top, and availability will be driven by market behavior, and those suppliers will be "who has the gold."
>>
>>
>> I feel I should point out that both of the dominant mobile operating systems are Unix-based.  The UI is necessarily new, but astonishingly the 50-year-old basic abstractions are the same.
>>
>>
>> Except Unix is kind of hard to see. It wasn't just the hierarchical file system but the idea of composability. Even now we whip up shell "one-liners" to perform some task we just thought of. All that is lost, and not just on mobile devices. For example, to search through email messages for something, you use an email "app". And there is no UI composability. We have to use extremely heavyweight IDEs such as Xcode, weighing in at 15GB (even "du -s /Application/X-code" takes tens of seconds!), to painstakingly construct a UI. We can't just whip up a dashboard to measure & display some realtime changing process/entity. There may be equally heavyweight third-party tools, but there has been no Bell Labs-like research crew to distill it down to the essence of composable UI and ship it with every copy -- the idea that users too can learn to "program" if given the right tools.
>
>
> Exactly my point.  The only difference, I suspect, is that I just don't bother with the IDE (Xcode or VS).  Frankly, vi/emacs, or, as we discussed a few days ago, ed is still far preferable when I'm programming.
>
> I mentioned in another email Intel's new development suite, oneAPI.  Speaking absolutely for myself here, I am a bit at odds with management with respect to much of it, as I feel the direction is a bit misguided.  But I do understand why Intel is doing it/trying.  Everyone in the industry seems to be saying "use my framework, my language, my solution, and I will solve your problem."  "You will sell more copies of your program if you use my portal, etc."  Intel, to compete, needs to do the same things.  To me, it seems a bit like fairy dust -- a promise that will work for some set of people, and of course, some firms like my own employer will keep making money (or, in the words of Dr. Seuss's Lorax: "biggering and biggering").  As I said in the previous message, it is driven by the other golden rule.
>
> What I always felt made UNIX powerful was that it did not seem like the BTL folks were trying to sell anything.  They were trying to solve real problems they and the folks at AT&T had when it came to realistically building and deploying systems.  Yes, they were shielded from the profit motive at the time because of the unique rules of the 1956 consent decree, and we all were winners because of it, because they said -- sure, here, you can use it too.
>
> Now that we are back to a winner-take-all market (OSVM/360 vs. VMS vs. winders ...), I think we have traded away designing for the sake of getting the job done properly for designing to sell as many as possible (i.e. be sexy and capture a market, not be simple and do the job well).

You guys are onto an interesting thread.  If you have time, read
https://danluu.com/essential-complexity/ .  One thing I find
interesting about this article is that Dan, who externally seems at
least an order of magnitude smarter and more productive than me, seems
to miss that the approach he uses to debunk Brooks is largely a
failure of design that probably predates his entry to the problem:
layering ever more accidental complexity instead of solving the
essential complexity.
For instance, a circular buffer of arbitrary size (sized to the
hardware and monetary constraints of his example) could be used to
efficiently hold resource usage and drive important business or
engineering decisions.  A cascading set of buffers could hold
higher-fidelity data at the top level and decreasing fidelity for
longer time-series intervals.  This doesn't match his approach, which
allows finding signal in extremely noisy data, but that noise is a
problem of his system's own creation.

The non-UNIX approach to things (IDEs, frameworks, big overlay APIs,
microservices, etc.) definitely helps with certain things people are
willing to pay a lot of money for.  However, it lends itself to
creating and fixing ever more accidental complexity with ever more
accidental complexity.  I guess the thing we really like about UNIX,
historically, was that it did a good job of exposing the programmer to
the essential complexity.  If you need to make things go fast, it's
right there in your face.  If you need to do resource accounting or
just about any other task, there's a way to do it on fundamentally
limited hardware.  Big hardware and lots of people haven't made the
essential complexity easier.  UNIX, which was developed with
discipline, still exerts a positive effect on the essential complexity
in systems development when embraced.  When not embraced, it becomes a
liability (witness the 15GB Xcode).

Regards,
Kevin
