> Or consider this. Unix grew by about 39 system calls in its first
> decade, but an average of 40 per decade ever since. Is this accelerated
> growth more symptomatic of maturity or of cancer?
Looks like I need a typing tutor. 39 should be 30. And a math tutor, too. 40
should be 100.
Doug
$ k-2.9t
K 2.9t 2001-02-14 Copyright (C) 1993-2001 Kx Systems
Evaluation. Not for commercial use.
\ for help. \\ to exit.
This is a *linux* x86 binary from almost exactly 20 years ago running on FreeBSD built from last Wednesday’s sources.
$ uname -rom
FreeBSD 13.0-ALPHA3 amd64
Generally, compatibility support for previous FreeBSD versions has been decent when I have tried it, though the future for x86 support doesn’t look bright.
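For anyone who wants to reproduce this, a rough sketch of what it takes on
a stock FreeBSD install (the module, rc.conf knob, and brandelf step below
are the standard ones; a given old binary may not need the branding):

$ kldload linux                 # 32-bit Linux ABI support, right now
$ sysrc linux_enable="YES"      # and at every boot
$ brandelf -t Linux ./k-2.9t    # mark an unbranded ELF as a Linux binary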
> On Feb 8, 2021, at 10:56 PM, John Gilmore <gnu(a)toad.com> wrote:
>
> (I'm not up on what the BSD releases are doing.)
This topic is evocative, even though I really have nothing to say about it.
Mike Lesk started, and I believe Brian contributed to, "learn", a program
for interactive tutorials about Unix. It was never pushed very far--almost
certainly not into typing.
But the mention of typing brings to mind the inimitable Fred Grampp--he
who pioneered massive white-hat computer cracking. Fred's exploits justified
the opening sentence I wrote for Bell Labs' first computer-security task
force report, "It is easy and not very risky to pilfer data from Bell
Laboratories computers." Among Fred's many distinctive and endearing
quirks was the fact that he was a confirmed two-finger typist--proof that
typing technique is an insignificant factor in programmer productivity.
I thought this would be an excuse to tell another ftg story, but I
don't want to repeat myself, and a search for "Grampp" in the tuhs archives
misses many that have already been told. Have the entries been lost or
is the index defective?
Doug
I would like to revive Lorinda Cherry's "parts".
Implicit in "revival" is dispelling the hundreds
of warnings from gcc -Wpedantic -Wall -Wextra.
Has anybody done this already?
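For anyone unfamiliar with the job: on code of that vintage the bulk of the
gcc noise is K&R-style definitions and implicit int. A hypothetical
before-and-after, not taken from "parts" itself:

    /*
     * K&R original shape, which provokes "return type defaults to 'int'"
     * and friends:
     *
     *     count(s)
     *     char *s;
     *     { ... }
     */
    static int count(const char *s)  /* ANSI form; clean under -Wpedantic -Wall -Wextra */
    {
        int n;

        for (n = 0; *s != '\0'; s++)
            n++;
        return n;
    }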
Doug
> Does anyone know why the computer industry wound up standardising on
> 8-bit bytes?
I give the credit to the IBM Stretch, aka 7030, and the Harvest attachment
they made for NSA. For autocorrelation on bit streams--a fundamental need
in codebreaking--the hardware was bit-addressable. But that was overkill
for other supercomputing needs, so there was coarse-grained addressability
too. Address conversion among various operand sizes made power of two a
natural, lest address conversion entail division. The Stretch project also
coined the felicitous word "byte" for the operand size suitable for
character sets of the era.
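To make the division point concrete, a sketch in C (anachronistic, of
course; Stretch did this in hardware): converting a bit address to a
coarser operand's address is a shift when the operand size is a power of
two, and a genuine division otherwise.

    /* 8-bit bytes: 8 = 2^3, so the conversion is a 3-bit shift */
    unsigned long byteaddr(unsigned long bitaddr)  { return bitaddr >> 3; }

    /* a 7-bit byte would force a divide on every conversion */
    unsigned long byteaddr7(unsigned long bitaddr) { return bitaddr / 7; }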
With the 360 series, IBM fully committed to multiple operand sizes. DEC
followed suit and C naturalized the idea into programmers' working
vocabulary.
The power-of-2 word length had the side effect of making the smallest
reasonable size for floating-point be 32 bits. Someone on the
Apollo project once noted that the 36-bit word on previous IBM
equipment was just adequate for planning moon orbits; they'd
have had to use double-precision if the 700-series machines had
been 32-bit. And double-precision took 10 times as long. That
observation turned out to be prescient: double has become the
norm.
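A back-of-envelope check of that adequacy (assuming the 704-family layout
of a 27-bit fraction in the 36-bit word, versus the roughly 24 bits a
32-bit format leaves for the fraction):

    #include <stdio.h>

    int main(void)
    {
        double moon = 3.844e8;  /* mean Earth-moon distance, metres */

        /* position resolution of one part in 2^27 vs one part in 2^24 */
        printf("36-bit word, 27-bit fraction: %.1f m\n", moon / (1L << 27));
        printf("32-bit word, 24-bit fraction: %.1f m\n", moon / (1L << 24));
        return 0;
    }

That is roughly 3 m versus 23 m per step, before error accumulates over a
long integration.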
Doug
The topic of GBACA (Get Back At Corporate America), the video game for
the BLIT/5620, has come up on a Facebook group.
Does anyone happen to have any details about it, source code, author,
screen shots, ...?
Thanks,
Mary Ann
I will ask Warren's indulgence here, as this probably should be continued
in COFF, which I have CC'ed; but since it was asked in TUHS, I will answer.
On Wed, Feb 3, 2021 at 6:28 AM Peter Jeremy via TUHS <tuhs(a)minnie.tuhs.org>
wrote:
> I'm not sure that 16 (or any other 2^n) bits is that obvious up front.
> Does anyone know why the computer industry wound up standardising on
> 8-bit bytes?
>
Well, 'standardizing' is a little strong. Check out my Quora answers: How
many bits are there in a byte
<https://www.quora.com/How-many-bits-are-there-in-a-byte/answer/Clem-Cole>
and What is a bit? Why are 8 bits considered as 1 byte? Why not 7 bit or 9
bit?
<https://www.quora.com/What-is-a-bit-Why-are-8-bits-considered-as-1-byte-Why…>
for the details, but the 8-bit part of the tale is here (cribbed from those
posts):
The industry followed IBM with the S/360. The story of why a byte is 8 bits
for the S/360 is one of my favorites, since the number of bits in a byte is
defined for each computer architecture. Simply put, Fred Brooks (who led
the IBM System 360 project) overruled the chief hardware designer, Gene
Amdahl, and told him to make things powers of two to make it easier on the
SW writers. Amdahl famously thought it was a waste of hardware, but Brooks
had the final authority.
My friend Russ Robeleon, who was the lead HW guy on the 360/50 and later
the ASP (*a.k.a.* project X), and who was in the room as it were, tells the
yarn this way: You need to remember that the 360 was designed to be IBM's
first *ASCII machine* (not EBCDIC, as it ended up - a different story)[1].
Amdahl was planning for a 24-bit word and a 7-bit byte for cost reasons.
Fred kept throwing him out of his office and told him not to come back
“until a byte and word are powers of two, as we just don’t know how to
program it otherwise.”
Brooks would eventually relent: the original pointer on the System 360
became 24 bits, as long as it was stored in a 32-bit “word”.[2] As a
result (and to answer your original question), the byte first widely became
8 bits with IBM's System 360.
It should be noted that it still took some time before the 8-bit byte
appeared more widely and in almost all systems as we see it today. Many
systems, like the DEC PDP-6/10, used five 7-bit bytes packed into a
36-bit word (with a single bit left over) for a long time. I believe that
the real widespread use of the 8-bit byte did not occur until the rise of
the minis such as the PDP-11 and the DG Nova in the late 1960s/early
1970s, and eventually the mid-1970s microprocessors such as the
8080/Z80/6502.
Clem
[1] While IBM did lead the effort to create ASCII, and the System 360
actually supported ASCII in hardware, because the software was so late IBM
marketing decided not to switch from BCD and instead used EBCDIC (their own
code). Most IBM software was released using that code for the System
360/370 over the years. It was not until IBM released their Series/1
<https://en.wikipedia.org/wiki/IBM_Series/1> minicomputer in the late 1970s
that IBM finally supported an ASCII-based system as the natural code for
the software, although it had a lot of support for EBCDIC as they were
selling them to interface to their ‘Mainframe’ products.
[2] Gordon Bell would later observe that those two choices (32-bit word and
8-bit byte) were what made the IBM System 360 architecture last in the
market, as neither would have been ‘fixable’ later.
> From: Greg A. Woods
> There's a "v6net" directory in this repository.
> ...
> I wonder if it is from either of the two ports you mention.
No; the NOSC system is an NCP system, not TCP; and this one has mbufs (which
the BBN v6 one did not have), so it's _probably_ a Berkeleyism of some sort.
(Or did the BBN VAX code have mbufs too? I didn't recall - but yes, it did:
https://minnie.tuhs.org//cgi-bin/utree.pl?file=BBN-Vax-TCP
see bbnnet/mbuf.c.) It might also be totally new code which just chose to
re-use that meme. I don't have time to look closely to see if I see any
obvious descent.
> Too many broken half-baked MUAs seem to still be widely used.
I'm one of the offenders! Hey, this is a vintage computing list, so what's
the problem with vintage mail readers? :-)
Noel
PS: I'm just about done collecting up the MIT PWB1 TCP system; I only have
the Server FTP left to go. (Alas, it was a joint project between a student
and a staffer, who left just at the end, so half the source is in one's
personal area, and the other half is in the other's. So I have to find all
the pieces, and put them in the system's source area.) Once that's done,
I'll get it to WKT to add to the repository. (Getting it to _actually run_
will take a while, and will happen later: I have to write a device driver
for it, since the code uses a rare, long-extinct board.)