Greetings,
I was reviewing a manual page update and came across an ambiguous answer to
a question that came up, so I thought I'd ask here.
execl and execv first appear in our extant Unix man pages in V4. The V2 and
V3 man pages don't list them at all. Case closed, right? They appeared
in V4.
However, the Dennis_v1/unix72 tree has execl.s and execv.s in it. Diving
back into the history on Warren's github account,
https://github.com/DoctorWkt/unix-jun72/tree/master/src says they come from:
"The files in lib/ come from the libc.sa file which is on the
last1120c.tar.gz
tap(I) tape image, also at the same URL, and form the C library for the
above compiler."
and
"from a working C compiler for 2nd Edition UNIX."
which suggests they may have been in V2 or maybe even V1. Their
first use in Unix appears to be in V5, but the extant pre-V5 code is so
fragmentary it's hard to know for sure. They're not mentioned in section
II or section III of V1, V2, or V3.
Does anybody know for sure, or can provide more insight into the
last1120c.tar.gz file to help disambiguate?
Warner
When I left BTL in 1983, I made a tar tape. A number of years later I
translated the tape into a file. Only recently have I wandered through it.
I don't know how many people remember Ron Hardin in the Columbus BTL
location. He was one of the smartest guys I ever met. There are a lot of
Ron Hardin stories. One of his creations (as far as I know he authored it) was
a program to create Memorandums For File -- technical memorandums. My tar
tape scooped up festoon. To this day it compiles and runs happily on
Windows 10. It was written in 1978 or thereabouts. Here is an example
output:
bin$ festoon.exe
.TL
No Worthynesses
.AU "C. C. Festoon" CCF Headquarters 1584734291
.AS
A restriction had been being amicated by a convenience at the inclusion.
.AE
.MT "MEMORANDUM FOR COAT LOCKER"
.hy 1
On this occasion,
no team responsibilities could have polyesced a renewed emphasis.
A friction had penated an activation.
At the present moment in time,
an undue number of good progresses being collected together with the
populations were being proportionately fideated by
the fact that there was a data stream which was transenniesced by an
issuance being joined together with these team responsibilities,
because natural basises have been veriating a partitioning.
The supplementary work should be conclusively quinquepolyated by a well
defined interfacing.
A sophisticatedness by a schedule is operated by a nature in conflict with
a correspondence under some serious discussions.
It is within the realm of possibility that the effectiveness had
vicfacesced a schedule,
but there was not a necessary background information which is being
testesced by a strong interest,
and a statistical accuracy was tempoesced by the preparation.
It should be noted that a joint partnership very repeatedly aidioated this
publication of a centralized organization.
Due to the fact that there is a simplification which simply enniesced a
process,
a new technology is fluxesced from monorogatities.
It is of the utmost importance that an insurance could be putated by an
assumption.
A major advance centered about a deficiency octocessates an important
outcome.
.P
An effectation would extramicroate to the situation.
A complete revision gravated a direction.
Inasmuch as there was not a potential usefulness that cedeates by the
timely delivery,
a consideration centered around a technique was monofortated by an
integration:
.BL
.LI
There is a not unclear meterdom which had risiesced an occasion.
.LE
.P
A clamstress of this enclosedness is cludescing the hemidormity.
.P
To arrive at an approximation,
a large quantity had been chromated by a strong feeling.
Moreover,
that idea sharing was lusated by a current proposal.
Anytime that the final outcomes had been very firmly unpathesced by not
unphilaible reasonable compromises,
no serious concerns might be being sacrated by internal establishments for
the basic objectives in back of a full utilization.
.P
As a consequence of the fact that a total effect might vacate an easily
situational beneficial assistance,
the apparent provisioning being effectuated by a continuing difference can
have protenesced a realization of an underlying purpose.
A different doubtful important outcome is cludated by a capkin.
A rationale had fortated attachments.
Moreover,
this assumption had nilcoresced the continuing study.
.P
.H 1 "An Easily Added Basic Assumption Being Joined Together With A Concept
Stage"
There is not an impediment which neoated a restriction,
therefore.
A couple utilizations could morsate a great similarity at considerable
difficulties,
but an input is primescing the concept activities,
and a growing importance was hemicisesced by that beneficial assistance.
In the same connection,
these extremenesses are rather usefully ultralucesced by directions.
.SG
.NS 0
C. R. Glitch
S. A. Hobble
R. S. Limn
M. Shayegan
.NE
Ed Bradford, Ph.D. Physics, retired from IBM
BTL 1976-1983
--
Advice is judged by results, not by intentions.
Cicero
> From: Dagobert Michelsen
> the excellent book "Gödel, Escher, Bach: An Eternal Golden Braid"
> from Douglas R. Hofstadter which also gives a nice introduction into
> logic and philosophy.
IIRC, the focus of the book is how systems made out of simple components can
exhibit complex behaviours; in particular, how information-processing systems
can come to develop self-awareness.
> From: Chet Ramey
> One of the best books I read in high school.
A book on a very similar topic to GEB, which was _extremely_ important in
developing my understanding of how the universe came to be, is "Recursive
Universe", by William Poundstone, which I recommend very highly to everyone
here. It's still in print, which is really good, because it's not as well
known as it should (IMO) be. It uses an analogy with Conway's Life to explain
how the large-scale structure of the universe can develop from a random
initial state. Buy it now!
Noel
Sorry to flog this topic, but the two examples below are an
unfair comparison. What happened to the multiplications in the
second? And two of the [enter]s in the first are unnecessary.
Ironically three of the four operations in the second are
actually reverse Polish! If you had to utter sqrt first,
as we do on paper and in speech, things wouldn't look so great.
[a] [enter]
[a] [enter]
[multiply]
[b] [enter]
[b] [enter]
[multiply]
[add]
[square root]
[a]
[square]
[plus]
[b]
[square]
[square root]
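To make the comparison concrete, here is a small stack evaluator in Python
(my sketch, not anything from the thread). It treats each number entry as a
push, which is why the strict-RPN form of sqrt(a*a + b*b) needs no [enter]
tokens at all; on a real HP-style calculator, [enter] is only needed to
separate two consecutive number entries.

```python
import math

def rpn(tokens):
    """Evaluate a list of RPN tokens; numbers push, operators pop."""
    stack = []
    binary = {
        "multiply": lambda x, y: x * y,
        "add":      lambda x, y: x + y,
    }
    for tok in tokens:
        if tok in binary:
            y, x = stack.pop(), stack.pop()
            stack.append(binary[tok](x, y))
        elif tok == "square root":
            stack.append(math.sqrt(stack.pop()))
        else:
            stack.append(float(tok))   # number entry = push
    return stack[0]

# sqrt(3*3 + 4*4), the first keystroke sequence above with a=3, b=4
print(rpn(["3", "3", "multiply", "4", "4", "multiply",
           "add", "square root"]))    # prints 5.0
```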
Doug
Once in a while a new program really surprises me. Reminiscing a while
ago, I came up with a list of eye-opening Unix gems. Only a couple of
these programs are indispensable or much used. What singles them out is
their originality. I cannot imagine myself inventing any of them.
What programs have struck you similarly?
PDP-7 Unix
The simplicity and power of the system caused me to turn away from big
iron to a tiny machine. It offered the essence of the hierarchical
file system, separate shell, and user-level process control that Multics
had yet to deliver after hundreds of man-years' effort. Unix's lacks
(e.g. record structure in the file system) were as enlightening and
liberating as its novelties (e.g. shell redirection operators).
dc
The math library for Bob Morris's variable-precision desk calculator
used backward error analysis to determine the precision necessary at
each step to attain the user-specified precision of the result. In
my software-components talk at the 1968 NATO conference on software
engineering, I posited measurement-standard routines, which could deliver
results of any desired precision, but did not know how to design one. dc
still has the only such routines I know of.
typo
Typo ordered the words of a text by their similarity to the rest of the
text. Typographic errors like "hte" tended to the front (dissimilar) end
of the list. Bob Morris proudly said it would work as well on Urdu as it
did on English. Although typo didn't help with phonetic misspellings,
it was a godsend for amateur typists, and got plenty of use until the
advent of a much less interesting, but more precise, dictionary-based
spelling checker.
Typo was as surprising inside as it was outside. Its similarity
measure was based on trigram frequencies, which it counted in a 26x26x26
array. The small memory, which had barely room enough for 1-byte counters,
spurred a scheme for squeezing large numbers into small counters. To
avoid overflow, counters were updated probabilistically to maintain an
estimate of the logarithm of the count.
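The log-count trick Doug describes survives today under the name "approximate
counting" or the "Morris counter". A sketch of the idea in Python (my
illustration, not the original typo code): store only an exponent c, bump it
with probability 2^-c, and recover an estimate of the true count as 2^c - 1,
so 100,000 events still fit in a one-byte counter.

```python
import random

def morris_increment(c, rng=random):
    """Probabilistic increment: bump the exponent c with probability 2**-c."""
    if rng.random() < 2.0 ** -c:
        c += 1
    return c

def morris_estimate(c):
    """Estimate of the true count from the stored exponent (E[2^c - 1] = n)."""
    return 2 ** c - 1

random.seed(1)
c = 0
for _ in range(100_000):
    c = morris_increment(c)
# c is a tiny number (roughly log2 of 100,000) that fits in one byte,
# and morris_estimate(c) is right to within a factor of a few
print(c, morris_estimate(c))
```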
eqn
With the advent of phototypesetting, it became possible, but hideously
tedious, to output classical math notation. Lorinda Cherry set out to
devise a higher-level description language and was soon joined by Brian
Kernighan. Their brilliant stroke was to adapt oral tradition into written
expression, so eqn was remarkably easy to learn. The first of its kind,
eqn has barely been improved upon since.
struct
Brenda Baker undertook her Fortran-to-Ratfor converter against the advice
of her department head--me. I thought it would likely produce an ad hoc
reordering of the original, freed of statement numbers, but otherwise no
more readable than a properly indented Fortran program. Brenda proved
me wrong. She discovered that every Fortran program has a canonically
structured form. Programmers preferred the canonicalized form to what
they had originally written.
pascal
The syntax diagnostics from the compiler made by Sue Graham's group at
Berkeley were the most helpful I have ever seen--and they were generated
automatically. At a syntax error the compiler would suggest a token that
could be inserted that would allow parsing to proceed further. No attempt
was made to explain what was wrong. The compiler taught me Pascal in
an evening, with no manual at hand.
parts
Hidden inside WWB (writer's workbench), Lorinda Cherry's Parts annotated
English text with parts of speech, based on only a smidgen of English
vocabulary, orthography, and grammar. From Parts markup, WWB inferred
stylometrics such as the prevalence of adjectives, subordinate clauses,
and compound sentences. The Today show picked up on WWB and interviewed
Lorinda about it in the first TV exposure of anything Unix.
egrep
Al Aho expected his deterministic regular-expression recognizer would beat
Ken's classic nondeterministic recognizer. Unfortunately, for single-shot
use on complex regular expressions, Ken's could finish while egrep was
still busy building a deterministic automaton. To finally gain the prize,
Al sidestepped the curse of the automaton's exponentially big state table
by inventing a way to build on the fly only the table entries that are
actually visited during recognition.
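The on-the-fly construction is easy to sketch. A toy version in Python (an
illustration of the idea only, nothing like egrep's actual code, with an
invented NFA for (a|b)*abb): run the subset construction, but compute and
cache a transition only the first time a given (state-set, character) pair
is reached, so the exponential table is never built in full.

```python
def make_matcher(nfa, start, accepting):
    """nfa maps (state, char) -> set of next states. Returns a matcher
    that builds DFA transitions lazily, caching only the entries an
    input actually visits."""
    cache = {}

    def step(states, ch):
        key = (states, ch)
        if key not in cache:          # first visit: do subset construction
            nxt = set()
            for s in states:
                nxt |= nfa.get((s, ch), set())
            cache[key] = frozenset(nxt)
        return cache[key]

    def match(text):
        states = frozenset([start])
        for ch in text:
            states = step(states, ch)
        return bool(states & accepting)

    return match, cache

# Invented NFA for (a|b)*abb: accepts strings over {a,b} ending in "abb"
nfa = {
    (0, "a"): {0, 1}, (0, "b"): {0},
    (1, "b"): {2},
    (2, "b"): {3},
}
match, cache = make_matcher(nfa, 0, {3})
print(match("ababb"), match("abab"))   # prints: True False
```

Only the handful of state-set transitions exercised by the two test strings
ever lands in the cache.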
crabs
Luca Cardelli's charming meta-program for the Blit window system released
crabs that wandered around in empty screen space nibbling away at the
ever more ragged edges of active windows.
Some common threads
Theory, though invisible on the surface, played a crucial role in the
majority of these programs: typo, dc, struct, pascal, egrep. In fact
much of their surprise lay in the novelty of the application of theory.
Originators of nearly half the list--pascal, struct, parts, eqn--were
women, well beyond women's demographic share of computer science.
Doug McIlroy
March, 2020
Tomasz Rola writes on Thu, 19 Mar 2020 21:01:20 +0100 about awk:
>> One task I would be afraid to use awk for, is html processing. Most of
>> html sources I look at nowadays seems discouraging. Extracting
>> anything of value from the mess requires something more potent, I
>> think.
If you want to tackle raw HTML from arbitrary sources, then I agree with
you: most HTML on the Web is not grammar conformant, there are
numerous vendor extensions, and the HTML is hideously idiosyncratic
and irregularly formatted.
The solution that I adopted 25 years ago was to write a
grammar-recognizing, but violation-lenient, prettyprinter for HTML. It has
served well and I use it many times daily for my work in the BibNet
Project and TeX User Group bibliography archives, now approaching 1.55
million entries. The latest public release is available here:
http://www.math.utah.edu/pub/sgml/
I notice that the last version there is 1.01; I'll get that updated in
a couple of days to the latest 1.03 [subject to delays due to major
work dislocations due to the virus]. The code should install anywhere
in the Unix family without problems: I build and validate it on more
than 300 O/Ses in our test farm.
With standardized HTML, applying awk is easy, and I have more than 450
awk programs, and 380,000 lines of code, that process publisher
metadata to produce rough BibTeX entries that numerous other tools,
and some manual editing, turn into clean data for free access on the
Web.
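The flavor of such a converter can be sketched in a few lines of Python (my
illustration with made-up sample data and tag names, nothing from the actual
tools described above): once a prettyprinter has normalized the HTML to one
tag per line, field extraction reduces to line-oriented pattern matching,
which is exactly the kind of job awk was built for.

```python
import re

# Hypothetical normalized publisher metadata, one tag per line,
# as a prettyprinter might emit it (invented sample data)
page = """\
<meta name="citation_title" content="A Stream Editor">
<meta name="citation_author" content="L. E. McMahon">
<meta name="citation_year" content="1978">
"""

def to_bibtex(text):
    """Collect citation_* meta fields and emit a rough BibTeX entry."""
    fields = dict(re.findall(
        r'<meta name="citation_(\w+)" content="([^"]*)">', text))
    key = fields.get("author", "anon").split()[-1] + fields.get("year", "")
    lines = ["@Article{%s," % key]
    for name in ("author", "title", "year"):
        if name in fields:
            lines.append("  %-8s = {%s}," % (name, fields[name]))
    lines.append("}")
    return "\n".join(lines)

print(to_bibtex(page))
```

The real pipelines of course handle far more fields, cross-checking, and
brace protection of proper nouns than this toy does.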
For some journals, I run a single command of fewer than 15 characters
to download Web pages for journal issues for which I do not yet have
data, and then a single journal-specific command with no arguments
that runs a large shell script with a long pipeline that outputs
relatively clean BibTeX that then normally takes me only a couple of
minutes to visually validate in an editor session. The major work
there is bracing of proper nouns in titles that my software did not
already handle, thereby preventing downcasing of those words in the
many bibliography styles that do so.
I'm on journal announcement lists for many publishers, so I often have
new data released to the Web just 5 to 10 minutes after receiving
e-mail about new issues.
The above-mentioned archives are at
http://www.math.utah.edu/pub/bibnet
http://www.math.utah.edu/pub/tex/bib
http://www.math.utah.edu/pub/tex/bib/index-table.html
http://www.math.utah.edu/pub/tex/bib/idx
http://www.math.utah.edu/pub/tex/bib/toc
They are mirrored at Universität Karlsruhe, Oak Ridge National
Laboratory, Sandia National Laboratory, and elsewhere.
Like Al Aho, Doug McIlroy, and Arnold Robbins, I'm a huge fan of awk;
I believe that I was the first to port it to PDP-10 TOPS-20 and VAX
VMS in the mid-1980s, and it is one of the first mandatory tools that
I install on any new computer.
-------------------------------------------------------------------------------
- Nelson H. F. Beebe Tel: +1 801 581 5254 -
- University of Utah FAX: +1 801 581 4148 -
- Department of Mathematics, 110 LCB Internet e-mail: beebe(a)math.utah.edu -
- 155 S 1400 E RM 233 beebe(a)acm.org beebe(a)computer.org -
- Salt Lake City, UT 84112-0090, USA URL: http://www.math.utah.edu/~beebe/ -
-------------------------------------------------------------------------------
i never got on with rpn, even though i am the kind of person who should - i have a little bit of dyslexia, perhaps that's why.
when i moved to plan9 i found it included hoc from K&P, just used that for the odd bits of maths i need.
-Steve
There's been a lot of discussion about early Unix on Intel, National
Semi, Motorola, and Sparc processors. I don't recall if Unix ran on
the Z8000, and if not, why not.
As I remember the Z8000 was going to be the great white hope that
would continue Zilog's success with the Z80 into modern times.
But, it obviously didn't happen.
Why?
Jon
I sent it off list to Chet and Paul but it's too big for Warren's
reflector. I'll work with him to get it separately. If anyone else wants
it before, please let me know off list.
Clem
On Mon, Mar 16, 2020 at 10:37 AM Clem Cole <clemc(a)ccc.com> wrote:
> Here you go...
>
> Warren - Would please be able to add to the docs/paper directories...
> USENIX Summer 1985 June 11-14
>
Hi,
Would anyone know how the CSRG made available updates to the public
distribution of Net/2 before releasing 4.4 BSD Lite?
Net/2 was released on tape in July 1991 as represented in the TUHS
archive. The first mention I see of the sources appearing online is old
usenet posts announcing the bsd-sources mirror on uu.net starting in
January 1992.
However, the Net/2 derived versions of NetBSD (0.8-0.9) and FreeBSD
(1.0-1.1.5.1) imported userland tools revised after both dates in early
1993, when there are also mentions of using 4.4 code in the commit logs
(well before Lite 1 was released in March 1994).
At this point, I'm wondering: was the latest code issued on tape at the
time of each request for Net/2 sources, and were there direct uploads from
the CSRG to any online mirrors before the USL lawsuit?
Thanks,
Dan Plassche