System/360s, or at least 370s, could do ASCII perfectly well.

When we started UNIX on VM/370, it was clear to us that we wanted to run with ASCII.  But some otherwise intelligent people told us that it *just couldn't be done* - the instructions depended on EBCDIC.
But I think there was only one machine instruction with any hint of EBCDIC - and it was an instruction that no one could imagine being used by a compiler.

Of course, plenty of EBCDIC/ASCII conversion went on in drivers, etc., but that was easy - just table lookups.
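To make that concrete, here's a minimal sketch (mine, not the actual VM/370 UNIX code) of the kind of table-driven translation a driver would do on each character. Only letters, digits, and space are filled in; the code points used are the standard EBCDIC ones, and everything else just maps to '?' in this toy. Note how the EBCDIC letter ranges (A-I, J-R, S-Z) are not contiguous.

#include <stdio.h>

/*
 * Toy EBCDIC -> ASCII translation table.  A real driver would carry a
 * full 256-entry table; this sketch only fills in space, letters, and
 * digits and marks everything else '?'.
 */
static unsigned char e2a[256];

static void init_table(void)
{
    int i;

    for (i = 0; i < 256; i++)
        e2a[i] = '?';                    /* unmapped in this toy */
    e2a[0x40] = ' ';
    for (i = 0; i < 9; i++) {            /* A-I / a-i */
        e2a[0xC1 + i] = 'A' + i;
        e2a[0x81 + i] = 'a' + i;
    }
    for (i = 0; i < 9; i++) {            /* J-R / j-r */
        e2a[0xD1 + i] = 'J' + i;
        e2a[0x91 + i] = 'j' + i;
    }
    for (i = 0; i < 8; i++) {            /* S-Z / s-z */
        e2a[0xE2 + i] = 'S' + i;
        e2a[0xA2 + i] = 's' + i;
    }
    for (i = 0; i < 10; i++)             /* 0-9 */
        e2a[0xF0 + i] = '0' + i;
}

int main(void)
{
    /* "UNIX 370" encoded in EBCDIC */
    unsigned char msg[] = { 0xE4, 0xD5, 0xC9, 0xE7, 0x40, 0xF3, 0xF7, 0xF0, 0 };
    int i;

    init_table();
    for (i = 0; msg[i] != 0; i++)
        putchar(e2a[msg[i]]);
    putchar('\n');
    return 0;
}

Run it and it prints "UNIX 370" - the translation is just an indexed load per byte, which is why doing it at the driver boundary cost essentially nothing.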

On Wed, Feb 3, 2021 at 12:09 PM Dave Horsfall <dave@horsfall.org> wrote:
On Wed, 3 Feb 2021, Peter Jeremy wrote:

> I'm not sure that 16 (or any other 2^n) bits is that obvious up front.
> Does anyone know why the computer industry wound up standardising on
> 8-bit bytes?

Best reason I can think of is System/360 with 8-bit EBCDIC (Ugh!  Who said
that "J" should follow "I"?).  I'm told that you could coerce it into
using ASCII, although I've never seen it.

> Scientific computers were word-based and the number of bits in a word is
> more driven by the desired float range/precision.  Commercial computers
> needed to support BCD numbers and typically 6-bit characters. ASCII
> (when it turned up) was 7 bits and so 8-bit characters wasted ⅛ of the
> storage.  Minis tended to have shorter word sizes to minimise the amount
> of hardware.

Why would you want to have a 7-bit symbol?  Powers of two seem to be
natural on a binary machine (although there is a running joke that CDC
boxes had 7-1/2-bit bytes...).

I guess the real question is why we moved to binary machines at all;
were there ever any ternary machines?

-- Dave


--
- Tom