It’s worse than that. “char” is defined as neither signed nor unsigned; the signedness is implementation-defined. This is why we have the inane “signed” keyword.
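
For what it’s worth, a minimal sketch of the consequence (illustrative, not from the original post; the first line of output depends entirely on the implementation):

    #include <stdio.h>

    int main(void)
    {
        char c = '\xFF';
        /* Prints -1 where plain char is signed, 255 where it is unsigned. */
        printf("plain char: %d\n", c);

        /* The signed keyword exists so the choice can be pinned down. */
        signed char   sc = -1;
        unsigned char uc = 0xFF;
        printf("signed char: %d, unsigned char: %u\n", sc, (unsigned)uc);
        return 0;
    }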

From: TUHS [mailto:tuhs-bounces@minnie.tuhs.org] On Behalf Of Arthur Krewat
Sent: Monday, November 6, 2017 7:35 PM
To: tuhs@minnie.tuhs.org
Subject: Re: [TUHS] origins of void* -- Apology!


char (at least these days) is signed. So really, it's 7-bit ASCII.

I've been bitten by the 7-bit ASCII thing when it comes to modern character sets. unsigned char gets tiresome ;)
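
The classic bite (a minimal sketch, not from the original message): with a signed char, any byte above 0x7F comes out negative, so passing it straight to the <ctype.h> functions is undefined behavior, and the cast has to go everywhere:

    #include <ctype.h>
    #include <stdio.h>

    int main(void)
    {
        char s[] = "caf\xC3\xA9";   /* UTF-8 "café": the last two bytes are > 0x7F */
        for (char *p = s; *p; p++) {
            /* Without the cast, a negative *p makes isalpha() undefined. */
            if (isalpha((unsigned char)*p))
                putchar(*p);
        }
        putchar('\n');
        return 0;
    }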

On 11/6/2017 7:25 PM, Ron Natalie wrote:

I believe one of C’s biggest failings is that it never resolved the schizophrenic definition of char*.


char*, as historically implemented and then CODIFIED in the C and C++ standards, is both a pointer to the basic character type and a pointer to the smallest addressable unit of storage.

This was all peachy keen in the 8-bit character-set days (7-bit ASCII, earlier alternatives such as EBCDIC and its predecessors, and other historical character sets like UNIVAC’s FIELDATA), but fell apart when we moved to 16-bit and wider Unicode.
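
To make the mismatch concrete, a minimal sketch (assuming a C11 compiler; not part of the original post): char is pinned to sizeof == 1 by its storage role, so a 16-bit Unicode code unit needs a separate type entirely:

    #include <stdio.h>
    #include <uchar.h>

    int main(void)
    {
        char16_t u = u'\u00E9';   /* U+00E9 'é' as a single UTF-16 code unit */
        printf("sizeof(char) = %zu, sizeof(char16_t) = %zu\n",
               sizeof(char), sizeof u);
        printf("code unit value: 0x%X\n", (unsigned)u);
        return 0;
    }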


We needed a basic memory type with sizeof == 1 (a requirement void* does not meet), which would have released char from having to play double duty.
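
The double duty shows up in any byte-wise routine (a minimal sketch; the names are illustrative, not from the original post). void* cannot do the job because standard C forbids arithmetic on it and sizeof(void) is ill-formed, so a character pointer gets pressed into the raw-memory role:

    #include <stdio.h>
    #include <stddef.h>

    /* Byte-wise copy: unsigned char* stands in for "pointer to the
       smallest addressable unit" because void* supports no arithmetic. */
    static void byte_copy(void *dst, const void *src, size_t n)
    {
        unsigned char *d = dst;
        const unsigned char *s = src;
        while (n--)
            *d++ = *s++;
    }

    int main(void)
    {
        char greeting[] = "hello";   /* char in its text role */
        char buf[sizeof greeting];
        byte_copy(buf, greeting, sizeof greeting);
        printf("%s\n", buf);
        return 0;
    }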