I suspect that it was the absence of a signed right shift. In my DECSYSTEM-20 OS class, one of the differences between the compiler on the VAX and the compiler on the '20 was that -1 >> 1 was -1 on the VAX and 2^35-1 on the '20. This was in 1985 or 1986, with a compiler written in 1982 or '83 (which no longer exists today, I'm told; other '20 compilers took over). Some signed overflow / underflow / trap behaviors were different as well, which only mattered in the '20 simulator we were running, since all traps and interrupts reset the trap frame (whether you wanted that or not). So if you hit one in a kernel interrupt, you'd double trap the machine (I think this was an intentional difference to teach about being careful in an interrupt context, but still...). It's the lack of uniformity in signed operations across the machines generally available in the late 70s and early 80s (often based on designs dating back to the 60s) that I always assumed drove it... These details took up bits of three different lectures in the OS class, and were a big source of problems for everybody...
So my specific case was a super weird edge case. But if it came up in an undergraduate OS class at an obscure technical school in the middle of the desert in New Mexico, I can't imagine the design committee didn't know about it. Since standardization started a few years before C89, I'm guessing we'd see this in early drafts of the standard, if we had them (or maybe it was inherited from the K&R era).
Warner