In Brailsford's YouTube series on Computerphile (something I came across through BWK's interview with Brailsford),
Episode 86 is about "Where Did Bytes Come From?". He claims that if you wanted to do decimal arithmetic on a
binary machine, you'd want 10 digits of accuracy to capture the 10-digit log tables that were then popular.
10 decimal digits take a bit more than 33 bits, so words ended up at the next convenient size, 36 bits, or half
that, 18 bits. (Brailsford's lectures are fabulous, by the way, and likely to appeal to TUHS types.)
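
For the record, the arithmetic behind that: n decimal digits need ceil(n * log2(10)) bits, which for 10 digits
comes to 34, and 36 is the next round word size. A throwaway C check (nothing assumed beyond libm):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Bits needed to hold an n-digit decimal number: ceil(n * log2(10)). */
        for (int n = 8; n <= 12; n++)
            printf("%2d decimal digits -> %2.0f bits\n", n, ceil(n * log2(10.0)));
        return 0;
    }

which prints 34 bits for 10 digits (and 30 for 9, 37 for 11).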
I like that explanation better than the story I heard that the IBM 709 series had 36-bit words because Arthur Samuel,
then at IBM, needed 32 bits to identify the playable squares on a checkerboard, plus a few more bits for color and kinged
status (if "kinged" is the proper term for getting across the board and gaining the ability to move toward either side).
Samuel was famous for writing a checker-playing program that played champion-quality checkers.
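
I have no idea how Samuel actually laid out his board words, but the 32-square count is easy to make concrete with
the usual bitboard trick: one bit per playable (dark) square, plus parallel masks for color and king status. A toy
layout of my own, not Samuel's:

    #include <stdint.h>

    /*
     * One bit per playable (dark) square, numbered 0..31.
     * A toy layout for illustration only, not Samuel's encoding.
     */
    struct board {
        uint32_t black;   /* squares holding black pieces */
        uint32_t white;   /* squares holding white pieces */
        uint32_t kings;   /* squares whose piece has been kinged */
    };

    /* Is square n occupied by a black king? */
    int black_king_on(const struct board *b, int n)
    {
        uint32_t bit = (uint32_t)1 << n;
        return (b->black & bit) && (b->kings & bit);
    }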