Since 1968 I have been taught that B = byte and b = bit.
The confusion with kb = 1024 bits comes from the digital world using binary arithmetic: 10000000000 binary = 2^10 = 1024 decimal.
1000 decimal = 1111101000 binary. I don't know about you, but I prefer the standard kludge.
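If you want to check the arithmetic above yourself, a couple of lines of Python will do it (just an illustration, nothing here is specific to any particular system):

```python
# 2^10 written out in binary is a 1 followed by ten 0s, which is 1024 decimal
print(0b10000000000)  # 1024
print(2 ** 10)        # 1024

# and 1000 decimal in binary is the "untidy" 1111101000
print(bin(1000))      # 0b1111101000
```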