The IBM 650, one of the first general-purpose digital computers, designed in the early fifties, represented its decimal digits in bi-quinary, for reasons discussed here: Why did the IBM 650 use bi-quinary?
The Burroughs 205, née the CEC 30-201, was a computer developed around the same time, using essentially the same technology (vacuum tubes and drum memory), and it also calculated with words of ten decimal digits plus sign. In other words, an extremely similar design, which as far as I can tell was not the result of either company copying the other, but of convergent design driven by similar technological and market forces.
So I was surprised to read in the manual that it uses BCD rather than bi-quinary.
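For concreteness, here is a minimal Python sketch of the two digit encodings as I understand them (the exact bit ordering within the bi-quinary code is my own convention for illustration, not taken from either machine's documentation):

```python
def to_bcd(digit: int) -> str:
    """Encode a decimal digit as plain 4-bit BCD (the 205's scheme)."""
    assert 0 <= digit <= 9
    return format(digit, "04b")

def to_biquinary(digit: int) -> str:
    """Encode a decimal digit in 650-style bi-quinary: a 1-of-2 'bi'
    component (worth 0 or 5) plus a 1-of-5 'quinary' component (0-4).
    Exactly two of the seven bits are ever set, which is what makes
    any single-bit error immediately detectable."""
    assert 0 <= digit <= 9
    bi, quinary = divmod(digit, 5)
    bi_bits = "10" if bi else "01"                   # B5 B0
    qui_bits = "".join("1" if i == quinary else "0"  # Q4 .. Q0, illustrative ordering
                       for i in reversed(range(5)))
    return bi_bits + " " + qui_bits

for d in range(10):
    print(d, to_bcd(d), to_biquinary(d))
```

If I have the encodings right, the trade-off on display is four bits per digit for BCD against seven for bi-quinary, with the extra bits buying built-in error detection.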
Why does it differ from the 650 in this regard? Was it simply a case of two teams of engineers weighing the trade-offs and reaching different conclusions, or was there some identifiable technical or business reason for the difference?