The original System/360 architecture had a bit in the Program Status Word that selected a "USASCII" mode rather than the usual EBCDIC. Setting this bit changed how the BCD arithmetic instructions worked: the sign of a result was encoded as the nibble 1010 for plus and 1011 for minus (instead of 1100/1101 in EBCDIC mode), and the zone (upper) nibble in unpacked "zoned decimal" numbers was stored as 0101 (instead of 1111 for EBCDIC).
This feature was never much used, and was abolished in System/370.
For EBCDIC, putting 1111 in the upper nibbles makes perfect sense: it makes the bytes printable digits. And 1100/1101 as the sign nibble in unpacked results means that if you punch the result to a card, it will look identical to the same number computed by a 1401 (even though the 1401's internal bit patterns were different).
However, what is particularly ASCII-friendly about using 1010/1011/0101 instead?
The digits in ASCII have 0011 as the upper nibble, not 0101. Putting 0101 in the upper bits would produce PQRSTUVWXY on an ASCII device rather than digits.
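To make the arithmetic concrete, here is a small illustrative C snippet (my own, not anything the hardware itself did): ORing a digit with zone 0011 gives the ASCII codes for `0`..`9`, while zone 0101 gives `P`..`Y`.

```c
#include <stdio.h>

int main(void) {
    /* ASCII '0'..'9' are 0x30..0x39, i.e. zone 0011 over the digit.   */
    /* A zoned byte with zone 0101 lands in 0x50..0x59, i.e. 'P'..'Y'. */
    for (int d = 0; d <= 9; d++)
        printf("digit %d: zone 0011 -> '%c', zone 0101 -> '%c'\n",
               d, 0x30 | d, 0x50 | d);
    return 0;
}
```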
Likewise, there doesn't seem to be anything ASCII about the bit combinations 1010 and 1011 for signs. Naively converting a packed BCD number to ASCII characters nibble for nibble would result in digits followed by `:` or `;` for positive and negative, hardly an improvement over `<` and `=`.
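Again, just to spell out that conversion (my own illustration in C, not any actual S/360 conversion routine): mapping each nibble n to the ASCII character 0x30 + n turns the USASCII sign codes 1010/1011 into `:` and `;`, and the EBCDIC sign codes 1100/1101 into `<` and `=`.

```c
#include <stdio.h>

/* Naive nibble-for-nibble dump of a packed-decimal field: each nibble n
 * is printed as the ASCII character 0x30 + n, as described above. */
static void dump_packed(const unsigned char *p, int len) {
    for (int i = 0; i < len; i++) {
        putchar(0x30 + (p[i] >> 4));
        putchar(0x30 + (p[i] & 0x0F));
    }
    putchar('\n');
}

int main(void) {
    unsigned char plus_usascii[]  = {0x12, 0x3A};  /* +123, sign 1010 */
    unsigned char minus_usascii[] = {0x12, 0x3B};  /* -123, sign 1011 */
    unsigned char plus_ebcdic[]   = {0x12, 0x3C};  /* +123, sign 1100 */
    unsigned char minus_ebcdic[]  = {0x12, 0x3D};  /* -123, sign 1101 */

    dump_packed(plus_usascii, 2);   /* prints "123:" */
    dump_packed(minus_usascii, 2);  /* prints "123;" */
    dump_packed(plus_ebcdic, 2);    /* prints "123<" */
    dump_packed(minus_ebcdic, 2);   /* prints "123=" */
    return 0;
}
```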
It doesn't look like producing ASCII output is made easier in any way by running the S/360 in "USASCII" mode rather than EBCDIC. So what was the point supposed to be?