RSSI is the "Received Signal Strength Indicator", and it's given in units of "decibels relative to 1 milliwatt", which we abbreviate as "dBm".
Decibels are a logarithmic scale, which is helpful here because it allows us to compare wildly different power levels without having to write too many digits. Every increase or decrease of 10dB represents an order of magnitude change. So since 0dBm is 1 milliwatt, 10dBm is 10mW, 20dBm is 100mW, and 30dBm is 1 full watt. It also works out that every 3dB increase is roughly a doubling, and every 3dB decrease is roughly a halving. So a 3dBm signal is roughly 2mW, and a -3dBm signal is roughly 0.5mW.
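The conversion in both directions is just a power of ten or a base-10 logarithm. Here's a quick sketch in Python (the function names are mine, just for illustration) showing the rules above, including the "+3dB roughly doubles" shortcut:

```python
import math

def dbm_to_mw(dbm):
    """Convert a power level in dBm to milliwatts: 0 dBm = 1 mW."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert a power level in milliwatts to dBm."""
    return 10 * math.log10(mw)

print(dbm_to_mw(0))             # 1.0 mW
print(dbm_to_mw(30))            # 1000.0 mW, i.e. 1 full watt
print(round(dbm_to_mw(3), 2))   # ~2 mW: +3 dB is roughly a doubling
print(round(dbm_to_mw(-3), 2))  # ~0.5 mW: -3 dB is roughly a halving
```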
Regulations vary from region to region, but Wi-Fi devices are often allowed to transmit up to 1 full watt = 1000mW = 30dBm. A decent rule of thumb is that any air gap between a transmitter and a receiver, even with efficient antennas right next to each other, is going to be at least a 30dB hit. So even if your 802.11 AP (Wi-Fi router) is transmitting at one full watt (30dBm), and you put your client right next to it, you're not likely to see more than 0dBm RSSI on the client.
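Another nice property of decibels: once everything is logarithmic, a link budget is just subtraction. The rule of thumb above works out like this (function name is mine, for illustration):

```python
def expected_rssi_dbm(tx_power_dbm, path_loss_db):
    """Simple link budget: received power = transmit power minus path loss."""
    return tx_power_dbm - path_loss_db

# A 1 W (30 dBm) AP with the minimum ~30 dB air-gap loss,
# client sitting right next to it:
print(expected_rssi_dbm(30, 30))  # 0 dBm at the very best
```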
Why can't it be a percent? Well, that's mostly because there are no good universal reference points to define as "100%" or "0%". Is 100% the full transmit power of the AP? But that can vary due to regulations, transmitter designs, power saving schemes, and more. Is 100% the highest power signal the client can handle without distortion? That varies based on receiver designs and modulation schemes in effect. In contrast, 1 milliwatt is a known quantity that is independent of regulations and hardware implementations, so we use that.
Well-designed Wi-Fi radios can be sensitive down to almost -100dBm, which is just 100 femtowatts (milli > micro > nano > pico > femto). So the typical range of power levels you need to talk about with Wi-Fi is 1,000mW down to 0.000,000,000,1mW. Even as percentages of 1mW, that's 100,000% down to 0.000,000,01%. That's a lot of zeroes to type, which is why we go logarithmic and use decibels. 30dBm to -99dBm is much easier to write, and gives us all the range and precision we need with just two significant digits.
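You can sanity-check that 13-orders-of-magnitude span yourself; in log units the whole usable range collapses to a tidy three-digit number (helper name is mine, for illustration):

```python
import math

def mw_to_dbm(mw):
    """Convert milliwatts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(mw)

strongest_mw = 1000    # 1 full watt, the common regulatory ceiling
weakest_mw = 1e-10     # 100 femtowatts, near the sensitivity floor

print(round(mw_to_dbm(strongest_mw)))  # 30 dBm
print(round(mw_to_dbm(weakest_mw)))    # -100 dBm
```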
By the way, it's possible for a signal to be too strong for a receiver, and overload the receiver's "front end" circuitry, causing distortion that hinders communication speeds (imagine someone shouting in your ear so loud it makes your eardrum rattle). I've worked with plenty of Wi-Fi radios that hit distortion above -40dBm. I would suspect that -28dBm RSSI could be too much for your Mac's Wi-Fi radio front end, and might result in lower throughput than you would get if you moved your Mac farther away from the AP. It's definitely possible to design a radio that can handle signals above -30dBm, but not all vendors bother, and exceedingly few vendors publish the max RSSI their radio designs can handle. You kind of have to do your own throughput tests at different RSSIs to find out what your equipment is capable of.