Degrees are primarily a historical unit. There are two physically meaningful ways to measure angles: the cycle and the radian. The cycle is the arc length subtended divided by the circumference of the circle, so it runs from zero to one. The radian is simply the same arc length divided by the circle's radius instead of its circumference. Physicists and mathematicians have a marked preference for radians because the derivatives of trig functions take their simplest form in radians (e.g. $\frac{d}{dx}\sin x = \cos x$ holds only when $x$ is in radians), which also simplifies how computers calculate them. These two measures are, of course, related by a factor of $2\pi$.
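To make the conversions concrete (a minimal sketch; the variable names are my own choices):

```python
import math

# One full turn: 1 cycle = 2*pi radians = 360 degrees.
angle_deg = 90.0

angle_cycles = angle_deg / 360.0        # fraction of a full turn
angle_rad = 2 * math.pi * angle_cycles  # radians = cycles * 2*pi

print(angle_cycles)  # 0.25
print(angle_rad)     # pi/2, i.e. ~1.5708
```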
The degree just scales the cycle up by a factor of 360, a number chosen because it can be divided by many small integers without producing a fraction: 2, 3, 4, 5, 6, 8, 9, 10, 12, etc. This dates back to a time before decimal notation had been invented, when avoiding fractions carried real computational advantages.
So the degree is not a base unit in any sense, neither conceptually nor in terms of overall convenience in a modern setting.
Similarly for bytes. A byte is just 8 bits. Why 8? Probably because it's the smallest power of two that can encode an entire ASCII character (a 7-bit code). Computer scientists have a thing for powers of two, and the spare eighth bit makes it easy to detect many cases where a file is not ASCII text without making text files unnecessarily large. Long ago, machines used a variety of word and character lengths, but the 8-bit byte became a de facto standard.
All of that said, the byte is, fundamentally, a unit of information, and therefore of entropy. In physics especially, we have to deal with systems whose degrees of freedom are countable only in principle, not in practice. Situations like that call for units like the mole, where you know the count is an integer but have no way to actually perform the count. That's why we derive our unit for thermodynamic entropy as a joule per kelvin.
In the context of information entropy, on the other hand, everything actually is countable. There, the more natural unit for machines is, of course, the bit, but that's a matter of technological convenience, not anything fundamental. We could just as well use the trit for ternary, the oct for octal, the hex for hexadecimal, the digit for decimal, and so on. Notice how each of those corresponds to a numbering system, characterized by the number of symbols in that system. By that reasoning, treating the 8-bit byte as a unit is the same as counting in base 256. There's no fundamental feature of reality that makes that number system more special than any other.
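The correspondence can be checked directly (a quick sketch; the unit names follow the paragraph above): one symbol of a base-$b$ system carries $\log_2 b$ bits of information.

```python
import math

# Information carried by one symbol of a base-b numbering system, in bits.
for name, base in [("bit", 2), ("trit", 3), ("oct", 8),
                   ("digit", 10), ("hex", 16), ("byte", 256)]:
    print(f"1 {name} = {math.log2(base):g} bits")

# The byte comes out at exactly log2(256) = 8 bits: one base-256 symbol.
```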
The point being: neither the byte nor the degree is an actual unit. They're more akin to the percent or to the SI prefixes (e.g. kilo, centi), except that 8 and 360 are not powers of 10, so they aren't "metric". One could also argue that the byte is more closely related to the decibel, or to the "magnitude" in astronomy, given the logarithms in those definitions and in entropy, but those aren't base units either.
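For what it's worth, that logarithmic kinship can be shown side by side (an illustrative sketch; the reference ratios below are my own arbitrary choices):

```python
import math

# Each quantity below is a logarithm of a ratio, or of a count of states.
power_ratio = 1000.0
decibels = 10 * math.log10(power_ratio)          # 30 dB

flux_ratio = 100.0
delta_magnitude = -2.5 * math.log10(flux_ratio)  # -5 mag (brighter source)

n_states = 256
info_bits = math.log2(n_states)                  # 8 bits = 1 byte

print(decibels, delta_magnitude, info_bits)
```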