Now, as per many blogs on the internet, the rated current capacities of USB 2.0 and USB 3.0 (USB 3.1 Gen 1) ports are 0.5 A and 0.9 A respectively.
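At the nominal 5 volt bus, those current ratings work out to fairly small power budgets. A quick back-of-envelope check (nothing assumed here beyond P = V × I):

```python
# Power available from a spec-compliant USB port at the nominal 5 V bus voltage.
BUS_VOLTAGE = 5.0  # volts, nominal for both USB 2.0 and 3.0

for name, amps in [("USB 2.0", 0.5), ("USB 3.0 (3.1 Gen 1)", 0.9)]:
    print(f"{name}: {amps} A x {BUS_VOLTAGE} V = {amps * BUS_VOLTAGE:.1f} W")
# USB 2.0: 0.5 A x 5.0 V = 2.5 W
# USB 3.0 (3.1 Gen 1): 0.9 A x 5.0 V = 4.5 W
```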
That is the minimum current a USB host must provide to meet the USB spec; it is not the maximum current USB ports will provide. There are exceptions to this for things like USB-OTG, but that's getting into some niche applications.
One thing I've discovered is that rated audio output power rarely has any correlation to how most people understand physics. I once ordered a "240 watt" car stereo amplifier and was very concerned about having power supply wires big enough to provide all that power. When I received the amp, I discovered a 5 amp fuse on it. As anyone who knows Ohm's law can figure out, 12 volts at 5 amps does not get anywhere close to 240 watts. The amplified speakers in question here likely make similarly bogus claims about audio output power.
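Running the numbers on that fuse makes the point (a rough sketch; a running car sits closer to 13.8 volts, which doesn't rescue the claim):

```python
# Sanity check on the "240 watt" amplifier: a 5 A fuse on a 12 V supply
# caps the input power well below the claimed output.
supply_volts = 12.0
fuse_amps = 5.0

max_input = supply_volts * fuse_amps
print(f"Maximum input power: {max_input:.0f} W")   # 60 W
print(f"Claimed output is {240 / max_input:.0f}x the input limit, "
      "before counting any amplifier losses")
```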
I see the speakers are powered by a USB-A connector, so just how much power can we get from such a connector? The USB Battery Charging spec (USB-BC 1.2) allows up to 1.5 amps, or 7.5 watts, supplied to a device from a USB 2.0 port. It's quite common for desktop computers to supply 1.5 amps to their USB-A ports. But 7.5 watts is still less than 10 watts.
As you may have noticed, various smart phones on the market ship with USB-A power bricks rated for more than 7.5 watts output. Going beyond USB-BC, common charger practice allows for 2.4 amps, or 12 watts, from USB-A ports. It appears MSI supports putting some USB ports into a high-power mode for charging smart phones and other such devices, calling this "super charger" mode. My guess is this is just USB-BC by another name, as it appears to be limited to 1.6 amps, and data transfer capability is lost on USB ports in this mode.
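Putting the USB-A options side by side (the 2.4 amp figure reflects common charger practice rather than a single spec number, and the 1.6 amp "super charger" limit is as reported for MSI's mode):

```python
# Best-case power from a USB-A port under the regimes discussed, all at 5 V.
regimes = {
    "USB 2.0 spec minimum":        0.5,
    "USB 3.0 spec minimum":        0.9,
    "USB-BC 1.2 charging port":    1.5,
    "MSI 'super charger' mode":    1.6,  # as reported; data lines disabled
    "common 12 W charger (2.4 A)": 2.4,
}
for name, amps in regimes.items():
    print(f"{name:27s} {amps} A -> {5.0 * amps:4.1f} W")
```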
There are a few phone makers that went outside the bounds of the USB specifications with protocols like Quick Charge, which allows for up to 18 watts through a USB-A port by raising the bus voltage above 5 volts. I'm sure this is fascinating to everyone, but there is a point: power out must be less than power in, because nothing is 100% efficient. Just how much power input is required to get 10 watts of audio output?
Amplifier designs are grouped into "classes," since there are only so many ways to build a working amplifier. The simplest is class A, which is at best 50% efficient. The most common is class AB, a variation on class B, and both are ideally about 78% efficient. After that there are classes C, D, and E, which can be more efficient but are also more complicated; cheap amplified speakers rarely use them. What it comes down to is that most designers will choose a class AB design and be quite pleased to get 65% real-world efficiency.
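For a feel of what those efficiency figures mean at the claimed 10 watts, here are the input requirements (the first two are textbook ideals; real parts do worse):

```python
# Input power needed for 10 W of audio output at each efficiency figure.
efficiencies = {
    "class A (ideal)":      0.50,
    "class B/AB (ideal)":   0.785,
    "class AB (realistic)": 0.65,
}
for name, eff in efficiencies.items():
    print(f"{name:22s} -> {10.0 / eff:5.1f} W in for 10 W out")
# class A (ideal)        ->  20.0 W in for 10 W out
# class B/AB (ideal)     ->  12.7 W in for 10 W out
# class AB (realistic)   ->  15.4 W in for 10 W out
```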
To get 10 watts of audio output, cover amplifier inefficiency, and still power the LED lights on the speakers, the input budget lands around 16 watts, so they would need something like Quick Charge on the USB-A port to have 18 watts to work with. Since they don't do that, they aren't getting 10 watts of audio output.
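Working the budget out explicitly (a sketch assuming the 65% real-world figure above and a rough 1 watt guess for the LEDs):

```python
# Total input budget for the claimed 10 W, with a rough guess for the LEDs.
audio_out  = 10.0  # watts, as marketed
efficiency = 0.65  # realistic class AB, per the discussion above
led_budget = 1.0   # watts, assumed for the lighting

required_input = audio_out / efficiency + led_budget
print(f"Required input: {required_input:.1f} W")   # ~16.4 W
print(f"Plain USB 2.0 port (2.5 W) is short by {required_input - 2.5:.1f} W")
print(f"Even a 12 W USB-A source is short by {required_input - 12.0:.1f} W")
# Only something in Quick Charge territory (~18 W) covers the budget.
```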
So how much power will the speakers actually output? Probably one whole watt. Total from both speakers.
I went through that lengthy explanation to hopefully dispel some common misconceptions about USB and audio amplifiers.
How people get away with such blatantly exaggerated audio output specs still baffles me, but I know how they would defend it should someone take them to court over it. They would show they can get a highly distorted test tone from one speaker at 5 watts. Then they'd do the same from the other speaker. Then they add the two, as if a distorted test tone has any meaningful correlation to what people actually want to listen to, and as if 5 watts from each speaker, driven one at a time, means 10 watts of simultaneous output. That's some tortured logic, but they keep getting away with it.