I've been trying to find an answer to the question at the top of this page, and so far have not found anyone who has actually measured things. Lots of people say "should" and "might" and "generally" but there is no actual answer. Since I have the equipment to measure this myself, I did.
I measured the voltage and current coming from my USB power source (more on that later) and used a relatively new Galaxy S4 and an S5 as my test units.
My S5 draws 1.79 A from the stock wall charger over a standard USB 2.0 cable. The phone was powered down during this test (though obviously the internal smart-charge circuit was not).
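For a sense of scale: assuming the bus stays near its nominal 5 V under that load, that works out to roughly 5 V x 1.79 A = ~9 W going into the phone (the exact figure depends on how much the charger's output actually sags, which is where the behavior below comes in).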
The short answer is that if your USB power source is capable of supplying more current, these devices will accept it. However, the internal charging circuit also appears to monitor the source voltage as it charges: if the source looks like it is starting to current-limit (evidenced by the voltage dropping, even slightly), the phone reduces the amount of current it demands. This is NOT what people usually think is going on when you are recharging a battery.
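To make that behavior concrete, here is a minimal sketch of the kind of droop-sensing control loop I'm describing. This is my own illustration, not Samsung's actual firmware; the names, thresholds, and step sizes are all assumptions.

    # Hypothetical sketch of droop-based charge-current control.
    # All constants below are illustrative, not measured from the phone.

    NOMINAL_V = 5.0      # expected USB bus voltage
    DROOP_LIMIT = 0.05   # back off if the source sags more than ~50 mV
    STEP_A = 0.05        # how much to raise/lower the current demand per step
    MAX_A = 2.0          # ceiling the charger will ever request

    def next_current_demand(measured_v, current_a):
        """Raise the charge current until the source voltage starts to sag,
        then back off -- i.e. track the source's current limit."""
        if measured_v < NOMINAL_V - DROOP_LIMIT:
            # Source looks like it is current-limiting: demand less.
            return max(current_a - STEP_A, 0.0)
        # Source still looks stiff: demand a little more.
        return min(current_a + STEP_A, MAX_A)

    # A stiff 5.00 V source lets the demand ramp up; a sagging one pulls it back.
    print(next_current_demand(5.00, 1.50))  # -> ~1.55 (ramp up)
    print(next_current_demand(4.90, 1.80))  # -> ~1.75 (back off)

The point of the sketch is simply that the phone, not the charger, decides how much current flows, and it uses the source voltage as its feedback signal.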
There are many more details to my analysis, including intermediate results -- fascinating to me as a hardware geek (and EE, ham operator, and experimenter), but probably not as interesting to someone who only wants the answer to the question at the top of this page.
The USB 3.0 port is not what makes my newer phone charge faster than its older brother. It seems to be the wall charger itself, plus the phone being switched off, which lets the phone's internal power manager send 100% of the incoming power to recharging the battery.
If anyone's interested in my measurement details, follow up here and I'll answer!
Dave