I am wondering why 12V is the standard voltage for LED strips (RGB/W) when each SMD LED, whether it be 3535 or 5050, is usually rated for ~3V.
This will be a bit long but it turns out that this standard is a natural fit for the use case of strips.
12V white LED strips use groups of 3 LEDs and a resistor in series. With 3-3.4V on each LED, this leaves 1.8-3V on the resistor, wasting 15-25% of total power. 24V strips use groups of 6 LEDs instead of 3; the fraction of power lost in the resistors is the same, but the voltage drop in the copper of the flex PCB is much lower because the current is halved relative to 12V.
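As a quick sanity check on those percentages, here is a minimal Python sketch; the per-LED Vf values and group sizes are the assumptions from the paragraph above, not datasheet figures:

```python
# Fraction of a strip group's power burned in the ballast resistor.
# Vf per LED and group sizes are illustrative assumptions from the text.

def resistor_loss_fraction(v_supply, leds_per_group, vf_per_led):
    """Fraction of the group's power dissipated in the resistor."""
    v_resistor = v_supply - leds_per_group * vf_per_led
    return v_resistor / v_supply  # same current flows through LEDs and resistor

for vf in (3.0, 3.2, 3.4):
    loss_12 = resistor_loss_fraction(12.0, 3, vf)
    loss_24 = resistor_loss_fraction(24.0, 6, vf)
    print(f"Vf={vf:.1f} V  12V/3-LED group: {loss_12:.0%}   24V/6-LED group: {loss_24:.0%}")
```

This prints 25% down to 15% as Vf goes from 3.0V to 3.4V, and the same fraction for 12V and 24V groups, which is why the 24V advantage shows up in copper losses rather than resistor losses.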
Now, why 12V or 24V? It's a compromise between convenience, cost, standardization, and efficiency.
There are two ways to make LED lights.
First, you can use a small number of high-power LEDs, or a COB LED, to make a point light source: a spotlight with a nice controlled beam. However, all the heat comes from a small area, which complicates thermal management. In a standard household light bulb there is very little space, and manufacturing cost is kept to the absolute minimum, so the heat sink is tiny and inefficient and LED temperatures can reach 80-100°C.
The second option is to use lots of low-power LEDs over a large area to make a diffuse light: strips, LED tubes, or LED panels. This makes thermal management much easier, since the heat is generated over a large area and there is also a large contact area with ambient air for cooling. This has implications for efficiency: a cool LED is more efficient than a hot one. Compared to a cheap light bulb running very hot, this makes up for part of the power lost in the resistors of a strip.
Then... how to drive these LEDs?
In the first case, a small number of LEDs can be wired in series and driven by a constant-current AC-DC switcher. But for a diffuse light with perhaps 50-100 LEDs or more, this is not user-friendly. There are AC-DC switching constant-current drivers designed to output high voltage, like 200 volts, but these are absolutely not DIY-friendly for obvious reasons: the LEDs then have dangerous voltage on them and must be isolated from fingers, etc. This type of driver targets a completely different customer (i.e., an industrial light manufacturer) than LED strips, which aim at DIYers, interior decoration, etc. The LED assembly inside a T8 LED tube can be at high voltage since it sits inside an insulated enclosure. Strips you can glue under your kitchen cabinets in DIY aluminium profiles have to be low voltage!
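To see why a single series string of "lots of LEDs" lands in that dangerous-voltage territory, a bit of back-of-the-envelope arithmetic (assuming ~3.2V per white LED, a typical but not universal figure):

```python
# String voltage for one long series chain of LEDs.
# 3.2 V per LED is an assumed typical forward voltage, not a datasheet value.

vf_typical = 3.2  # volts per white LED, assumption
for n_leds in (50, 100):
    print(f"{n_leds} LEDs in one series string ≈ {n_leds * vf_typical:.0f} V")
# ~160 V for 50 LEDs, ~320 V for 100 LEDs: fine inside a sealed fixture,
# not something to glue under a kitchen cabinet.
```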
So how do we drive lots of LEDs without high voltage? With series-parallel groups. However, these need a way to balance the current between the parallel branches. The balancing resistors can be reduced or omitted if the LEDs are binned by Vf and kept at the same temperature, which results in arrangements like Zhaga modules. You're supposed to drive those with a constant-current driver with an output voltage around 40V. It's a metal-core PCB, so pretty good for thermal management and high output, and if you mount it on an aluminium heat sink the LEDs will stay at roughly the same temperature, so current-sharing resistors can be omitted. That makes slick linear lights, but it won't work for interior decoration or under kitchen cabinets: it needs a non-standard power supply, and it is not flexible...
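To put a number on the current-balancing point, here is a simplified sketch that treats each LED as a fixed voltage drop (dynamic resistance ignored); the Vf spread, resistor value and supply voltage are illustrative assumptions, not measured values:

```python
# How a ballast resistor absorbs Vf spread between parallel groups.
# Simplified model: each LED is a fixed voltage drop. All values are assumptions.

v_supply = 12.0
r_ballast = 150.0              # ohms per group, chosen for roughly 20 mA
vf_low, vf_high = 3.0, 3.1     # assumed Vf spread between two unbinned groups

def group_current(vf_per_led, leds=3, r=r_ballast, v=v_supply):
    return (v - leds * vf_per_led) / r   # amps

i_slow = group_current(vf_high)   # the high-Vf group draws a bit less
i_fast = group_current(vf_low)    # the low-Vf group draws a bit more
print(f"{i_slow*1e3:.0f} mA vs {i_fast*1e3:.0f} mA "
      f"({(i_fast - i_slow)/i_fast:.0%} spread)")
```

With the resistor, a 0.1V-per-LED mismatch only shifts the currents by about 10%. Without it, the same mismatch steers a disproportionate share of the current into the low-Vf group, which is exactly why resistorless arrangements need binned LEDs held at a uniform temperature.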
If we want a cheap, flexible strip for the non-specialist market, to let people and interior designers make cool stuff with, we're running out of options pretty quickly.
The LEDs won't be binned, the temperature won't be uniform, and there will be voltage drop along the length of the strip, so there has to be a current-setting device for each series group of LEDs and the whole strip has to be voltage-driven. This means a standard voltage, so 12V or 24V for cost.
So, as we've seen, the strip format pretty much forces constant-voltage drive; the only choice left is which device sets the current. It can be a resistor, which is cheap.
It can be a constant-current linear driver too. This still wastes voltage as heat, but... on a 24V strip, if you have a low-headroom current regulator, you can put 7 LEDs in series (7 × 3.2V = 22.4V) instead of 6, which roughly halves the losses: pretty good at only 7-12% now. However it is a bit more expensive, and it prevents flicker-free voltage/current-regulated dimming; PWM dimming has to be used instead, so if we want flicker-free it has to be high-frequency PWM. So, a compromise on convenience, but on the other hand it doesn't care about voltage drop.
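To put numbers on that comparison, a small sketch with the same illustrative 3.2V Vf as before; the 7-LED figure assumes the regulator's minimum dropout fits inside the remaining headroom:

```python
# Loss comparison on a 24 V strip: resistor-ballasted 6-LED groups vs
# 7-LED groups behind a low-dropout linear constant-current regulator.
# Vf is an illustrative assumption; the regulator burns whatever headroom is left.

v_supply = 24.0
vf = 3.2   # assumed typical forward voltage per LED

loss_resistor = (v_supply - 6 * vf) / v_supply   # 6 LEDs + resistor
loss_linear   = (v_supply - 7 * vf) / v_supply   # 7 LEDs + linear CC regulator

print(f"6 LEDs + resistor:  {loss_resistor:.0%} lost")   # ~20%
print(f"7 LEDs + linear CC: {loss_linear:.0%} lost")     # ~7%
```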
But are we going to use a switching constant-current driver for even more efficiency? Hell no! With 7 LEDs per group, losses are already pretty low. With a constant-current LED strip it is a much cheaper and better choice to adjust the supply voltage down so the current regulators have just enough headroom to work, and you'll get very low losses.
A 120 LEDs/m strip using resistors, with excellent CRI95 LEDs, costs less than USD 6/meter, and much less if you accept garbage CRI. That means each group of 7 LEDs costs about 35 cents. Adding a switching constant-current driver for each series string would double the price of the strip, and no one would buy it. It would also probably not work with high-frequency PWM drive and would be an EMI nightmare, for no efficiency gain relative to a linear constant-current regulator.
So we have either resistors or linear current regulators.
And if you worry about efficiency...
A GU10 LED spot gets about 70 lm/W because it runs extremely hot and its power supply is too tiny to be optimized for efficiency. A standard bulb does somewhat better, around 100 lm/W, but is limited for the same reasons. A strip can reach 100 lm/W too, with efficient LEDs and a decent power supply, and everything runs much cooler, so it will last longer.