Question: Is the conversion of 50 Hz to a higher frequency in an SMPS (Switched-Mode Power Supply) done just to reduce the transformer/inductor core size?
Background:
As we all know, in a typical SMPS design the AC input is first rectified to DC and then chopped at a much higher frequency (100 kHz to several MHz). This approach has its own problems, such as a poor power factor unless extra correction circuitry is added.
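For context on the core-size part of my question, my understanding is that the standard transformer EMF equation ties the required core cross-section to frequency (the symbols below are the usual textbook ones, not taken from any specific design):

```latex
% Transformer EMF equation for sinusoidal excitation:
%   V_rms = 4.44 * f * N * A_c * B_max
% Solving for the required core cross-sectional area:
\[
  A_c \;=\; \frac{V_{\mathrm{rms}}}{4.44 \, f \, N \, B_{\mathrm{max}}}
\]
% For a fixed voltage, turns count N, and peak flux density B_max,
% A_c scales as 1/f: going from 50 Hz to 100 kHz would allow a core
% roughly 2000x smaller in cross-section.
```

So at least on paper the frequency increase buys an enormous reduction in magnetics size, which is what prompts my question about whether that is the only reason.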
Why isn't it possible to just use the 50 Hz mains as it is? Using a triac and an optocoupler we could do the same thing at 50 Hz: block cycles through the triac when the optocoupler (carrying the feedback) is on, and pass them otherwise. Why is that design not used in power supplies?
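To put some rough numbers on my own triac idea, here is a back-of-envelope sketch (my own assumptions, not from any real design): if the triac passes or blocks whole mains cycles (integral-cycle or "burst-fire" control), the regulation resolution is set by how many cycles you average over, and that window gets very long at 50 Hz.

```python
# Back-of-envelope: integral-cycle (burst-fire) control of a 50 Hz supply.
# Assumption: the triac can only pass or block whole mains cycles, so the
# output is the average over a repeating window of N cycles.

MAINS_HZ = 50

def control_window_seconds(resolution: float, switching_hz: float = MAINS_HZ) -> float:
    """Time for one burst-fire window that achieves the requested
    duty resolution (e.g. 0.01 -> 1% output steps)."""
    cycles_needed = round(1 / resolution)      # whole cycles per window
    return cycles_needed / switching_hz

# 1% resolution at 50 Hz needs a 100-cycle window -> 2 full seconds.
# The output filter would have to ride through seconds-long gaps in power.
print(control_window_seconds(0.01))            # 2.0

# The same 1% resolution at a 100 kHz switching frequency takes only 1 ms,
# so the filter components can be thousands of times smaller.
print(control_window_seconds(0.01, 100e3))     # 0.001
```

If this arithmetic is right, the cycle-skipping scheme forces huge output filters and slow, coarse regulation, which may be part of the answer to my own question.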
Off topic, but curious readers will surely ask this next: why do we use such a low frequency on transmission lines? Couldn't we shrink power-plant-rated transformers down to something the size of a television?
[I originally borrowed this idea from an Armstrong-oscillator-based SMPS.]