Let's answer it part by part.
My question is, can Windows 10 even make use of this power?
Technically, yes: based on its published specifications, Windows 10 is ready to use those cores.
We know that it can support up to 32 cores but would it actually use them?
Yes, Windows can theoretically use them. No, in practice there will be no significant performance boost for typical usage.
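As a quick sanity check that the operating system really does expose all of those cores to applications, you can query the logical processor count. A minimal sketch using only the Python standard library (the printed value is whatever your machine reports, e.g. 32 on a 16-core CPU with SMT):

```python
import os

# Number of logical processors the OS makes available to programs.
# On a 16-core CPU with SMT/Hyper-Threading this is typically 32.
logical_cpus = os.cpu_count()
print(f"Logical processors visible to the OS: {logical_cpus}")
```

If this number matches the CPU's advertised thread count, the OS is ready to schedule work on every core; whether any given program actually does so is a separate question.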
Are there programs out there that could?
Yes. They range from mining programs (which use CPU power to "make money") to multi-threaded compression tools, virtual machines, and so on.
A performance gain will be visible when running multiple single-threaded and/or multi-threaded CPU-demanding applications at the same time.
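The kind of workload that benefits can be illustrated with a small sketch (my own illustrative example, not from any specific application): the same CPU-bound job run serially and then spread across worker processes, one per core, so the speedup scales with the core count.

```python
import time
from concurrent.futures import ProcessPoolExecutor


def burn(n: int) -> int:
    """CPU-bound busy work: sum of squares up to n."""
    return sum(i * i for i in range(n))


if __name__ == "__main__":
    tasks = [2_000_000] * 8

    # Run the same jobs one after another on a single core.
    start = time.perf_counter()
    serial = [burn(n) for n in tasks]
    t_serial = time.perf_counter() - start

    # Spread the jobs across processes (one worker per core by default).
    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(burn, tasks))
    t_parallel = time.perf_counter() - start

    assert serial == parallel
    print(f"serial:   {t_serial:.2f}s")
    print(f"parallel: {t_parallel:.2f}s")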
These types of processors seem to be aimed at the gaming market
Those are ads, targeted mostly at gamers who don't know how else to spend their money. (Explanations are given below.)
So would games be able to use all of that power?
Theoretically, yes. Practically, not now: there is currently no AAA game that can make use of 16 CPU cores and gain any significant performance improvement from them. Even in the most demanding games, you may gain perhaps 1-2 FPS compared to a typical 8-core CPU.
Game Streaming:
The cores may be useful for game streaming software, though that depends on the software: one program may use the CPU, another the GPU, and a third both.
Yet dedicated low-lag streaming hardware can be bought that provides a constant, high streaming FPS without significant harm to game FPS. The cost of such hardware (around USD $200) is lower than the cost of the "extra" CPU cores (roughly USD $1000 more).
I feel we have personal computers with masses of power that can't really be used and are hamstrung by the operating system?
Yes. There is currently a lot of unused CPU and GPU power in computers and phones. That mostly applies to people who use 4+ core machines primarily for browsing, leaving the cores nearly idle most of the time.
Suggestion for gaming PCs:
Buying such CPUs for gaming is not recommended.
Currently there is no game that can use this power, so it's a waste of money.
In a few years, when games finally can use it, this "old gold" CPU will lack newer technologies (such as the next generation of DDR RAM) that you may want for a next-generation top-tier gaming PC. So, again, it will turn out to be money wasted in the past.
Gaming computers should be upgraded roughly every three years in order to keep running the top games well. If you buy a part (CPU, GPU, or RAM) meant to "last longer" (e.g. five years), it is much better to save that extra money and upgrade the whole computer after three years, to avoid performance bottlenecks caused by the remaining old hardware.