First, let's make sure we're all on the same page:
As a bit of background: when you boot an operating system -- any operating system -- the firmware (BIOS, or the UEFI Graphics Output Protocol) tells it which graphics adapter to use as the primary framebuffer. The primary framebuffer is basically a region of memory that, when written to, causes the graphics driver to initiate a DMA transfer sending the frame data to the graphics output device. In a multi-GPU configuration it's obviously not quite this simple, but the general idea is that, at a basic level, the operating system is only aware of a single framebuffer. For the purposes of this question, monitors plugged into the same graphics card are considered to be driven by the same framebuffer, while monitors plugged into different cards are, by default, driven by different framebuffers. Several technical tricks exist today to bridge the hardware gap between separate framebuffers; hence my question...
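To make the "framebuffer is just a region of memory" idea concrete, here is a minimal sketch in Python. This is a toy model, not real driver code: an actual framebuffer is a memory-mapped hardware region, and the resolution, pitch, and 32-bit BGRA pixel format below are illustrative assumptions.

```python
# Toy model of a linear framebuffer: one contiguous region of memory.
# Real framebuffers are memory-mapped I/O regions whose width, pitch,
# and pixel format come from the driver; these values are assumptions.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4                  # e.g. 32-bit BGRA
PITCH = WIDTH * BYTES_PER_PIXEL      # bytes per scanline (no padding assumed)

framebuffer = bytearray(PITCH * HEIGHT)

def put_pixel(x, y, bgra):
    """Write one pixel: it all reduces to an offset into one region."""
    offset = y * PITCH + x * BYTES_PER_PIXEL
    framebuffer[offset:offset + BYTES_PER_PIXEL] = bgra

put_pixel(100, 50, b'\x00\x00\xff\xff')   # a red BGRA pixel at (100, 50)
```

The point is that "one framebuffer" means one such region: as far as the OS is concerned, putting something on screen is computing an offset into a single block of memory.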
Assume that you have purchased more monitors than you have ports on any one of your graphics cards. For instance, if your graphics card has one port, you have two monitors. If your graphics card has two ports, you have three monitors. And so on.
Also assume that you do not want an Eyefinity or similar setup, where the operating system treats all the monitors as one "big monitor".
You do want to be able to drag the mouse, and windows, between different monitors, seamlessly.
Ways to do this:
Physical graphics card bridging: Nvidia SLI or AMD CrossFire. These solutions allow you to plug your "extra monitor(s)" into a second discrete graphics card. The two graphics cards communicate with one another over dedicated bridge hardware (or, in the case of the latest-generation AMD Radeons, over the PCIe bus).
Platform hardware-assisted framebuffer sharing: Nvidia Optimus, AMD Enduro, LucidLogix Virtu MVP... the concept is the same. You have monitor(s) plugged into one card (usually the motherboard's video output, driven by the processor's iGPU) and monitor(s) plugged into a discrete graphics card. A chip on the motherboard coordinates and synchronizes the two separate graphics cards so that the operating system has the illusion of a single framebuffer, and thus you are able to have your multi-monitor setup. Note that some of these solutions can also control which GPU the frames are rendered on, not just which output the finished frames are scanned out to.
Software?: If neither of the first two hardware solutions is available, there is apparently still a way to do it. For instance, even if your motherboard has neither Nvidia Optimus nor LucidLogix Virtu MVP, and your cards are not in SLI, you can still take, say, an Nvidia GTX 280 and an Nvidia GT 210, plug them into the same machine, and get the same multi-monitor user experience: you can move your mouse and windows between the monitors seamlessly.
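Whichever of the three options is in play, the behavior I observe as a user can be modeled the same way: the monitors are stitched into one virtual desktop coordinate space, and each region of that space is routed to whichever adapter owns it. Here is a rough Python sketch of that idea; the adapter names and monitor geometry are invented for illustration.

```python
# Toy model of a virtual desktop spanning monitors on *different* adapters.
# The adapter names and monitor geometry below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Monitor:
    adapter: str      # which graphics card drives this monitor
    x: int            # left edge in virtual-desktop coordinates
    width: int
    height: int

# Two monitors side by side, each driven by a different card.
desktop = [
    Monitor(adapter="GTX 280", x=0,    width=1920, height=1080),
    Monitor(adapter="GT 210",  x=1920, width=1280, height=1024),
]

def route(x, y):
    """Map a virtual-desktop coordinate to (adapter, local coordinate).

    Dragging the mouse or a window across the seam between monitors
    just changes which adapter's framebuffer receives the writes.
    """
    for m in desktop:
        if m.x <= x < m.x + m.width and 0 <= y < m.height:
            return m.adapter, (x - m.x, y)
    raise ValueError("point is outside every monitor")

print(route(500, 500))    # lands on the first card: ('GTX 280', (500, 500))
print(route(2000, 500))   # same desktop, second card: ('GT 210', (80, 500))
```

What I want to know is what piece of the Windows stack performs this routing when the two framebuffers live on physically separate, unbridged cards.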
My question is: in the third option above, "Software?", how the heck does that work on Windows?
- Is it a feature of the vendor-specific graphics driver?
- Is it built into Windows itself?
- What is the darn thing called?