
I remember when Windows 95 came out and the 'Add New Hardware Wizard' seemed almost magical! It was able to detect a large amount of hardware on the COM ports and automatically install drivers. It was a miracle that you no longer had to fuss about with device driver diskettes, DMA addresses, etc.

I assume it probed the ports or hardware registers looking for device signatures (e.g. specific responses to AT commands), but with all the different baud rates, diverging implementations of AT commands, and other difficulties getting hardware to identify itself, it still strikes me as a marvel. Is this how it worked, or was the technology behind it more mundane?
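
For illustration, the kind of probe described above boils down to writing the Hayes "AT" command to a serial port and seeing whether anything answers "OK". The sketch below shows that idea in modern POSIX C purely for clarity; it is not the actual Windows 95 detection code, and the device path, baud rate, and timeout are assumptions made for the example.

```c
/* Illustrative sketch only: probe a serial port for a Hayes-compatible
   modem by sending "AT\r" and looking for "OK" in the reply.
   The device path, baud rate, and timeout are assumptions. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <termios.h>
#include <unistd.h>

static int probe_modem(const char *dev)
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return 0;

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B9600);   /* a real probe would try several rates */
    cfsetospeed(&tio, B9600);
    tio.c_cc[VMIN]  = 0;
    tio.c_cc[VTIME] = 10;       /* roughly a one-second read timeout */
    tcsetattr(fd, TCSANOW, &tio);

    write(fd, "AT\r", 3);       /* Hayes "attention" command */

    char buf[64] = {0};
    ssize_t n = read(fd, buf, sizeof(buf) - 1);
    close(fd);

    return n > 0 && strstr(buf, "OK") != NULL;
}

int main(void)
{
    /* "/dev/ttyS0" is roughly COM1 on Linux; purely an example path. */
    printf("modem %s\n", probe_modem("/dev/ttyS0") ? "found" : "not found");
    return 0;
}
```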

  • As I understand it, it worked by "sheer force of will". There were giant test labs with thousands of different machines, people installing and uninstalling hardware continuously, random Microsoft employees going to the local Best Buy or whatever and buying one of every single peripheral device on the shelf, and so on. As far as technical details go, they used every trick in the book to narrow down and identify what you had just installed. It was indeed a marvel that it worked as well as it did, but a lot of people put a massive amount of effort into it. Commented Oct 26, 2022 at 3:28
  • I would assume that the Add New Hardware wizard did something similar to what modern OS kernels do during system boot: stackoverflow.com/questions/18854931/… (with the exception of probably relying more on ISA PnP because PCI market penetration was not that large in 1995 IIRC)
    – DmytroL
    Commented Oct 26, 2022 at 9:12

1 Answer


For the first computers, there were no standards. Every manufacturer built whatever they wanted, and many devices were locked into specific form factors and the like. You couldn't use a Tandy CPU with a Commodore motherboard, for example. Eventually, as the industry grew, manufacturers collectively decided that interoperability meant consumers would have more choices, and therefore manufacturers could be more competitive, because consumers could choose from various parts depending on their preferences for quality and features without being locked into a single vendor.

In the earliest days of computing, computer configuration was manual, literally to the point where you'd move a "jumper" to connect a circuit electrically or twist a few wires. For example, old hard drives had a cable with a twist in it; if you attached the hard drive below the twist, it became the primary drive, and above the twist it became the secondary drive (historically, "master" and "slave").

Newer drives supported being told which drive was which. This was done through a jumper, a small black rectangle with metal contacts inside that completed one of three circuits, labelled "master", "slave", and "cable select". If you had a twisted cable, you'd use the Cable Select setting on both drives. If you had a straight cable, you would use the other two settings, with the primary drive being whichever one you wanted. If both drives were set to the same mode, the computer might not be able to read either; this was the earliest kind of conflict, a physical mismatch of settings. Later still, the standards evolved to the point where the BIOS itself would tell each drive which designation it would receive.

In the years leading up to Windows 95, manufacturers started to build hardware that was configured through software. This was accomplished through the BIOS, which could detect these devices and configure them, or allow manual setup through the BIOS setup screen. The BIOS interface itself was standardized and was accessed from the OS, typically DOS, through specific interrupt calls. This allowed a single program to run on a wide range of hardware, as long as the CPU was compatible.
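
As a concrete illustration of that interrupt mechanism, here is a rough sketch of how a DOS-era C program might ask the BIOS for a hard drive's geometry through INT 13h, service 08h. It assumes a 16-bit DOS compiler such as Turbo C that provides int86() in <dos.h>; treat it as a sketch rather than production code.

```c
/* Sketch: query hard drive geometry through BIOS INT 13h, AH=08h.
   Requires a 16-bit DOS compiler (e.g. Turbo C) that supplies <dos.h>
   and int86(); it will not build with a modern compiler. */
#include <dos.h>
#include <stdio.h>

int main(void)
{
    union REGS r;

    r.h.ah = 0x08;        /* service 08h: get drive parameters */
    r.h.dl = 0x80;        /* drive 80h: first hard disk */
    int86(0x13, &r, &r);  /* call the BIOS disk services interrupt */

    if (r.x.cflag) {
        printf("BIOS reported an error (status %02X)\n", r.h.ah);
        return 1;
    }

    /* The cylinder count is split across CH and the top two bits of CL. */
    unsigned cylinders = (((r.h.cl & 0xC0) << 2) | r.h.ch) + 1;
    unsigned heads     = r.h.dh + 1;
    unsigned sectors   = r.h.cl & 0x3F;

    printf("Geometry: %u cylinders, %u heads, %u sectors/track\n",
           cylinders, heads, sectors);
    return 0;
}
```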

As an example of this consolidation: at one point, you had to tell a program whether you had a Sound Blaster, an AWE, or something else. By the time Windows 95 came around, the OS already knew how to route the audio you wanted to whichever card was installed. Programs only had to worry about outputting audio, not about which card would play it, which was better for developers since they no longer had to distinguish between cards. Standardization made more software accessible to more people.
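
To make the contrast concrete: under DOS, a program was typically told about the sound card by hand, often through the well-known BLASTER environment variable (for example "A220 I5 D1 T4"). The sketch below parses that variable the way a DOS-era program might have; the exact fields read and the defaults are illustrative assumptions, not taken from any particular title.

```c
/* Sketch: how a DOS-era program might discover Sound Blaster settings
   from the BLASTER environment variable (e.g. "A220 I5 D1 T4").
   Field handling and defaults are illustrative; real programs varied. */
#include <stdio.h>
#include <stdlib.h>

struct sb_config {
    unsigned port;   /* base I/O address, e.g. 0x220 */
    int irq;         /* interrupt line, e.g. 5 */
    int dma;         /* 8-bit DMA channel, e.g. 1 */
};

static int parse_blaster(struct sb_config *cfg)
{
    const char *env = getenv("BLASTER");
    if (!env)
        return 0;                       /* not set: the user must configure by hand */

    cfg->port = 0x220; cfg->irq = 5; cfg->dma = 1;   /* common defaults */

    for (const char *p = env; *p; p++) {
        switch (*p) {
        case 'A': sscanf(p + 1, "%x", &cfg->port); break;  /* address */
        case 'I': cfg->irq = atoi(p + 1);          break;  /* IRQ */
        case 'D': cfg->dma = atoi(p + 1);          break;  /* DMA */
        default:  break;                /* ignore other fields and spaces */
        }
    }
    return 1;
}

int main(void)
{
    struct sb_config cfg;
    if (parse_blaster(&cfg))
        printf("Sound Blaster at port %03Xh, IRQ %d, DMA %d\n",
               cfg.port, cfg.irq, cfg.dma);
    else
        printf("BLASTER variable not set\n");
    return 0;
}
```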

Most of the pieces were already in place when this revolution came around. All Microsoft had to do was combine everything into a single package that Windows 95 could use. That single package was called Plug and Play, or PnP for short. The Add New Hardware wizard had a few layers. First, it could detect PnP-compatible hardware, which would use standard interfaces. Next, there was a suite of standard drivers for well-known interfaces. After that, the wizard would check for installed drivers, and select one of those, if appropriate. Finally, the user could manually configure a driver, usually provided by the manufacturer, or they could attempt to use one of the generic drivers, though they might not work.
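
Expressed as a sketch, that layered strategy is roughly the fallback chain below. Every function and string in it is a hypothetical stand-in invented for illustration; only the ordering of the four steps mirrors the description above.

```c
/* Sketch of the layered driver-selection flow described above.
   All names are hypothetical stand-ins; only the ordering of the
   four steps reflects the behaviour described in the answer. */
#include <stdio.h>

/* Hypothetical helpers, stubbed so the sketch runs. */
static const char *enumerate_pnp_device(void)            { return "example-pnp-id"; }
static const char *match_standard_driver(const char *id) { (void)id; return NULL; }
static const char *match_installed_driver(const char *id){ (void)id; return "driver already on disk"; }
static const char *prompt_user_for_driver(void)          { return "driver from manufacturer diskette"; }

int main(void)
{
    /* 1. Detect PnP-compatible hardware through its standard interface. */
    const char *id = enumerate_pnp_device();

    /* 2. Try the built-in suite of drivers for well-known interfaces. */
    const char *driver = id ? match_standard_driver(id) : NULL;

    /* 3. Fall back to a driver that is already installed. */
    if (!driver)
        driver = match_installed_driver(id);

    /* 4. Finally, let the user pick a manufacturer-supplied (or generic) driver. */
    if (!driver)
        driver = prompt_user_for_driver();

    printf("%s -> %s\n", id ? id : "(nothing detected)", driver);
    return 0;
}
```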

Microsoft didn't have to build everything from scratch; they just had to bring it all together in one place. There were limitations to the strategy. DOS was still running underneath Windows at the time, so a user with TSR (Terminate and Stay Resident) programs or DOS drivers installed might introduce conflicts or other system instability. Windows was more or less a very complicated and sophisticated DOS program, rather than an OS in its own right, and it was still at the mercy of the BIOS configuration as well.

Even the "Scanning for new hardware" feature would occasionally run into a card it couldn't detect properly and freeze the entire computer, forcing a hard reboot to recover. There was a system in place that allowed Windows to remember that device and ignore it next time, but that meant the hardware probably wouldn't work. It was entirely plausible to open up a box, install something, and then spend countless hours trying to figure out why it didn't work, only to realize you didn't have the correct driver for it.

Formal hardware testing didn't come about until WHQL (Windows Hardware Quality Labs), from Windows XP onwards, where Microsoft took a more aggressive approach to quality control. Before then, you could plug in whatever you wanted, and Windows would do its best to be compatible, but if that failed, you were out of luck. With WHQL, Microsoft could require manufacturers to follow the standards it set or fail to be certified. Certified hardware was seen as a commercial advantage, so most manufacturers followed that path.


So, what does this all mean? It mostly means that the computing industry was already converging on various standards, and Microsoft jumped in at the right time to add the final touch that brought everything together. It wasn't perfect, but it was accessible to the masses, and it really drove computing forward in a way that probably would have taken much longer if not for the feature that made it all possible: the PnP system. It took a concerted effort by the entire computing industry to make this magic possible. I guess you'd say it was mundane in nature, yet it took the effort of everyone involved to make it a reality.

  • Note that Microsoft didn’t just tie everything together in Windows 95; they initiated many of the standards that were involved. For example, the BIOS extensions and hardware features involved were mandated by various Plug and Play specifications, which were created by Microsoft and Intel. There were also various branding initiatives with onerous requirements before WHQL, such as “designed for Windows 98”. Commented Oct 26, 2022 at 10:32
  • 3
    Technically, DOS wasn't "running under Windows 95". It was "just an extremely elaborate decoy". See What was the role of MS-DOS in Windows 95?. Basically, Windows 95 would virtualize DOS for TSRs and 16-bit drivers to run in, then indirect Windows's hardware access through the DOS dummy if the relevant interrupt vectors had gotten hooked by something. Using DOS as the bootloader for Windows 95 was a necessary part of allowing that hooking to take place so the compatibility goal could be achieved.
    – ssokolow
    Commented Oct 26, 2022 at 11:03
What type of controller used a "cable twist" to decide on master/slave? I have seen this on floppy drive cables for drive A/B selection, but am not aware of it for hard disk use.
    – Edders
    Commented Oct 26, 2022 at 11:06
  • 2
    For context, Cable Select was introduced when IDE got standardized and it's accomplished by disconnecting pin 28 on the slave drive (as diagrammed here). Apparently the 80-pin cables handle that within the connectors as a necessity of swapping the master to the end of the cable to resolve signal reflection issues in single-drive configurations.
    – ssokolow
    Commented Oct 26, 2022 at 11:16
