96

I am studying a beginners' course on the hardware/software interface and operating systems, and the topic often comes up of whether it would be better to replace some hardware parts with software, or vice versa. I can't make the connection.

7
  • 8
    General-purpose CPUs have instruction sets that allow them to execute arbitrary logic operations. Software is compiled to a stream of commands executed by the CPU instruction set. This is an example of doing stuff "in software". Conversely, a software algorithm can be directly executed "in hardware" by creating specialized arrays of logic gates in silicon.
    – Chimera
    Commented Dec 7, 2016 at 23:07
  • 12
    The "connection" is that both software and hardware execute, by different means, logic statements.
    – Chimera
    Commented Dec 7, 2016 at 23:27
  • 14
    It means exactly what it says. Software and hardware do stuff. Sometimes you can do stuff with either software or hardware, so you pick one. Then you wonder whether it would be better to do it with the other one. Commented Dec 8, 2016 at 4:15
  • 4
    Obviously you can't replace all hardware with software - there's no point trying to make a software monitor, mouse, or keyboard. (Unless they're virtual ones) Commented Dec 8, 2016 at 4:16
  • 2
    Consider that we use (software) Operating Systems because: 1) writing an OS directly in hardware would require an incredibly complex chip that would cost a lot; 2) hardware cannot be easily updated, hence a hardware OS would not receive updates. If a security problem is found, it cannot be patched, etc.
    – Bakuriu
    Commented Dec 8, 2016 at 8:21

12 Answers

177

I think the fundamental connection that other answers are missing is this:

Given a general-purpose computer (e.g. a CPU), one can program it to perform pretty much any computation that we have defined. However, specialized hardware may perform the task better, or it may not provide enough value to justify building it.

(this answer is focused on desktop processing and uses examples from that domain)

Replacing software with hardware

If you are old enough to remember PC gaming in the mid-to-late 1990s, you probably remember FPS games like Quake. It started out being "software rendered," meaning the CPU performed the calculations necessary to render the graphics. Meanwhile, the CPU also had to perform input processing, audio processing, AI processing, etc. This was very taxing on CPU resources. In addition, graphics processing is not well-suited to a mainstream CPU (then or now): it tends to be a highly parallel task, requiring many more cores than even a modern high-end CPU offers (8 at the time of writing).

We moved graphics processing from software to hardware: enter the 3dfx Voodoo and Nvidia TNT (now GeForce). These were specialized graphics cards that offloaded processing from the CPU to the GPU. Not only did this spread the workload, providing more computing resources to do the same amount of work, but the graphics cards were also specialized hardware that could render 3D graphics much faster and with more features than the CPU could.

Fast forward to the modern era, and non-CPU graphics are required on the desktop. Even the operating system cannot function without a GPU. It is so important that CPUs actually integrate GPUs now.1

Replacing hardware with software

Back when DVD was brand-new, you could install a DVD drive in your desktop computer. However, the CPUs of the day were not powerful enough to decode the DVD video and audio streams without stuttering. At first, a specialized PCI board was required to perform the decoding. This was specialized hardware that was built specifically to decode the DVD format and nothing else. Much like with 3D graphics, it not only provided more computing resources but was custom-built for the task, making DVD playback smooth.

As CPUs grew much more powerful, it became feasible to decode DVDs "in software," meaning "on a general-purpose computer." Even though a general-purpose processor was less efficient at the task, it had enough raw speed and pipeline optimizations to make DVD playback meet users' expectations.

We now have CPUs hundreds or even thousands of times as powerful2 as we had when DVDs were introduced. When Blu-ray came along, we never needed specialized hardware, because general-purpose hardware was more than powerful enough to handle the task.

Doing both

Modern Intel CPUs have specialized instructions for H.264 encoding and decoding. This is part of a trend where general-purpose CPUs are gaining specialized functions, all in the same chip. We do not need a separate PCI Express board to decode H.264 efficiently, as we did with DVDs early on, because CPUs contain similar circuitry.


1 GPU refers to a processor specifically designed to perform graphical computations. Older 2D graphics cards were not GPUs: they were simply framebuffers with DACs to talk to the monitor. The difference is that GPUs contain specialized processors that excel at certain types of calculations and, as time went on, became programmable themselves (shaders). Graphics hardware has always contained the specialized circuitry necessary to convert the data in a framebuffer into a format that can be output across a cable (VGA, DVI, HDMI, DisplayPort) and understood by a monitor; that is irrelevant to the discussion of offloading computations to specialized hardware.

2 DVD-Video was released in 1997, at a time when the Pentium 2 was also newly released. This was a time when CPUs were rapidly increasing in power: one could consider buying a new P2 computer with a DVD decoder, or installing one in a slightly older P1. Compare that to a modern generation 6 Core i7 using Wikipedia's list of MIPS, and a modern CPU is anywhere between 590 and 1,690 times faster. This is due in part to clock speed, but also to multiple cores becoming standard and to modern CPUs doing a lot more work per core per clock tick. Also relevant is that as technology advances, Intel (which dominates the desktop and x86 server markets) adds specialized instructions to help speed up operations that desktop users want to do (e.g. video decoding).

2
  • Older 2d graphics cards were not GPUs in the sense understood today, but specialized hardware for graphics functions was very much present. A blitter can quickly copy large amounts of memory from one location to another, possibly using some simple logical operation to combine it with the data at the target location. This was present in typical 2d cards of the VGA/SVGA era. Other computers had other specialized 2d hardware, such as hardware sprites in home computers and game consoles of the 80s and 90s. Commented Mar 5, 2019 at 7:13
  • "Fast forward to the modern era, and non-CPU graphics are required on the desktop." Interestingly, on Linux at least, things came full circle. Desktops increasingly started requiring 3D-accelerated graphics, but then Mesa came out with the LLVMPipe implementation, which, while nowhere near as fast as dedicated hardware GL implementations, was a lot better than previous software implementations. Good enough that you could run the fancy 3D desktops even without hardware 3D acceleration. Commented May 10 at 21:26
130

I am surprised nobody mentioned yet one of the most glaring examples: software-defined radio.

If you took a present-day smartphone back in time some 50 years and showed it to a competent engineer from the mid-1960s, he would be able to comprehend most of it. That a supercomputer can be reduced to something that fits in your pocket? Check. That you can have the equivalent of an ultra-high-quality color television in the package? Check. That it is that much faster, has that much more storage, etc., than computers of the era? Check. That software has been written that can perform such complex functions? Check.

But tell that competent engineer that oh, by the way, this package contains a set of extremely efficient transmitters and sensitive receivers: a digital spread spectrum transceiver that can simultaneously transmit and receive on multiple channels, communicating with an infrastructure tower that may be miles away; another digital transceiver that communicates high speed data with a base station somewhere in the building; yet another digital transceiver that communicates with low-power wearable devices; and another receiver that picks up a weak signal from a satellite in intermediate orbit... he would call you a liar.

He would call you a liar because he knows that receivers of such high sensitivity cannot be constructed without a multitude of tuned circuits, which filter out neighboring stations and select the signal of interest. And that such circuits require parts with sizes that are defined more by physics than technology, such as capacitors and inductors.

You would then have to explain that in a modern radio, most of that is done in software. After the signal incoming from the antenna is converted to an intermediate frequency and amplified a little, it is sampled by an analog-to-digital converter, and subsequent processing takes place in a digital signal processor. All that tuning and filtering, which used to require tons of hardware in an old-school high-end radio, can be described in the form of mathematical equations; and if that can be done, those equations can be executed in real time by the DSP.
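
As a deliberately simplified sketch of that idea (my own illustration, not taken from any real SDR stack), here is a finite impulse response (FIR) filter, the kind of multiply-and-accumulate loop a DSP runs in place of a chain of tuned circuits. The coefficient values, which determine what gets filtered out, would come from standard filter-design math:

static double[] FirFilter(double[] samples, double[] coefficients)
{
    // Each output sample is a weighted sum of the most recent input samples.
    // The weights (the filter coefficients) decide which frequencies pass and
    // which are rejected, the job a bank of tuned circuits used to do.
    var output = new double[samples.Length];
    for (int n = 0; n < samples.Length; n++)
    {
        double acc = 0.0;
        for (int k = 0; k < coefficients.Length && k <= n; k++)
            acc += coefficients[k] * samples[n - k];
        output[n] = acc;
    }
    return output;
}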

This, I think, is one of the most glaring examples of software replacing hardware. As a result, we carry smartphones in our pockets that, even to a competent 1960s engineer, would be akin to a magic trick.

Compared to this, the idea that the simple logic of a garage door opener, an electronic bathroom scale or a TV remote nowadays is implemented using a general-purpose microcontroller and software instead of custom hardware almost seems trivial (and it would certainly be a lot more comprehensible to our hypothetical 1960s engineer than software-defined radio.)

8
  • 16
    Underrated answer is underrated. Though I have a small background in electronics and have constructed a few radio devices myself, I never knew that signal tuning was done by software these days. My mind is going bananas trying to imagine exactly how this is done.
    – Machado
    Commented Dec 8, 2016 at 11:18
  • 6
    @Machado If you ever do some serious electronic design, you'll quickly realize that it's basically all mathematical equations. Physical electronics can add, subtract, divide, multiply, differentiate, integrate, etc. It's just that instead of a physical capacitor to filter out all of the low frequency noise, it's done in code. The physical signal gets "loaded" into software through a specialized hunk of hardware called an analog to digital converter.
    – CHendrix
    Commented Dec 8, 2016 at 13:03
  • 6
    @Machado dspguide.com/pdfbook.htm This textbook has a lot of pseudocode examples. It's a gem.
    – brian_o
    Commented Dec 8, 2016 at 16:00
  • 4
    @Machado Don't forget to pick up a software defined radio (for about $20) and start playing with it! Commented Dec 9, 2016 at 5:47
  • 6
    Just don't tell the guy from the 1960s that what you primarily use all that computing power and all those features for is posting pictures of your food, streaming cat videos, and hunting Pokémon... Commented Dec 10, 2016 at 15:31
42

Consider this circuit:

[circuit diagram of a flip-flop]

It is a Flip Flop, aka a Bistable Multivibrator. It can be replaced with this code:

// One bit of stored state, like the flip-flop's output.
static bool toggle;

if (toggle == true)
{
    // One state: top label dark, bottom label lit.
    lblTop.BackColor = Color.Black;
    lblBottom.BackColor = Color.Red;
}
else
{
    // The other state: the colors are swapped.
    lblTop.BackColor = Color.Red;
    lblBottom.BackColor = Color.Black;
}
toggle = !toggle; // flip to the other state, as a trigger pulse would
4
  • 1
    Thank you very much for your answer and for correcting the post! Do you have any book to suggest to learn a little bit more about it?
    – Gabriele
    Commented Dec 7, 2016 at 16:17
  • 11
    Check out Code: The Hidden Language of Computer Hardware and Software. It nicely covers the transition from hardware to software.
    – Igor Milla
    Commented Dec 7, 2016 at 18:27
  • 4
    @igormilla I can vouch for your suggestion. I'm currently reading the book and it is by far the best book on computer architecture I've ever read. It does an excellent job of clearly and concisely explaining each concept, without shoving too much technical jargon down your throat. I'd highly recommend it to anyone who wants a deeper understanding of the relationship between hardware and software.
    – Chris
    Commented Dec 7, 2016 at 21:15
  • 1
    @igormilla, nice find! Happily for me, it's available as part of Safari Books Online, so I can start reading right now. :) (Reading my comment again before posting: it sounds like an advertisement, but I'm just a happy customer. I haven't found any recommended technical book in the last year that I couldn't immediately browse or even read in full online.)
    – Wildcard
    Commented Dec 9, 2016 at 1:30
28

It means exactly what it sounds like.

A particularly famous example is the Disk II Drive designed by Steve Wozniak for the Apple II:

The chief innovation was making the controller compact by using software while competitors relied on hardware. As Bill Fernandez, then an electronic technician at Apple, remembers it, "the key advantage of [Wozniak's] design [was] that it used only six chips instead of the usual 60 to 70."

Another example you're probably more familiar with: Emulators. They replace entire sets of hardware (and software) entirely in software. CPUs, various control chips, even storage devices.

Now, you can't eliminate all hardware; eventually you need something to run the software on. But in general, any logic task you can implement in hardware can also be implemented in software (performance may not be identical: it may be slower or faster, or either in different situations, depending on the underlying hardware and the implementation).
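
To make the emulator idea concrete, here is a minimal sketch (a made-up toy machine, not any real CPU): the hardware's registers and memory become ordinary variables and arrays, and the fetch/decode/execute cycle becomes a plain loop.

static void RunToyCpu(byte[] memory)
{
    int pc = 0;    // program counter: a hardware register, emulated as a variable
    int acc = 0;   // accumulator register

    while (true)
    {
        byte opcode = memory[pc++];                  // fetch
        switch (opcode)                              // decode and execute
        {
            case 0x01: acc += memory[pc++]; break;               // add immediate
            case 0x02: acc -= memory[pc++]; break;               // subtract immediate
            case 0x03: memory[memory[pc++]] = (byte)acc; break;  // store accumulator
            case 0xFF: return;                                   // halt
            default: throw new InvalidOperationException("unknown opcode");
        }
    }
}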

7
  • So the only thing you always need is at least one processor? Commented Dec 7, 2016 at 15:49
  • 3
    @GabrieleScarlatti You'll need a bit more, specifically, I/O devices need to be hardware (but the controllers can be software!), and memory needs to be hardware. The connections between them need to be hardware, but the necessary logic is limited. Simple wires if you have enough pins, slightly less simple shift registers or similar devices if you don't have enough pins.
    – 8bittree
    Commented Dec 7, 2016 at 15:55
  • Ahaha yes I was a little too approximative, can you suggest some good book to learn more about it? Thank you very much for the answers! Commented Dec 7, 2016 at 15:58
  • 1
    The best answer so far in this topic is "emulators".
    – Machado
    Commented Dec 8, 2016 at 11:21
  • 1
    Can't talk about this subject without mentioning the Woz!
    – James R.
    Commented Dec 8, 2016 at 19:33
11

Another field in which this is true is synthesisers.

Early synthesizers were 100% analog hardware that generated waveforms directly then modified them via circuitry (filters, amplifiers, etc.). It was possible to digitally synthesize sound, but it required computing resources that the average person could not afford (an actual mainframe and custom digital-to-analog converter hardware).

As chip fabrication improved, synthesizers shifted from pure analog to synthesizer chips controlled by digital signals but still generating analog signals, and then to pure digital synthesis (sample playback, FM synthesis, true additive synthesis, and so on).

Today, processors are cheap enough and fast enough to allow programmers to create computer versions of classic analog synthesizers that exactly duplicate the behavior of the original circuits by simulating them in real time. In fact, phones and tablets are now fast enough to run these re-creations; the Korg iMS-20 is an example.
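
As a small sketch of what generating the waveform in software means (my own illustration, not taken from any particular plugin), an oscillator stops being a circuit and becomes a loop that fills a buffer with samples:

static float[] SineOscillator(double frequencyHz, double sampleRate, int sampleCount)
{
    // One sample per tick of the (virtual) sample clock. A real synth would also
    // apply envelopes, filters, etc., each of which is just more arithmetic.
    var buffer = new float[sampleCount];
    double phaseStep = 2.0 * Math.PI * frequencyHz / sampleRate;
    for (int i = 0; i < sampleCount; i++)
        buffer[i] = (float)Math.Sin(i * phaseStep);
    return buffer;
}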

Both classic synthesizers and new ones are available as VST or AU plugins for digital audio programs such as Ableton Live, Logic, or Cubase, and these provide access to synthesizers for people who wouldn't otherwise have the space or money to use them.

Edit: I should at this point also mention VCVRack, which simulates analog modular synthesis in realtime. Quite a step forward from multi-hour render times for a few seconds of music.

3
  • 1
    A 1977 Atari 2600 has enough CPU horsepower to generate four-voice music with a five-octave range, even while leaving more than 39% of CPU time available for display generation. The necessary lookup tables would take up more than half of a 4K cartridge (about 2300 bytes), but it was of course possible for cartridges to be larger than 4K. One probably couldn't have terribly complicated gameplay while playing music, but a pretty colorful title screen with a scroll-text would be a definite possibility >:*3.
    – supercat
    Commented Dec 9, 2016 at 16:08
  • Yes, I left out about 10? 20? years of development so my answer wouldn't turn into a novel. Very good points! I used to have something called [Musicworks](thinkclassic.org/viewtopic.php?id=550) on my Mac 512K that could handle 4 voices, barely. Commented Dec 9, 2016 at 21:35
  • The 68000 takes more cycles to execute each instruction than a 6502, but with proper coding a four-voice wave-table synthesis should be fairly efficient. I'd estimate about 240 cycles/sample with amplitude scaling using a 256-byte table for each volume setting; eliminating the amplitude scaling would shave that time by 56 cycles/sample.
    – supercat
    Commented Dec 9, 2016 at 22:49
7

In former times, the dividing line was quite clear. Most things that needed speedy execution had to be implemented in hardware. Take, for example, a multivibrator which generates a frequency. Not too long ago you needed a couple of transistors, capacitors and possibly a quartz crystal to generate a (fixed) frequency. Now there are cheap microcontrollers that cost only a few cents. Since they are so fast, you can easily use them to create a multivibrator. Moreover, you can easily control via software what frequency to generate, whereas in former times you needed to solder in different hardware. Above a certain (but now rather high) frequency, though, you'd still need pure hardware. So you see, there is a line between the two, but the part you can solve with software is growing (exponentially).
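
As a rough sketch of that idea (the pin and delay operations are passed in as hypothetical placeholders, since every microcontroller framework names them differently), note that the frequency is now just a number in a variable rather than a property of soldered-in parts:

static void SquareWave(double frequencyHz, Action<bool> setPin, Action<double> delayMicroseconds)
{
    // setPin and delayMicroseconds stand in for whatever GPIO and timer calls a
    // particular microcontroller framework provides (hypothetical placeholders).
    double halfPeriodUs = 500000.0 / frequencyHz;    // half of one period, in microseconds
    bool level = false;
    while (true)
    {
        level = !level;
        setPin(level);                   // toggle the output pin
        delayMicroseconds(halfPeriodUs); // wait half a period, then toggle again
    }
}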

Edit: Actually, “software can replace hardware” is not really correct. It's just that hardware has become so powerful that you can use it to run software which emulates hardware. So instead of a few simple but statically soldered transistors, you use millions of transistors that understand software. So the phrase should really be "hardware can now understand software" instead.

4
  • To be fair though, nearly all microcontrollers have (at least one) general-purpose PWM module, implemented in hardware. So it is rare to bit-bang a frequency output using software running on the CPU proper. Commented Dec 9, 2016 at 23:32
  • @LyndonWhite A PWM is not a general multivibrator. It modulates the pulse width, which is a subset of frequency generation.
    – user188153
    Commented Dec 10, 2016 at 8:33
  • 1
    Indeed this is true. My point was rather that PWM is by far the most common type of signal one wants to generate (within some range of duty cycles), which is why microprocessors have special hardware for it. I would probably put PWM vs. bitbanging up as an example of the opposite: "hardware can replace software". Commented Dec 10, 2016 at 8:45
  • @LyndonWhite I made an edit to my answer.
    – user188153
    Commented Dec 14, 2016 at 11:04
5

A comparison between the arcade game Tank (circa 1976) and the home console game Combat (1977) yields a nice example of how software could replace hardware even 40 years ago.

The arcade game Tank (circa 1976) allowed two players to drive tanks around and shoot at each other. It did not include any sort of processor, but instead had hardware counters to keep track of the horizontal and vertical positions of the electron beam, tanks, and shots, as well as the players' scores, rotational angles, and the elapsed time. It had hardwired logic to output the bitmap data associated with the scores, the players' tank shapes, and the background.

The Atari 2600 Video Computer System (a home game console circa 1977) included hardware to track the horizontal (but not vertical!) positions of two bitmap objects and four variable-width pulse generators, hold and clock out a 20-bit-wide low-resolution playfield graphics pattern as well as two high-resolution 8-bit patterns, latch colors for the players, background, and playfield, and detect collisions among the various objects. It also included a general-purpose programmable timer, but the hardware had little other than the above. Nonetheless, even though the hardware is much simpler than that of the game Tank, the 2K ROM cartridge Combat allows the 2600 to play the same basic game but with many other features (a variety of vehicles and backgrounds, bouncing shots, etc.) because it can replace most of the arcade machine's hardware with software. Interestingly, even though the Atari 2600 is probably the second-simplest hardware platform of any commercially-sold microprocessor-based home video game system, it is so well designed to facilitate replacing hardware with software that when programmed correctly it can run circles around many of its competitors.

2
  • I spent a lot of hours and quarters playing Tank. A friend of mine restores those old game consoles and I was very surprised to learn that it is all hardware. Someone spent a lot of time doing Karnaugh maps to make it manageable. Replicating it in software is much easier, and requires less maintenance. Same with Asteroids. But neither ends up with the same feel unless you replicate it exactly, including the console controls. Plus, vector graphics just don't look the same on raster displays, IMO.
    – SDsolar
    Commented Dec 11, 2016 at 21:12
  • @SDsolar: I remember reading a sheet, published I think by Atari, with some mods that an owner could do to tweak various aspects of gameplay. I think two mods involved holding the latch for the player's shot angle enabled (adding a "guided missile" feature) and maybe adding something like an invisible-tank feature. Changing firmware would have required more complicated equipment, and some kinds of mods that would be easy in hardware would be impossible in firmware (e.g. if the player position uses a binary counter, having a player appear twice per scan line is trivial).
    – supercat
    Commented Dec 11, 2016 at 21:19
1

The phrase "software can replace hardware" is a warning not to try to solve problems with hardware unless there are very clear advantages. Software is 10x-50x cheaper to develop, and almost infinitely cheaper to produce per unit, than hardware. Doing X in hardware will not be a winning solution unless X really can't be done efficiently in software.

7
  • I didn't downvote, but I'm pretty sure this is inaccurate.
    – J. Allan
    Commented Dec 10, 2016 at 16:25
  • @JefréN. That's just about buying hardware rather than developing it. Commented Dec 10, 2016 at 18:21
  • You're right; I misunderstood the intent of the question. Do you have a citation or link to back your assertion that "software is 10x-50x cheaper to develop ... than hardware."? I'd be interested in knowing if that is a ballpark figure or if that is a [commonly accepted/verified] statement. (I'm sorry you're sad, btw. ;D)
    – J. Allan
    Commented Dec 10, 2016 at 18:30
  • I didn't downvote but I can't upvote because it is not always true that software is cheaper. Often it can be much more expensive. Also, consider the concept of ASICs and FPGAs, where software is used to create a hardware equivalent which then can run faster. Like the ultimate difference between executables vs interpreted programs, but more so.
    – SDsolar
    Commented Dec 11, 2016 at 20:59
  • 1
    @SDsolar I haven't heard of a case where the same feature is more cheaply implemented in hardware than software. If it is feasible to do in software, it will almost universally be cheaper to do it in software. Of course, e.g., a software renderer will run into performance problems sooner when compared to a graphics card. But that goes more to the feasibility of a software implementation for achieving a certain baseline of performance. Commented Dec 11, 2016 at 22:25
1

The nuance has been well tackled, but I think the stumbling block for the OP may be that it is very much not possible to replace hardware with software: the software always needs hardware to run on. In fact, the "hardware" solution invariably involves significantly less hardware than the "software" solution.

The difference is that the logic of a process/algorithm/computation can be moved between hardware and software. Many examples have been given, so I won't elaborate.

-1

In early computers with virtual memory, you had to make a task switch upon a TLB miss to load a new page entry. A piece of OS software would find the correct process and walk through the page tables, finding the correct entry and writing it back to the TLB, before switching back to the original process to continue.

Now most CPUs use hardware to do this: reading the page-table base, walking the page tables, and updating the TLB.

Both methods need to use software to handle page faults, but as TLB misses handily outnumber page faults, the hardware walk still outperforms the software approach.
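
For a feel of what the software handler actually does, here is a simplified sketch (my own illustration, assuming a hypothetical 32-bit machine with 4 KiB pages and a two-level page table). A real handler would also write the result into the TLB and restart the faulting instruction:

static uint TranslateOnTlbMiss(uint virtualAddress, uint[][] pageDirectory)
{
    uint dirIndex   = virtualAddress >> 22;            // top 10 bits: page-directory slot
    uint tableIndex = (virtualAddress >> 12) & 0x3FF;  // next 10 bits: page-table slot
    uint offset     = virtualAddress & 0xFFF;          // low 12 bits: offset within the page

    uint[] pageTable = pageDirectory[dirIndex];        // first memory access of the walk
    if (pageTable == null)
        throw new InvalidOperationException("page fault: address not mapped");

    uint frameNumber = pageTable[tableIndex];          // second memory access of the walk
    return (frameNumber << 12) | offset;               // physical address
}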

In general, if you have a simple procedure that you need to handle repeatedly, you can find a hardware replacement. If you have a complex hardware solution with a complicated control flow, you can simplify the hardware by using software.

-2

There are many instances where software can replace hardware and vice versa.

A classic example of this is a math lookup table. Instead of calculating the results of common expressions each time, they are stored internally in your math co-processor and simply looked up when needed.
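
As a toy sketch of the trade-off (my own example, not how any particular co-processor actually lays out its tables): compute the values once up front, then replace computation with an array read.

static readonly double[] SineTable = BuildSineTable(1024);

static double[] BuildSineTable(int size)
{
    var table = new double[size];
    for (int i = 0; i < size; i++)
        table[i] = Math.Sin(2.0 * Math.PI * i / size);   // pay the computation cost once
    return table;
}

// Nearest-entry approximation for angles in [0, 2*pi): one array read instead of
// recomputing the sine every time it is needed.
static double FastSine(double radians)
{
    int index = (int)(radians / (2.0 * Math.PI) * SineTable.Length) % SineTable.Length;
    return SineTable[index];
}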

Most are probably already familiar with audio filters, and software that can mimic real instruments and devices like pedals or amplifiers.

Any hardware that can be created virtually will be used, if it's faster and/or cheaper than the physical equivalent.

-2

In accounting, an invoice (for instance) that used to be delivered as a hard copy can now be sent electronically, and software is increasingly handling the receipt and processing of this kind of paperwork. It is an excellent example of hardware being replaced by software.
