60
$\begingroup$

I was born in 1968 and used dial-up to mainframes back before PCs were available. We got an Apple ][+ when they came out and thought 48K was a lot of memory.

I live in constant wonder at the world we are now in, which seems like science fiction to my childhood self. How can I convey to students born around the turn of the millennium how amazing this growth has been?

I tell them about Moore's Law, of course, and that, when I was young, we paid 25 cents to play Asteroids, back when 25 cents was a lot of money, but I know I sound like Grampa Simpson. I share that a single Google search uses more compute power than all of the Apollo missions, but even I can't wrap my head around that.

How do you convey to students some of the wonder that you feel (as part of motivating them to appreciate and learn computer science)?

$\endgroup$
19
  • 5
    $\begingroup$ One thing you might do is show the video by Bret Victor called, "The Future of Programming" to show what has NOT changed since 1968 - our attitudes towards programming and what computers can do. $\endgroup$
    – user737
    Commented Aug 4, 2017 at 18:36
  • 15
    $\begingroup$ Can I play devil's advocate here for a second: why do you need to convey this at all? $\endgroup$ Commented Aug 4, 2017 at 21:26
  • 6
    $\begingroup$ @KevinWorkman I don't need to, but I want to share my sense of wonder. Also, I teach computer architecture, where Moore's Law is part of the curriculum. $\endgroup$ Commented Aug 5, 2017 at 2:14
  • 7
    $\begingroup$ One thing is to look at the TOPS-100 list and see what is the last year your desktop PC would qualify as a supercomputer… and how few more years when it exceeds the sum of all supercomputers in operation. $\endgroup$
    – JDługosz
    Commented Aug 5, 2017 at 6:17
  • 3
    $\begingroup$ You don't even need to go back to the 60s to get that sense of wonder. Just the other day, I noticed that one of the research machines was almost out of memory; it only had 400 MB free. I literally stopped in awe for a few minutes after realizing what I was thinking of as "almost no memory": that was the size of my first computer's hard drive. $\endgroup$
    – Ray
    Commented Aug 7, 2017 at 21:10

25 Answers 25

54
$\begingroup$

This is really difficult to communicate to anyone who hasn't lived through it (and even to those of us who have).

I don't usually go back as far as the 60s. I show my students a picture of the ASCI Red supercomputer from ~1997, which was the first supercomputer able to perform 1 trillion floating-point operations per second (1 TFLOPS). It's basically the size of a warehouse floor, with another floor consumed by its climate-control system. It cost ~46 million dollars.

Then I show my students a picture of a typical rack of servers from 2008, which could perform 2-4 TFLOPS. In a decade, the size had shrunk from a warehouse to a single rack, the computing power had doubled or better, and the cost was about $200K.

Finally, I show my students a picture of an Nvidia K80 GPU card, which can perform over 2 TFLOPS. In less than a decade, the form factor has shrunk to a card you can put inside a desktop computer, with the same computing power. The launch price was about $5K.

That is in less than 20 years. Extrapolating from that back another 20 years only gets you to 1978; you have to go 10 years beyond that to reach 1968. It's really hard to communicate the extent of this change.

And that's just the change in raw computational power, ignoring other advances like the Internet...

$\endgroup$
12
  • 33
    $\begingroup$ If an image helps, here is a 5mb hard drive in 1956 being loaded on to a truck. $\endgroup$
    – BruceWayne
    Commented Aug 5, 2017 at 2:18
  • 2
    $\begingroup$ And here is a 5MB drive being loaded into an airplane in 1956 (same drive?) reddit.com/r/pcmasterrace/comments/2xtxpb/… i.imgur.com/m1cndmB.png $\endgroup$ Commented Aug 5, 2017 at 5:20
  • 31
    $\begingroup$ If you want to go a bit further back, I like to compare with the iconic Cray-1, the world's fastest computer as of 1976, which produced a stunning 160 MFLOPS, weighed over 5 tonnes, cost \$8 million, and used enough power that it had to be liquid-cooled in a fluorinated refrigerant. Compare that with the Raspberry Pi Zero, which fits across the fingers of your hand, costs \$8, is more than 10 times as fast, has 32 times as much memory, and doesn't even need a heatsink. $\endgroup$
    – Jules
    Commented Aug 5, 2017 at 12:36
  • 11
    $\begingroup$ @BruceWayne When that pic was posted to Imgur a while ago, someone joked that the size was because it contained a midget with a chisel and a supply of stone tablets. The mindblowing part is that 5MB is such a low number that would actually work. $\endgroup$
    – JollyJoker
    Commented Aug 7, 2017 at 8:54
  • 2
    $\begingroup$ @OrangeDog Say each tablet is the size of an iPad. Half a cubic meter can hold a thousand of those. 2500 bytes per side should be easily doable for the midget with a chisel. $\endgroup$
    – JollyJoker
    Commented Aug 7, 2017 at 13:57
33
$\begingroup$

Others have mentioned volume of a system and I think that's a great place to start. I have always thought of it like this:

According to Wikipedia, in 1971 the 4004 could perform roughly 75,000 instructions per second. So if you had bought one when it was first announced, on November 15, 1971, and been able to keep it running continuously on some task, you would by now have performed about 1.08 × 10¹⁴ operations.

If you purchased an Intel Core i7-5960X and started it running continuously on some task ten minutes ago, it would already have performed about 1.43 × 10¹⁴ operations.

Modern GPUs can perform even more operations per second, so in reality it would probably take about 2 minutes to match the 40+ years of the 4004.
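
If you want those numbers to come from somewhere reproducible, here is a minimal back-of-the-envelope sketch in Python. The 75,000 ops/s figure is the one quoted above; the ~238 GFLOPS sustained rate for the i7-5960X and the cut-off date are assumptions you can adjust:

```python
# Back-of-the-envelope check of the 4004 vs. modern CPU comparison above.
from datetime import datetime

INTEL_4004_OPS_PER_SEC = 75_000        # figure quoted above
MODERN_CPU_OPS_PER_SEC = 238e9         # assumed sustained rate for an i7-5960X

# Seconds from the 4004's announcement to (roughly) when this answer was written.
elapsed = (datetime(2017, 8, 5) - datetime(1971, 11, 15)).total_seconds()
ops_4004_total = INTEL_4004_OPS_PER_SEC * elapsed

# How long the modern chip needs to match ~46 years of continuous 4004 work.
catch_up_minutes = ops_4004_total / MODERN_CPU_OPS_PER_SEC / 60

print(f"4004, running since 1971: {ops_4004_total:.2e} operations")
print(f"Modern CPU catches up in roughly {catch_up_minutes:.0f} minutes")
```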

Think of some task your students will be interested in like an image filter and figure out how many operations it takes. Then divide it and say: "If you wanted to apply a sepia filter to your collection of 1000 images and you bought a computer in 1971 and left it running, it would almost be done. If on the other hand, you had waited until the beginning of this class, it would now be complete."

Basically, it's usually faster to wait for new hardware than start a task on old hardware and leave it running.

Sources: Wikipedia (Don't have enough rep for 2 more links) http://scottsoapbox.com/2015/08/15/how-far-weve-come-40-years-of-processing-power/

$\endgroup$
7
  • 1
    $\begingroup$ Welcome to Computer Science Educators! This is an interesting comparison. I hope we'll be hearing more great content from you. $\endgroup$
    – ItamarG3
    Commented Aug 5, 2017 at 18:42
  • 4
    $\begingroup$ And this is why we don't have interstellar craft (well, one reason...) - the same principle: if we left today and it took 40 years to get there, we could instead leave in 35 years' time and it would only take 3 years to get there. $\endgroup$
    – fdomn-m
    Commented Aug 7, 2017 at 16:28
  • 2
    $\begingroup$ @freedomn-m this assumes that there will be a civilization capable of building an interstellar craft in 35 years time, so in this case it might be favourable to launch a craft now and get there in 40 years :) $\endgroup$
    – Lee F
    Commented Aug 8, 2017 at 10:31
  • 1
    $\begingroup$ The Human Genome Project is a real-life example for this phenomenon. What used to be the work of years can now be done in what? A few days? $\endgroup$ Commented Aug 8, 2017 at 14:36
  • 3
    $\begingroup$ @freedomn-m So, faster-than-light transportation technology is going to be invented within the next 35 years? Travel time in space has not experienced regular massive improvements like processor performance has. It took months to reach Mars in the 70s, it takes months to reach Mars today. $\endgroup$
    – 8bittree
    Commented Aug 8, 2017 at 22:53
15
$\begingroup$

I (barely) remember when the first iPhone came out. I remember playing this skeeball app on it, and really enjoying it. I thought it was so cool, and fast, and I thought the phone was sleek and small.

Fast forward - I use that iPhone as an alarm clock (it can't do much else - we took it off our "plan" or whatever it's called a while ago). I've played the Skeeball game since then. It's slow as all get out, and it looks much thicker and uglier than the shiny new smartphones common among teens now.

I remember using an older Mac computer that was my mom's when I was younger. Now I have a chromebook that's pretty fast, and dual boots Linux and Chromium.

I have an Arduino the size of your hand that can run LEDs and motors and whatever else you want it to do. You can buy a Raspberry Pi for \$30 (of course, you have to hook it up to a monitor, keyboard, and mouse, but still). The chromebook we got on sale for $100. I remember looking up prices for old computers and being shocked.

Even in your students' lifetimes (they're older than I am!) they'll have some memory of computers getting faster, cheaper, nicer, better. Tell them to extrapolate that back a couple of years. And then a couple of years more. Show them where they fall on a graph of Moore's Law - and ask them to imagine the start of it.

(EDIT)

I see you live in San Francisco. Send those young whippersnappers down to the Computer History Museum in Mountain View on a Saturday to see one of the demonstrations of the IBM 1401 (and tell them to look through the galleries while they're at it; it's a great museum, probably in the top ten I've ever been to, and I've been to a lot of museums).

$\endgroup$
6
  • 4
    $\begingroup$ Completely agree. OP makes it look like technology hasn't progressed at all during the lifetimes of its students. Trying to place himself in an elitist position of "I was the only one seeing the evolution." $\endgroup$
    – Fez Vrasta
    Commented Aug 5, 2017 at 11:15
  • $\begingroup$ FYI: Even if your mother's old Mac was a PowerPC, it possibly would have supported dual boot, which doesn’t have to do much with computing power. $\endgroup$
    – idmean
    Commented Aug 7, 2017 at 19:55
  • 1
    $\begingroup$ @idmean sure, yeah, I just meant that the chromebook - a simpler, lower memory computer that's pretty small, light, and cheap - can do a lot. $\endgroup$
    – auden
    Commented Aug 7, 2017 at 20:01
  • 1
    $\begingroup$ @FezVrasta FYI, OP is a she and never said technology hasn't progressed in students' lifetimes. Moore's Law didn't end when they were born. $\endgroup$ Commented Aug 9, 2017 at 1:29
  • 1
    $\begingroup$ The science museum in London is pretty great. Technology going back to Faraday's coil that he built to study magnetic induction 200 years ago. $\endgroup$
    – Scott Rowe
    Commented Jan 15, 2022 at 18:41
11
$\begingroup$

Robert A. Heinlein's The Moon is a Harsh Mistress contains passages about sneaker-netting printouts of program code from one location to another and re-typing it into the target system.

It seems ridiculous today, but I guess it was way beyond bleeding edge when written.

I'd say a (dramatic?) reading and a discussion about

  1. why it seems so silly today
  2. why it wouldn't have seemed silly in 1966

would be a good jumping off point.

Of course, it's quite possible there are other passages from now-classic literature that would hew a little more closely to your topic, but this is my favorite, and so the one that comes to mind.

$\endgroup$
11
  • 8
    $\begingroup$ Actually, sneaker-netting disk drives is still sometimes the fastest data transmission protocol. $\endgroup$
    – Buffy
    Commented Aug 4, 2017 at 18:48
  • 4
    $\begingroup$ Since you mentioned paper specifically: xkcd.com/1162 Be sure to hover the mouse to read the title text. $\endgroup$
    – Buffy
    Commented Aug 4, 2017 at 19:23
  • 2
    $\begingroup$ I remember reading a (Heinlein?) novel in which astrogators used slide rules on space craft. $\endgroup$ Commented Aug 5, 2017 at 0:53
  • 7
    $\begingroup$ Forget 1966. Look at the average "enthusiast" home computing magazine of, say, 1986. You'd get whole listings, usually in BASIC (with or without machine code interspersed), for games; usually a few different variants for different computers. (So you might get ones for the same game for, say, C-64, IBM PC, and maybe one or two others.) Readers were actually expected to type those back in, just to play a game that we'd barely glance at today. $\endgroup$
    – user
    Commented Aug 7, 2017 at 14:14
  • 7
    $\begingroup$ @MichaelKjörling - In the late 70's-early 80's there used to be a minor industry of publishing books full of BASIC source code (mostly games) like cookbooks. I owned several of those. Of course BASIC is not particularly portable, and humans make typos. Getting the result running is how an entire generation of us learned programming. $\endgroup$
    – T.E.D.
    Commented Aug 7, 2017 at 16:04
11
$\begingroup$

As someone in their 20s who has not actually lived through the history of programming and computer technology but can still very much appreciate it, here's a handful of things that have helped me appreciate the progress of computers:

  • Watching (and laughing at) various episodes of "The Computer Chronicles" (a handful can be found on YouTube), particularly the ones where they demonstrate cutting-edge technology (I especially like the one where they're demoing what is basically Paint 1.0 for some system or other). Similar old television programmes discussing 'modern' technology should get similar results (e.g. BBC's "The Computer Programme" and "Micro Live"). I quite like the ones from the 80s because that was when computers were starting to become commercialised. Usually people's reactions and interest say it all.
  • Going to a computer museum and actually trying out some old systems. Some were a bit broken, but having a proper old terminal to play with was fun (apart from the means of editing programs, it's not that different to modern command lines).
  • Often simply being told the facts, like learning that the first 'compiler', Grace Hopper's A-0, was hardly more than an assembler or loader, with mostly a one-to-one correspondence to machine instructions (and that was a massive step up from switches and punched cards).*

To be honest though (and some may disagree with me), personally I think that programming is like playing an instrument. Pretty much everyone thinks it's cool, but some people are drawn to it more than others. You can't force an interest in it; someone will either be interested or they won't be. It depends what they want to do with their lives and what their interests are. Likewise some people are more naturally talented at programming than others (typically people with inquisitive, problem-solving-oriented brains), much like how some people have 'a musical ear' and some are 'tone deaf'.


* (If you'll excuse the slightly unpleasant reductio ad absurdum, I don't necessarily need to attempt using a vinegar-soaked sponge instead of toilet paper to imagine how awful Roman hygiene was.)

$\endgroup$
5
  • 1
    $\begingroup$ Nicely said. And welcome to Computer Science Educators! $\endgroup$
    – Ben I.
    Commented Aug 5, 2017 at 1:25
  • 2
    $\begingroup$ Reminds me of a programme I used to watch where the end credit soundtrack was a download you could record onto tape and then load into your BBC B. $\endgroup$ Commented Aug 7, 2017 at 16:08
  • 3
    $\begingroup$ @DarrenBartrup-Cook That's much cooler than the modern insistence on "like us on facebook/follow us on twitter for reason X". In turn, that reminds me of a thing Valve did where they encoded hidden images in sound files using SSTV. $\endgroup$
    – Pharap
    Commented Aug 7, 2017 at 17:33
  • 2
    $\begingroup$ I'm in the same boat as Pharap. The most potent example of change was one of my professors was telling me about what it was like when he went to college for CS. He was on a punchcard system and they only had one computer. So every student would fill out their punch card, get in a ten minute line, and if they messed up at all, they would have to go and do it all over. Assignments would take hours, just because of how much time they would spend waiting in line to use the computer. $\endgroup$ Commented Aug 7, 2017 at 17:54
  • 1
    $\begingroup$ @EvSunWoodard my mum worked at Marconi and dropped a box of those punchcards - took hours to put back in the right order. Can also remember my dad bringing home a 30mb Winchester Hard Drive. I swear they make cars smaller than that now. $\endgroup$ Commented Aug 8, 2017 at 8:03
7
$\begingroup$

I share that a single Google search uses more compute power than all of the Apollo missions, but even I can't wrap my head around that.

Forget Google search. Your phone is more powerful than the computers that got us to the moon.


It's pretty easy to compare the CPU power, RAM, and hard drive space of a phone to an early computer. Maybe pick a few computers from each decade and plot them out.
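
If you want to turn "plot them out" into an actual exercise, a minimal matplotlib sketch is enough. The machines and RAM figures below are rough, from-memory approximations meant only as placeholders; swap in whatever machines your class cares about:

```python
# Illustrative only: approximate RAM of a few machines, plotted on a log scale.
import matplotlib.pyplot as plt

machines = [
    ("Apple II (1977)",        1977, 48),               # KB of RAM, approximate
    ("IBM PC (1981)",          1981, 64),
    ("Mac Plus (1986)",        1986, 1024),
    ("Pentium desktop (1995)", 1995, 16 * 1024),
    ("iPhone (2007)",          2007, 128 * 1024),
    ("Flagship phone (2017)",  2017, 4 * 1024 * 1024),
]

years = [m[1] for m in machines]
ram_kb = [m[2] for m in machines]

plt.semilogy(years, ram_kb, "o-")          # the log axis is the whole point
for label, year, kb in machines:
    plt.annotate(label, (year, kb))
plt.xlabel("Year")
plt.ylabel("RAM (KB, log scale)")
plt.title("Typical RAM by decade (illustrative)")
plt.show()
```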

But something that was really effective for me was hearing stories from my (older) coworkers. About how they had to store their code on punch cards and run their programs at midnight every night because it was the only time the mainframe (read: the only computer in the computer science department) was not being used. About needing to wait another 24 hours to run their code again (which they fixed using tape and hole punches?). Compare that to modern development, where my IDE updates several times a second whenever I'm typing code.

I don't know if you should organize any of this info into a lecture, though, as like you said it all comes off as a bit "back in my day we had to walk uphill to the mainframe, both ways!" Instead, maybe try to start a discussion about how much technology has changed in the students' lives, and then work backwards from there.

$\endgroup$
4
  • $\begingroup$ Even if my phone is a decade old brick? $\endgroup$
    – Pharap
    Commented Aug 5, 2017 at 1:07
  • 7
    $\begingroup$ @Pharap Yes, even if your phone is a decade old brick. I daresay the $0.50 microcontroller in an Arduino is more powerful than the AGC, though not as radiation-proof. $\endgroup$ Commented Aug 5, 2017 at 5:40
  • 4
    $\begingroup$ @Pharap The original Game Boy (the old grey brick) had 2x more processing power than the Apollo Guidance Computer. Of course, the Apollo computer was designed around triply redundant circuits, so perhaps you might have to duct tape 3 Game Boys together in order to beat out the AGC. $\endgroup$
    – Cort Ammon
    Commented Aug 6, 2017 at 17:45
  • 1
    $\begingroup$ @immibis This article states that even the lowly microSD card has more processing power than an Arduino $\endgroup$
    – Martheen
    Commented Aug 7, 2017 at 2:35
6
$\begingroup$

I like all the suggestions about visuals but it can sometimes be very hard to really connect an old storage device and a new one since the data stored is hidden in bits.

It's easy to show that punchcards used to be required, and they lend themselves to a nice visual: one punchcard holds one line of code.

The Linux kernel is about 15,000,000 lines long. A ream of paper has 500 sheets, and a sheet is thinner than a punchcard, so the kernel on cards would stand taller than a stack of 30,000 reams of paper. You can grab some reams of paper from the copy room and start stacking. Maybe not the Linux kernel, but other examples -- the kids' final projects. Contrast that with the fact that these programs now take effectively no space and no time to load and run.
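
To make the stack concrete, here is a quick calculation; the 0.18 mm card thickness is an assumption based on standard 0.007-inch card stock:

```python
# Back-of-the-envelope: how tall is the Linux kernel as a deck of punchcards?
LINES_OF_CODE = 15_000_000        # one card per line
CARD_THICKNESS_MM = 0.18          # assumed: standard card stock was ~0.007 inches

stack_height_m = LINES_OF_CODE * CARD_THICKNESS_MM / 1000
reams = LINES_OF_CODE / 500       # the ream comparison used above

print(f"{reams:,.0f} reams of paper")
print(f"Card stack roughly {stack_height_m:,.0f} m tall "
      f"(about {stack_height_m / 1000:.1f} km)")
```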

I'm thinking that there's probably something that can be shown with emulators - spin up tons of old computer emulators on a modern PC or something like that.

$\endgroup$
1
  • $\begingroup$ Usborne books has released its 1980s introductory programming manuals as free downloads, and emulators for the systems used are available. $\endgroup$
    – Perkins
    Commented Aug 7, 2017 at 23:15
6
$\begingroup$

Here's one pretty good infographic from the Wall Street Journal, though it is almost 4 years old now:

Infographic

Of course, another way is to describe the change in scale. I love the approach used by BlueRaja here, though again this is from 2013. If adding two numbers consistently took a person one second (which is already pretty fast for most people, especially since you could be adding rather large numbers), we could create some interesting comparisons to the 2013 computers. Of course, these all presume that you never sleep, eat, talk, etc. You just sit and calculate, all day, every day. Some of my favorites:

  • It would take 0.1 seconds, at that scale, for light to travel a centimeter in a vacuum.
  • It would take 3 days for a bullet to travel an inch.
  • The fastest blip that the human eye could even see would take 2.25 months.
  • There would be 1.5 years between every picture on a 60 fps screen.
  • The average human reaction time would be 21 years.
  • A single second would be 95 years.
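
The scale factor behind those figures is easy to recompute. Here is a minimal sketch; the assumption is a 2013-era machine doing roughly three billion additions per second, which is what reproduces the list above (the bullet speed and reaction time are likewise rough assumptions):

```python
# Rescale real durations so that one addition takes one second.
SCALE = 3e9   # assumed: ~3 billion additions per second on a 2013-era core

events = {
    "light travels 1 cm in vacuum": 0.01 / 299_792_458,   # seconds
    "bullet travels 1 inch (~300 m/s)": 0.0254 / 300,
    "one frame at 60 fps": 1 / 60,
    "human reaction time (~0.2 s)": 0.2,
    "one second": 1.0,
}

for name, seconds in events.items():
    scaled = seconds * SCALE
    if scaled < 60:
        print(f"{name:35s} -> {scaled:,.2f} scaled seconds")
    elif scaled < 86_400 * 365:
        print(f"{name:35s} -> {scaled / 86_400:,.1f} scaled days")
    else:
        print(f"{name:35s} -> {scaled / (86_400 * 365.25):,.1f} scaled years")
```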

Of course, lately, the speed increases have slowed somewhat. I know that many have been crying out about the end of Moore's Law, though my pet theory is that we have shifted our processor research more towards physical size and battery life, to match our huge societal shift to smaller and smaller devices.

$\endgroup$
0
6
$\begingroup$

Compare a graphing calculator with a smart phone.

If your students are in high school, then they probably carry around something like a TI-83 Plus calculator, which contains a 6 MHz Z80 processor, about 24 kB of user RAM, and a 96×64-pixel monochrome display. That's the same processor and similar specs as an Osborne 1, which sold in 1981 for \$1,795 (over \$4,000 in today's money), or a TRS-80, which sold in 1977 for about \$600 (roughly \$2,400 today).

Place a graphing calculator along-side a smart phone. Ask your students whether they'd rather pay \$5,000 for the calculator or $500 for the smart phone. In 40 years, processors and memory have improved by a factor of 1,000, desktops have shrunk to pocket-sized, and cost has dropped by a factor of 10.

The reason why a graphing calculator costs \$100 instead of \$10 is beyond the scope of the question, but it makes more sense if you think of calculators as a necessity for passing a class (no smart-phone apps allowed).

$\endgroup$
7
  • $\begingroup$ Welcome to Computer Science Educators! This is quite a nice answer. I hope we'll be getting more good content from you. $\endgroup$
    – ItamarG3
    Commented Aug 6, 2017 at 14:27
  • 1
    $\begingroup$ The only problem with this is the follow up question: Why the heck are TI-83's still so bloody expensive?! $\endgroup$
    – Cort Ammon
    Commented Aug 6, 2017 at 17:46
  • 1
    $\begingroup$ @CortAmmon according to the economics parrot, "Aawk! Supply and demand! Supply and demand! Aawk!" $\endgroup$ Commented Aug 7, 2017 at 4:53
  • 1
    $\begingroup$ @SpencerJoplin and a monopoly thanks to that model being mandated at most schools. I was lucky enough to be able to pick what I wanted. Most of the class had much more modern Casios with a price tag almost 2/3rds less than the TI models. $\endgroup$
    – Baldrickk
    Commented Aug 7, 2017 at 10:11
  • 1
    $\begingroup$ @CortAmmon, obligatory xkcd. $\endgroup$
    – Wildcard
    Commented Aug 8, 2017 at 5:25
5
$\begingroup$

One idea is to take your students on a field trip to a "computer museum" of some sort, if your local area has one. Being able to physically look at and maybe even interact with machines can maybe help give your students a more "visceral" feel for how computing has advanced in general.

(One of my professors actually did this and took us all to the Living Computer Museum in Seattle, and made us write a program of some kind on a retro computer of our choice. It took me a while to figure out how to even use one of the computers -- it certainly gave me a deeper appreciation for how dramatically HCI has advanced over just a few decades!)

You could also perhaps contrast this by following this up with a field-trip to a local datacenter (if you have one).

If you don't have museums or datacenters like this in your area, you could maybe bring in props of some kind -- perhaps an old tape drive of some sort along with a USB flash drive, and contrast how much data each of them can store.

You could also perhaps try showing them clips of seminal tech demos. I personally like Steve Jobs' original iPhone demo (YouTube link warning) -- I stumbled across this recently, and it was so incredibly bizarre and surreal to me to watch people go crazy over features I take for granted.

Engelbart's "Mother of all Demos" (also YouTube) might be another good one, though it always felt a bit dry to me, so maybe not.

$\endgroup$
3
  • 2
    $\begingroup$ We are near the Computer History Museum, where they have a difference engine. $\endgroup$ Commented Aug 5, 2017 at 0:53
    $\begingroup$ @EllenSpertus Make it an annual thing ASAP. One of the biggest disappointments I had with my college was the lack of attempts to take us on field trips or get people from the computing industry in to talk to us. $\endgroup$
    – Pharap
    Commented Aug 5, 2017 at 1:34
  • $\begingroup$ For me, comparing machines from a few years ago gives the best impression of how things have changed (not just the raw numbers), both in terms of battery size, and speed if you're using smartphones. $\endgroup$ Commented Aug 7, 2017 at 16:34
5
$\begingroup$

I have two thoughts on this:

  • Volume matters. Grab some picture of a very early computer (the first one in our sense, the Zuse Z3, would do nicely, and Wikipedia has its details, e.g. 4 kW power consumption and 1 metric tonne of weight). Then take any of the numbers you find (e.g., the number of individual elements) and compare it to today's CPUs. Something easy, like one of those smart armbands (Apple Watch, whatever). Then calculate how many buildings' or cities' worth of Zuse Z3s you would need to perform the function of that watch (ignoring networking and so on).

    Oh, networking would be quite the eye-opener: show them (with a terminal program that can artificially throttle the data rate) how a 300-baud acoustic coupler feels when typing... and make sure they understand that that was a single stream. No TCP/IP on those dial-up modem lines back then! (A small throttled-output sketch appears after this list.)

    Oh, and if a stone age "super computer" is too abstract in your opinion, the Apollo computers are just fine as an example, too, although they would be a little skewed - they could have been faster/bigger, but speed/capacity were reduced in favour of robustness for obvious reasons.

  • I don't think you can really get them to "feel" the wonder as you do. Us oldies who have witnessed all those developments first-hand had a lot of time to get them ingrained in our brains. Talking about it will never do anything but make you a Grandma in their eyes. It is too late now; youngsters grow up with the whole plethora of functioning "stuff" from day #1.

    An example for raw computing power would be calculating fractals. When I was little, it took hours, days, sometimes weeks to calculate one simple image. These days, you just fly through them in 3D, realtime, 60Hz or more, with 32x the number of pixels on your average gaming PC. Yes. You can tell them that. But I don't really think it is possible to convey the emotional impact these things have on you.
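
For the acoustic-coupler point above, you don't even need a special terminal program; a tiny throttled "cat" gives much the same feel. This is a sketch only: 300 baud with start and stop bits works out to roughly 30 bytes per second.

```python
# Echo a file to the terminal at roughly 300-baud speed (~30 characters/second).
import sys
import time

BYTES_PER_SECOND = 30        # ~300 baud with start/stop bits

def slow_cat(path):
    with open(path, "r") as f:
        for ch in f.read():
            sys.stdout.write(ch)
            sys.stdout.flush()
            time.sleep(1 / BYTES_PER_SECOND)

if __name__ == "__main__":
    slow_cat(sys.argv[1])    # e.g. python slowcat.py some_program_listing.txt
```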

$\endgroup$
2
  • $\begingroup$ I don't think you even need to compare to a "smartwatch" (whether those really count as "smart" is a different discussion, but I digress). Just compare it to a typical digital watch. If you want to get fancy, compare it to one with a calculator, such as those that were popular around 1990s-ish. $\endgroup$
    – user
    Commented Aug 7, 2017 at 14:18
  • $\begingroup$ I disagree you can never get them to "feel" the wonder. But you don't do it by talking, that's true. As a student, you learn it by painstakingly following through algorithms by hand. $\endgroup$
    – Wildcard
    Commented Aug 8, 2017 at 5:26
5
$\begingroup$

I think nobody has mentioned the (energy) cost of running an algorithm on these machines. For example when you are taking a photo on a phone, it is running a face recognition algorithm in real time. This consumes X watts of power and costs Y USD / second to run.

You'd need to calculate how many multi-million USD machines you'd need from a specific era to run the algorithm at the same speed on those (assuming it was possible at all), and how many watts of power (CPU power + cooling) it would take.

Algorithms on the topic of linear algebra or databases could be better examples, but most people aren't that familiar with those. Video compression is another good example: you could calculate how many USD it costs to compress 1 hour of 1080p video on a modern laptop (most energy-efficient?) and on an older system. It may need something like 100x the time and 10,000x the power, making the whole process cost a million times more even without considering initial investment costs.
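
A minimal sketch of that calculation is below. Every number in it is an assumption chosen to match the 100x-time and 10,000x-power figures above; the point is just to show how the two factors multiply into a million-fold cost difference.

```python
# Sketch: energy cost of compressing one hour of 1080p video.
PRICE_PER_KWH = 0.15          # USD, assumed electricity price

def encode_cost(power_watts, encode_hours):
    kwh = power_watts * encode_hours / 1000
    return kwh * PRICE_PER_KWH

laptop_cost = encode_cost(power_watts=25,      encode_hours=0.5)   # assumed modern laptop
old_cost    = encode_cost(power_watts=250_000, encode_hours=50)    # assumed 10,000x power, 100x time

print(f"Laptop:     ${laptop_cost:.4f}")
print(f"Old system: ${old_cost:,.2f}  ({old_cost / laptop_cost:,.0f}x more)")
```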

$\endgroup$
3
  • $\begingroup$ Welcome to CSEducators. Thanks for your answer. We hope to hear more from you. $\endgroup$
    – Buffy
    Commented Aug 5, 2017 at 15:50
  • $\begingroup$ I'm not a good educator, I just have opinions and ideas on the topic of CS ;) Hopefully I'll have interesting ideas to share. $\endgroup$
    – NikoNyrh
    Commented Aug 5, 2017 at 17:40
  • $\begingroup$ @NikoNyrth Hmmm, not a good educator. Some of us official educators have to admit to that occasionally also. But we are welcome here too. $\endgroup$
    – Buffy
    Commented Aug 5, 2017 at 17:46
5
$\begingroup$

I began to realise how drastic the change was in 1997, when I had more computing power and more disc storage on my desk than my entire university campus had when I graduated in 1983. In 2010, I had more than that on my phone.

To enumerate, the university in 1983 had:

  • An old mainframe, an ICT 1904S with 256K of 24-bit words, maybe ten 5 MB and 10 MB exchangeable-pack disc drives, and two or three tape drives, with each tape holding about 1 MB. The main way of using it was by submitting large decks of punched cards, and it ran several hundred such jobs per day.
  • Several Prime minicomputers, each with about 2MB RAM, and 50MB disc, plus a 300MB disc that was definitely the largest on campus. They were networked together with some kind of serial cable, not anything recognisable as a network to modern eyes. The terminals for this network each printed on a roll of paper, with an astonishingly cheap and nasty dot-matrix printer. There were maybe 50 terminals, and several hundred students would use them each day, going to one of the buildings that held them.
  • A DEC VAX-11/750, with about 4MB RAM and several 70MB disks, running a BSD Unix that would be usable to someone today who can work with the Linux command line, albeit astonishingly slowly. It belonged to the computer science department, and drove about 20 terminals, which had black-and-white cathode-ray tube displays, like an old-fashioned TV. This was high-tech for the time.
  • A few DEC PDP-11 minicomputers.
  • A scattering of 8-bit microcomputers: Apple IIs, BBC Micros, and assorted CP/M machines.
  • Various specialised graphics terminals, which could do color graphics, very slowly.
  • Various machines that the electronics engineering students had been experimenting on, and were no longer useful owing to unreliability.

In 1997, I had an HP PA-RISC workstation, running HP-UX Unix, with 256MB RAM, about 4GB of disc, a 21" color CRT, a mouse, and 100 megabit Ethernet. I also had a Dell Pentium III PC, running Windows NT 4, with about the same amount of RAM, disc, display and networking. The two machines had similar CPU power, very slow by today's standards, but actually capable of running a modern (albeit 32-bit) OS, if you're prepared to wait a lot. They could both access the same fileserver, which could share the same disc space to both of them, something that was unheard of in 1983.

In 2010, I had an HTC Dream with an 8 GB microSD card in it. That fingernail-sized card had more storage capacity than all the discs, tapes and punched cards on a university campus of 27 years earlier.

$\endgroup$
3
  • 1
    $\begingroup$ Welcome to Computer Science Educators! While personal experience is sometimes useful, it wouldn't help the asker in this specific situation: The question asks about conveying an idea or point, and personal experience without much explanation is not enough. If you could edit your answer to expand on how it might be used to convey the idea to students, it would improve the answer's quality. $\endgroup$
    – ItamarG3
    Commented Aug 5, 2017 at 15:51
  • $\begingroup$ @ItamarG3: Better? $\endgroup$ Commented Aug 5, 2017 at 17:22
  • $\begingroup$ Considerably :) $\endgroup$
    – ItamarG3
    Commented Aug 5, 2017 at 17:24
4
$\begingroup$

The problem with teaching the trajectory of computing technology is basic numeracy. The orders of magnitude are just too big for many students to comprehend without more numeracy and a feel for log scales. Many of the examples here would just go over their heads as random magical numbers.

I would teach this by not talking about computers at first, but by starting with one of those science picture books that shows a proton, then steps up through orders of magnitude to the height of a human, then through more orders of magnitude to the size of the known universe. Also one of those books that shows all of human history as the last fraction of a second on a 24-hour clock mapped to the age of the Earth.

Then, after students go wow over these scales, and maybe gain an appreciation for them, only then overlay computing technology: gears to relays to tubes to discrete transistors to ICs to VLSI to 10 nm FinFET SoCs in their iPhones, and put each on top of one of the pages of those order-of-magnitude books. Give them a better sense of Moore's law scaling as it relates to the scaling of the universe.

Then they might be impressed that you started out in the Paleolithic period of computing. Or as a protozoa equivalent.

$\endgroup$
0
4
$\begingroup$

I suggest asking a kind of riddle.

Which is faster, UPS or the internet? (Use your school's internet as a starting point.)

Then specify many sizes of data. Use "common sizes" like:

  • Blu-Ray
  • DVD
  • CD-ROM
  • 120 Gig Hard Drive
  • 2 Gig Hard Drive
  • 1.44 MB floppy
  • 5¼" floppy
  • Cassette tapes (think C64)
  • Tape Spools
  • SD Cards

Then use some values for internet speed: common ones like cable, ADSL, 56k, 28.8k, and so on.

Once you are done playing the "game" of finding out which is faster in which circumstances (hint: UPS is still faster for large data sets), start assigning years to the values of speed and size. Try to find out when UPS became slower than the internet, and for what sizes of data. Make a line graph.

Of course, keep expanding on the "data" side until you're far enough back that the internet didn't exist; there was no network, just snail mail. Let that sink in a moment.

Also try to give examples of what would fit in that data size. A 1.44 MB floppy couldn't hold even one song from today, and still took several minutes to transfer over a 28.8k modem.
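
A small calculator makes the "game" quick to run in class. This is a sketch; the flat 24-hour shipping time and the link speeds are assumptions to tweak with your students:

```python
# "Which is faster, UPS or the internet?" -- a tiny calculator to drive the game.
SHIPPING_HOURS = 24    # assumed flat overnight-shipping time

media_bytes = {
    "1.44 MB floppy": 1.44e6,
    "CD-ROM (700 MB)": 700e6,
    "Blu-ray (25 GB)": 25e9,
    "2 TB hard drive": 2e12,
}

links_bits_per_sec = {
    "28.8k modem": 28_800,
    "ADSL (8 Mbit/s)": 8e6,
    "Cable (100 Mbit/s)": 100e6,
}

for medium, size in media_bytes.items():
    for link, bps in links_bits_per_sec.items():
        transfer_hours = size * 8 / bps / 3600
        winner = "network" if transfer_hours < SHIPPING_HOURS else "UPS"
        print(f"{medium:18s} over {link:18s}: {transfer_hours:10.2f} h -> {winner}")
```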

Then you could give examples of things computers did, how long they would take to do them, and how you would interact with that data.

$\endgroup$
2
  • 1
    $\begingroup$ @nocomprende, I have trouble understanding that question. You want to build the smaller, more capable equipment, and then build the big clunky ineffective equipment? :) $\endgroup$
    – Wildcard
    Commented Aug 8, 2017 at 5:29
  • $\begingroup$ @nocomprende funny! :) Economies of scale, my friend. And marketing. (Marketing is a good thing, by the way.) $\endgroup$
    – Wildcard
    Commented Aug 8, 2017 at 11:13
3
$\begingroup$

From my experience of trying a history of computers lesson...

I tried showing this old tech programme but they failed to relate to it and got bored really quickly. I think it destroyed my street cred more than anything :).

They did enjoy a clip that showed one of the first home PCs and games consoles, so I decided to change my lesson plan. I found a YouTube video showing a ZX Spectrum loading a game. Some jaws dropped, and after they begged me to turn off the noise, I let them play a retro game (type "pacman" into Google). Maybe not as informative as planned, but it salvaged the lesson.

A classroom display may work well. They can then look and read about it in their own time. In the iMac suite I have a poster showing Macs through the ages. It kept them amused for weeks and they appreciated what they have.

$\endgroup$
1
  • $\begingroup$ An interesting point. $\endgroup$
    – ItamarG3
    Commented Aug 5, 2017 at 7:30
3
$\begingroup$

What about the 90s internet? Maybe they'll relate to that. Then take them on a journey to the pre-commercial internet, and then on to computing and networking before the internet.

This 90s ad for AOL is hilarious.

Try searching for "internet minute". There are some great infographics that may illustrate how far computing power and the internet have come.

$\endgroup$
0
3
$\begingroup$

It's almost impossible to convey a sense of wonder when they don't appreciate what they have.

My personal favorite approach is to point out that the Apollo Guidance Computer (AGC), which sent us to the moon, was a 2 MHz processor with 2K of RAM. The original Game Boy had a 4 MHz processor and 8K of RAM. The old Game Boy was more powerful than the AGC! Well, almost. The AGC did have triple redundancy on all circuits, so you would have to duct-tape 3 Game Boys together if you really wanted to compete with it!

And, of course, a modern cellphone chugs along at 1,800 MHz!

I find it helps to first appreciate just how blindingly powerful and fast computers actually are. Then we can talk about where they came from. One of the calculations I did a while back compared the speed of a cache hit with going out to main memory to get the information. It turns out, serendipitously, that the difference in speed between an L1 cache hit and a round trip to memory is the same as the speed difference between a cheetah at full sprint and the top speed of a snail. No joke!

Of course, that doesn't help if you don't have a sense of how fast either of those are, so let's compare them both to something we can appreciate: how long it takes to load a web page. There's a latency associated with going across the world over fiber optics. It turns out that your round trip ping time from America to Europe (the absolute bare minimum portion of loading a web page) can be put on this scale between cheetahs and snails. This web page access proceeds at the speed of the San Andreas fault slowly opening in California! Yes. On this scale of speeds, California actually starts sinking into the ocean!

$\endgroup$
2
  • $\begingroup$ Nice images. Hope we get them educated before the heat death of the universe! $\endgroup$
    – user737
    Commented Aug 7, 2017 at 12:39
  • $\begingroup$ I love it about the cheetah and snail! I'm going to put that in a lecture. $\endgroup$ Commented Aug 9, 2017 at 1:32
3
$\begingroup$

I have 8K of genuine ferrite-core memory on a circuit board of about 1 square foot. I show that, followed by an 8K static RAM chip (1970s), followed by an 8 GB SDRAM module (a million times the storage capacity of the original board), followed by a 1 TB SSD (125× the storage of the SDRAM).

$\endgroup$
4
  • $\begingroup$ Now if you could just get a replica of memory from the Apollo missions. $\endgroup$ Commented Aug 9, 2018 at 22:07
  • $\begingroup$ I believe ferrite core memory is what was used through the skylab days $\endgroup$
    – pojo-guy
    Commented Aug 10, 2018 at 3:36
  • $\begingroup$ Ferrite core with wire rope. Hand-woven to make the program. Nice photo here $\endgroup$ Commented Aug 10, 2018 at 4:05
  • $\begingroup$ In one of the lectures I attended as an undergrad we passed round a mercury delay line. It was short enough to pass round, so it can't have been a 35-bit one; I suspect it would have been a 17-bit one. I'm sure a mockup could be made for educational purposes using a tubular lightbulb. $\endgroup$ Commented Aug 13, 2018 at 9:57
3
$\begingroup$

There is an old joke, along the lines of "If automobiles improved like computers, you could now buy a car for $5 that would drive supersonic and do 1000 miles/gallon. And it would crash every minute." Leave out the second part, and use the first part as an indication of the increase in power.

But seriously. There is a saying, no longer entirely true, that a supercomputer has always cost 10 million dollars. (Actually, the very top now costs over 100 million, but it's still true for "ordinary" supercomputers.) My first super ran at 50 million floating-point operations per second, and I've seen 50 gigaflops and 50 teraflops come and go; I now have a 50-petaflop machine (peta = 10^15, so a billion times faster), and it still costs in that range. And it's still roughly the same size.

$\endgroup$
6
  • $\begingroup$ Funny that competition hasn't brought the cost of a supercomputer down. $\endgroup$
    – Scott Rowe
    Commented Jan 15, 2022 at 18:32
  • $\begingroup$ Well, they got lots and lots faster, so you can choose: same power less price, or same price more power. And there is probably something about the 1-10M$ price point that makes it a plausible number for universities / labs / funding agencies. $\endgroup$ Commented Jan 15, 2022 at 18:51
  • $\begingroup$ If we can do the same real work using a laptop today which required a huge expensive machine decades ago, I wonder why we are still furiously improving computers and also what more is being done with them? If I could drive to work at 1% of C, any improvement beyond that would probably not be noticable. Wait, I don't have to drive to work anymore, I get it... But only because I work in IT, a field that basically didn't exist when I was born. What value add am I really giving to the human experience? $\endgroup$
    – Scott Rowe
    Commented Jan 15, 2022 at 18:58
  • $\begingroup$ @ScottRowe "same real work" is of course not the case. The old machines had a lot more bandwidth relative to their compute power. With current processors it's extremely hard to get close to peak performance. And of course OSses are getting ever more bloated, which means that the user experience is not getting much faster. But of course there has been progress. In the 1990s my Mac could barely play digital audio. Then it could record it, then it could generate it with real-time effects processing. There has been progress..... $\endgroup$ Commented Jan 15, 2022 at 19:27
  • $\begingroup$ Ok. So supercomputers are now so fast that we can't keep them busy, except with new forms of entertainment. I just remember when I was a child and no one I knew had seen a computer, wondering what they were good for. Work expands to fill the available time-sharing, I guess. Perhaps I should inquire into what people do with supercomputers these days. Read recently about an architecture for handling stock trading, it was a good design. But are we that much better off trading stocks in milliseconds than, say a minute? Would child me think this was a social good? $\endgroup$
    – Scott Rowe
    Commented Jan 15, 2022 at 21:31
2
$\begingroup$

The blog "Programming in the 21st Century" has several posts that are interesting and suitable for students. I like to show my students the post "Slow Languages Battle Across Time", and then recommend them to read everything else on that site when they have free time. For the same idea but for memory, the great post is "A Spellchecker Used to Be a Major Feat of Software Engineering".

I use a Mandelbrot renderer (it needs BigComplex.java; note that you can press the Stop button and drag the mouse to select a zoom area inside the window) as a big example in the second programming course, since there is parallelism, graph search and strategy-pattern stuff going on. The first 1980 visualization on the Wikipedia page makes a nice comparison to it, as do the tales of using Fractint on a basic 4 MHz PC, when I was but a teen, to render the basic Mandelbrot image over many long, crawling minutes. This simple Java renderer can then be compared in turn to the renderers used to create the deep zooms amply available on YouTube, and you can imagine how many human lifetimes my Java renderer would need to create those.
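
For a quick in-class timing comparison that doesn't need the full Java renderer, a minimal escape-time sketch is enough (shown here in Python rather than Java, purely for brevity; 320×240 pixels at 256 iterations is an arbitrary choice):

```python
# Minimal Mandelbrot escape-time render, timed -- a sketch for comparison only.
import time

WIDTH, HEIGHT, MAX_ITER = 320, 240, 256

def mandelbrot_counts():
    counts = []
    for row in range(HEIGHT):
        for col in range(WIDTH):
            c = complex(-2.5 + 3.5 * col / WIDTH, -1.25 + 2.5 * row / HEIGHT)
            z = 0j
            n = 0
            while abs(z) <= 2 and n < MAX_ITER:
                z = z * z + c
                n += 1
            counts.append(n)
    return counts

start = time.perf_counter()
mandelbrot_counts()
print(f"{WIDTH}x{HEIGHT} at {MAX_ITER} iterations: "
      f"{time.perf_counter() - start:.2f} s in plain Python")
```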

$\endgroup$
0
2
$\begingroup$

I suggest that's not realistically possible.

When my niece (7) asked what had changed since I (60) was her age, I had to say "pretty much everything", and that remains true: true of society in general and more particularly of computing.

Coming back to computing power, you're looking at something that could easily be termed millions of times greater, and inarguably many thousands… Who was it who said that a single death is a tragedy but a million deaths are merely a statistic?

Why not try the library scenario?

Today almost any search engine will answer almost any question in a fraction of a second…

Back in the '60s you had two choices, the computer route being to rely solely on what was filed locally.

The other choice was to trek down to your local library, where a knowledgeable assistant might be able to find a helpful book. That almost certainly took an hour or so.

Failing that, you could file a request… not a generic, search-engine-type question but a specific request for a particular title. That might squirm its way round the system for weeks… perhaps months.

If you did get a hit, you'd trek down to the library again…

So broadly, what now takes milliseconds might then have taken weeks or even months.

Does that fit the bill?

$\endgroup$
0
1
$\begingroup$

Interesting question. Along the lines of the "computers were the size of a room" story, my idea is maybe more tangible.

You're the expert, so I ask you: what component inside a computer the size of a room is the size of a modern phone? You could have your students take out their phones and look at them. "Your phones are the size of one flux capacitor that was made in the 60's. Your phone has one million flux capacitors in it."

Or, estimate the size of all of the phones and computers in your building combined, and then give an estimate of the computing power that could have been made in something that size in the '60s.

$\endgroup$
1
  • 1
    $\begingroup$ Welcome to Computer Science Educators! That's an interesting point. If you could add more examples along the line of what you have (i.e. the scale growth), it would really make it into a much better answer. $\endgroup$
    – ItamarG3
    Commented Aug 5, 2017 at 11:52
1
$\begingroup$

Draw a graph of the typical price of RAM and HDDs (both in currency / capacity) from 1960 to 2010. I don't have the figures to hand, but by way of illustration -

In 1990 (IIRC), RAM was priced at about £10/MB and hard drives at £1/MB, in 1990 prices. Now it's about £14/GB for RAM and £4/GB for SSDs.
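
Even before drawing the graph, those two data points already give a striking ratio. A minimal sketch (using the figures above and ignoring inflation, which only widens the gap):

```python
# Rough improvement factors implied by the figures above (illustrative only).
ram_1990_per_mb = 10.0            # GBP per MB in 1990
ram_now_per_mb  = 14.0 / 1024     # GBP 14 per GB today
hdd_1990_per_mb = 1.0
ssd_now_per_mb  = 4.0 / 1024

print(f"RAM:     ~{ram_1990_per_mb / ram_now_per_mb:,.0f}x cheaper per byte")
print(f"Storage: ~{hdd_1990_per_mb / ssd_now_per_mb:,.0f}x cheaper per byte")
```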

$\endgroup$
1
  • $\begingroup$ Good idea. I already do this. $\endgroup$ Commented Aug 9, 2017 at 1:33
1
$\begingroup$

I recently wrote several emulators for some 1970s-era minicomputers, and I included memory display windows showing one pixel per word, so I could watch memory change in real time. I was quite impressed to see that the largest minicomputer memory available back then is no bigger than a postage stamp on my screen. Perhaps you could direct their attention to a picture containing 32,768 pixels and explain that a minicomputer or early microcomputer could hold exactly one of these at a time.

Of course even this factoid may not make much of an impression on someone who does not need to know the exact nature of the difficulties we faced in order to understand that there were difficulties, and to be grateful that somebody dealt with them. We joke about our grandfathers' stories but at bottom we know they really did have it uphill both ways.

The main point to convey is that the world we hold dear today will look similarly primitive to our children. (And by "children" I do not mean a metaphor for some unspecified future generation -- I mean infants being born right now may have trouble comprehending, e.g., why we had to drive our cars on "manual" the whole time, or hold an election and "vote" just to choose a mayor or a president.)

$\endgroup$
1
  • $\begingroup$ Thanks for your answer and welcome to CSEducators. We hope you will contribute in the future as well. $\endgroup$
    – Buffy
    Commented Aug 5, 2017 at 17:15
