28

So, I time travelled back to the year 1969. I messed around back in time for about a week before leaving. But, me being forgetful and everything, I left behind my modern laptop (it’s an Apple MacBook). After being found, it makes its way to the US gov’t, who quickly see its huge value. My question is: could they feasibly recreate it?

1
  • Comments are not for extended discussion; this conversation has been moved to chat.
    – Monty Wild
    Commented Oct 30, 2019 at 21:17

14 Answers

72

Not any time soon. In the 1960s, MOS transistors were in their infancy, with a typical size of 10 micrometers. Modern transistors are more in the 10 nanometer range, 1000 times smaller. Even if they had all of the schematics and specifications (and raw materials, about which I know less) they would not be able to work with components as small as those in modern processors.
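
To put that shrink in perspective, here's a quick back-of-envelope calculation (a sketch only, using the rough figures above; transistor density scales roughly with the square of the linear shrink):

    # Rough scaling arithmetic using the approximate feature sizes quoted above.
    feature_1969_nm = 10_000   # ~10 micrometers, a typical late-1960s MOS process
    feature_modern_nm = 10     # ~10 nanometers, a modern process node
    linear_shrink = feature_1969_nm / feature_modern_nm   # 1000x smaller features
    density_gain = linear_shrink ** 2                     # density scales ~ (shrink)^2
    print(f"Features ~{linear_shrink:.0f}x smaller, so ~{density_gain:,.0f}x more transistors per unit area")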

Even if they did have the production capacity, they would need to reverse-engineer the precise schematics needed for assembly - not just the end result, but the steps taken to get there. This can be done - in the 80s and 90s, Soviet and former Soviet engineering programs were notorious for reverse-engineering Western chips, and they were hardly alone - but it's an involved process; working with hardware so far in advance of the state of the art, I wouldn't be surprised if it took some years.

The display is in a similar state: LCDs were just beginning to be developed in the 60s, and the development of thin-film transistors wouldn't be for several years yet. Then, once they had figured out how to replicate the display itself, they would need to reverse-engineer modern display specifications, i.e., how the processor and display communicate.

And the same goes for software. C, the venerable ancestor of modern programming languages like C# and Java, was still a few years from its first edition. Simula, the first object-oriented programming language, was only a few years old. If they managed to decompile some of the software (or you had a lot of source code and language documentation sitting around for whatever reason) it would propel the state of the art forward considerably. Of course, they wouldn't need to write any additional code if they understood how to copy it from your laptop to their replica, but they would be limited to whatever you happened to have at the time, which might not be useful.

TL;DR: Most of the industries involved would just about be able to grapple with the challenges posed by duplicating your laptop. Studying it would lead to major advancements in pretty much every aspect of computers. After some years (I would guess inside of a decade) they could probably describe it in enough theoretical detail to duplicate it. The industrial and technical requirements would take another few years to fill. Remember that the industrial machines involved use computers themselves, so you're looking at an iterative model of improvement - better computers mean better fabricators mean even better computers. That would get them a laptop comparable to a modern one. For it to be compatible, they would need more in-depth study of the exact specifications, which would take more time.
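
Here's a toy illustration of why that iterative loop compounds rather than growing at a steady rate (all the numbers are made up, purely for illustration):

    # Toy model of the bootstrapping loop: each generation, better tools make
    # better computers, and those computers make the *next* tool generation
    # improve even more. The rates are invented, purely illustrative.
    compute = 1.0          # relative computing capability
    tool_gain = 1.2        # per-generation improvement delivered by current tooling
    for generation in range(1, 11):
        compute *= tool_gain      # better tools -> better computers
        tool_gain += 0.05         # better computers -> the next tooling generation improves more
        print(f"generation {generation:2d}: compute is {compute:5.1f}x the starting point")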

1
  • Comments are not for extended discussion; this conversation has been moved to chat.
    – Monty Wild
    Commented Nov 1, 2019 at 1:31
28

Probably, but it would take a while.

The transistor and solid-state electronics were already known. Computers that do binary math were already well established, so once they got hold of the thing they could figure it out. The microprocessor was only 2 years away. Magnetic storage had been invented long before. The display could be figured out. From a theoretical standpoint, almost everything was there...

BUT Not everything

While the computer would be wildly more advanced than most engineers of the time could have initially imagined, they could see that it was possible. What they would be missing is the entire modern supply chain and the super-precise fabrication facilities that make a modern computer possible. They would have a goal, but they would have to build their way up to it, and that means a lot of iterative design to get there.

An unintended consequence is that you might kick off a government time travel program. Moore's law was already known. If they could figure out how powerful the machine is, then compare that to how powerful their own machines were at the time, they could guess roughly how far in the future the laptop came from. This would be confirmed by the copyright dates in the BIOS. So they would know time travel had happened. Cold War paranoia would kick in just in case it was an evil Russian kind of thing, and we would want to be first to get time travel, for national security of course.
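
A toy version of that back-extrapolation (a sketch only; the doubling period and the performance ratio are illustrative assumptions, not measurements):

    import math
    # If the mystery machine is R times more capable than the best 1969 hardware,
    # and capability doubles every T years, it came from about log2(R) * T years ahead.
    doubling_period_years = 2.0    # assumed Moore's-law doubling period
    performance_ratio = 1e6        # assumed capability ratio vs. 1969 machines
    years_ahead = math.log2(performance_ratio) * doubling_period_years
    print(f"Roughly {years_ahead:.0f} years ahead, i.e. from around {1969 + round(years_ahead)}")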

15
  • 10
    +1 for exposing time travel. Commented Oct 29, 2019 at 15:14
  • 1
    Granted, I'm not up on Apple products, but if it were any other PC, when you turn it on you see the normal POST notifications from the BIOS before the machine even gets to the OS, and you will visibly see the memory check indicating RAM. If you hit the Esc key and go into the BIOS proper you will see clear navigation instructions. Some will display the clock rate, size of the hard drive, memory and so on. It's not hard to decipher for a reasonably intelligent NSA engineer type.
    – Paul TIKI
    Commented Oct 29, 2019 at 17:36
  • 1
    True enough, but if you see "press esc for setup" and you look and hey, there is a key that says esc.... This is quite literally how I learned about what a BIOS is and how to manipulate it. "press esc for setup" is the first thing on the screen when I turn on my laptop.
    – Paul TIKI
    Commented Oct 29, 2019 at 19:52
  • 5
    @Harper That explains a great deal. As the fae detest iron and I used an old steel tool to open my last Macbook (10gb HDD, don't even remember what it was called), the elf now hates me. He has told his friends and they hate me too. This is why I have never been able to use a Macbook since.
    – Paul TIKI
    Commented Oct 29, 2019 at 20:10
  • 2
    in case it was an evil Russian kind of thing - also, top artists would try to deduce the evolutionary path from the sickle and hammer to the Apple logo; based on that, political scientists would have to deduce how Russian political thought has changed, and what depths of pure evil it has reached.
    – Headcrab
    Commented Oct 30, 2019 at 1:02
12

Yes, but it will take some time.

It is a common fallacy that all the true scientific progress that has been made in the world has been made in the last few decades. Sure, we seem to have more gadgets today than we did in the 1960s, but the concepts on which they have been built have been understood for some time.

There is an interesting blog post on progress between 1885 - 1950 compared to 1950 - 2015 that highlights this well. The point being that in the late 1960s we already had computers - we'd been putting them in Apollo spacecraft, after all. We had screens (albeit CRT) that had been used for broadcast receiving for 40 years or so in the US (the first TV channel started broadcasting there in 1928), and the integrated circuit was already 10 years old.

So, conceptually, your laptop is just a VERY advanced version of a lot of technologies we already knew about. Even the battery could be reverse engineered with enough time and effort, so all things considered, it would most certainly be possible. The US science domain was (arguably) at its peak around that time thanks to Apollo, so all in all the ability to reverse engineer such an artefact was available.

The problem is, you only get one of them. Once you pull it apart completely, you either need to know how to reassemble it perfectly, or you need to keep researching until you can create your own. In other words, you don't get a spare to keep operating so you can watch the software in action, or experiment with yanking out this or that large component and putting it back in so that you know what to do with the bits on your workbench. Once you disassemble it to reverse engineer it, the chances of it ever working again are slim, and you're committed to creating a new one from scratch by researching the parts in front of you.

As an aside: given that the first PCs were coming out in the late 70s and early 80s with CRT monitors, given the effort that would be involved in reverse engineering this technology before designing something for the commercial sector, and given the exponential growth of this technology over the last few decades, if time travel were possible then it's entirely plausible that your hypothetical scenario is exactly what happened...

5
  • 1
    I would worry that this could end up being counterproductive. If somebody in government ended up showing IBM the type of device that would ultimately make them a ghost of their former selves, they might notice the similarities with developing tech and kill off MS and Apple more actively. The MacBook-equivalent might not be the recipient of as much underdog culture.
    – Zwuwdz
    Commented Oct 28, 2019 at 22:46
  • 4
    Yes, but the good question is - how long would it take to recreate? For example, scientists can study 10nm-process microchips all they want, but it would still take many years of work to successfully reproduce one.
    – Alexander
    Commented Oct 28, 2019 at 22:50
  • 2
    @Alexander They can study it but they still don't have access to the machines that made it thus can't really replicate it. At best the chip shows them what is possible just not how to make it.
    – Thorne
    Commented Oct 29, 2019 at 1:17
  • 3
    @Zwuwdz: Why do you think IBM is now a ghost of its former self? In 1969 it had revenues of 7.19 billion (49.20 billion corrected for inflation) : ibm.com/ibm/history/history/year_1969.html In 2018, its revenue was 79.6 billion. Sure, it has a smaller share of the total tech market, but that market is a heck of a lot bigger.
    – jamesqf
    Commented Oct 29, 2019 at 3:17
  • The 1800's had a lot of breakthroughs, but you and your blog link do a disservice to current technology. steemit.com/philosophy/@rsguardian/limitless-alih9vcs0x explainthatstuff.com/timeline.html Commented Oct 29, 2019 at 20:00
10

Recreate something just like it, highly unlikely. However, it's likely they could extract a lot of knowledge about the right directions to research in that would accelerate development by decades, particularly:

  • CMOS logic
  • flash
  • choices about what data channels should be serial vs parallel
  • differential signaling (see the sketch below)
  • line codes (8b/10b, etc.)
  • PCB design
  • modern LEDs
  • modern LCDs
  • battery chemistry

not to mention all kinds of UI/UX concepts. You wouldn't end up with a modern laptop anytime soon, but with a huge amount of funding poured into it, they might be able to make some sort of laptop with a mix of characteristics from various machines between the 80s and now. My guess would be that it would be on the lower end in memory capacity and clock speeds, but with relatively modern UI elements.
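
As one concrete example from the list above, here's a tiny toy demonstration of why differential signaling is such a useful idea to learn early: the receiver looks only at the difference between the two wires, so noise that hits both wires equally cancels out (toy voltages, not any real standard):

    import random
    # Toy differential-signaling demo: a bit is sent as complementary voltages on a
    # pair of wires; identical common-mode noise is added to both and cancels out.
    random.seed(1)
    for bit in [1, 0, 1, 1, 0]:
        p, n = (0.5, -0.5) if bit else (-0.5, 0.5)   # transmit +/-0.5 V differentially
        noise = random.uniform(-2.0, 2.0)            # big noise spike, same on both wires
        p, n = p + noise, n + noise
        received = 1 if (p - n) > 0 else 0           # receiver sees only the difference
        print(f"sent={bit} noise={noise:+.2f}V diff={p - n:+.2f}V received={received}")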

And of course if the laptop you left happened to have Wikipedia bookmarked and some of it cached and available in offline browsing, they'd have an even bigger head start.

3
  • 1
    I think other answers show definitively that the answer is no, but your answer shows that, even though the answer is no, there's still a ton of value there. +1.
    – Michael W.
    Commented Oct 30, 2019 at 2:38
  • 1
    How you'd non-destructively reverse-engineer some of these things with 1960s technology would be an amazing topic in itself. Commented Oct 30, 2019 at 2:42
  • Among these, CMOS might be the most underappreciated. It was just emerging in the mid 60s, but realizing it would be the right long-term answer for digital logic would be huge. The alternatives we wasted decades fooling around with have massive fundamental power consumption and heat dissipation issues that make a laptop with any significant computational power a non-starter. Commented Oct 30, 2019 at 15:32
4

They won't replicate the MacBook, because depending on which software it is loaded with, it is more useful intact and running than disassembled - and if they won't disassemble it, they won't replicate it.

If it came with Xcode, gcc and other development tools, it will change the course of programming, because the scientists will learn how modern programming works and apply this knowledge to their own systems. Also, if you brought along tools for machine learning/neural networks and such, the scientists won't waste decades on perceptrons and other early attempts at AI; they will go right into deep neural networks, changing the course of computation even more.

Another thing is that a MacBook is many times more powerful than any computer they could build then - even the first Cray was barely comparable with a 486/Pentium 100. They will use your Mac to run climate models, simulate nuclear weapons and process astronomical data.

9
  • 7
    Modern programming works that way because we have the CPU capacity and memory for it to work that way. You wouldn't really be able to advance to a higher programming language without having fast CPUs (and GPUs) and memory to handle them. It was not by luck that the C compiler was designed as a single-pass processor (with headers). Sure, you can avoid some wrong turns and go quicker, but people learn from errors.
    – Sulthan
    Commented Oct 29, 2019 at 14:19
  • 2
    @Sulthan perhaps more relevantly, a great lot of modern programming-language ideas were actually already discovered by 1965. Lisp, Algol, Simula were much more advanced than many of the languages that later dominated the kind of “old-school, primitive coding” that we may associate with the 70s and 80s. Commented Oct 29, 2019 at 17:02
  • 2
    @jamesqf, that's a good question. Are DNNs really useful, and would they be useful with sixties' datasets? I don't know.
    – Geronimo
    Commented Oct 29, 2019 at 20:57
  • 2
    I don't think there's enough computer storage in the whole world back then to run a modern DNN at all. The storage infrastructure just isn't there, nevermind the access speeds. Just because you have a single computer that is from the future, how would you get the data inside? And it's a Macbook too! You'll never be able to hook it up to transfer data. We have problems now with the damn port fiasco...
    – Nelson
    Commented Oct 30, 2019 at 2:32
  • 1
    @Nelson: that's a really good point; without a serial port there's no "simple" port. Even ethernet is non-simple. Although it's really only gigabit ethernet that's crazy (both sides send in both directions at once over 4 pairs, to keep frequencies down). 10baseT is just a send and receive pair, and it's plausible that could get reversed, if they could get it to auto-negotiate to 10baseT. If any kernel / driver source code was sitting around on the laptop with wifi drivers, discussion in comments of QAM might lead to an early revolution in data radio... Commented Oct 30, 2019 at 3:54
2

Could they feasibly replicate it? They'd have far more chance of replicating it if you'd taken a crate of identical MacBooks with you, because the probability would be incredibly high that someone investigating it would do something irreversibly destructive before anyone understood what the device or component did. With no prior knowledge of that level of circuit integration or component miniaturization, the overwhelming likelihood is that someone would use a tool that allowed a small leakage current to burn out microprocessor gates, would crack a component by applying too much force, or would melt some plastic element because they used far too big a soldering iron (or used it for far too long, because it's got lead-free solder that melts hotter than they'd expect). Even following the signals in the system would push 1969 technology way beyond its limits - for example, the clock speeds are far higher than 1969 oscilloscopes could measure.

Or with only one of them available, they'd not be allowed to do enough investigation because it was the only one of its kind and far too precious as a working object to take to pieces.

But simply knowing that it would be possible to get things that small would prompt a huge research leap and a huge technology surge. Just think about the power supply alone... a modern switch-mode power supply taking AC mains and turning it into 19V DC at many amps at 90+% efficiency in a few cubic inches of space, compared to a 1960s-era linear power supply occupying thousands of cubic inches and pouring out huge amounts of heat. And probably full of big valves.

But even with a crate of machines to play with, it would take years of work before the technology was good enough to even understand what went on in the components of the laptop, let alone stand a chance of replicating it. It's taken half a century to get from there to here in technology terms; I doubt that even having a working example would shorten that time by half. And of course, as you've protected your laptop with the latest encryption algorithms, they'd have a mammoth codebreaking task just to get into the software on it!

2

Recapitulating 50 years of fabrication progress would probably take them... about 50 years.

As an analogy, we know a lot about how a living organism works, but we can't synthesize one. Frankly, given current feature sizes, our microprocessors are closer to a biological organism than they are to 1960s electronics.

In 1969, we had transistors and primitive integrated circuits. Engineers would probably be able to guess that the tiny things soldered to those boards are semiconductor packages -- but they'd have no hope of reverse-engineering them, through non-destructive or destructive investigation. There's just too much of a gulf.
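
A rough way to see why "about 50 years" is a reasonable guess (a toy model: it borrows the 10 micrometer / 10 nanometer figures quoted in the top answer and assumes the classic ~30% linear shrink per process generation):

    import math
    # How many ~0.7x linear-shrink generations separate a ~10 um process from ~10 nm?
    start_nm, target_nm = 10_000, 10
    shrink_per_generation = 0.7     # assumed classic ~30% shrink per node
    generations = math.log(target_nm / start_nm) / math.log(shrink_per_generation)
    print(f"About {generations:.0f} process generations of fab development")

At the historical cadence of roughly two to three years per generation, that's several decades of fab, materials and tooling work even with a perfect roadmap.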

1
  • Actually, we don't fully understand how a living organism works. We have made considerable progress in the last decades, but there's still a lot of detail that we simply haven't figured out yet.
    – toolforger
    Commented Oct 31, 2019 at 6:21
2

I estimate that starting in 1969, if handed a MacBook the US government could reproduce it, software and all, in 100 years, at an estimated expenditure of $10B/year (1969 dollars). It would, of course, be turned over to the CIA who would launch a secret project to analyze it and they'd spend years analyzing, considering, re-hashing, and etceterizing everything, without bothering to make much progress. The Soviets would get wind of it, and through a back-door channel between the CIA and KGB in Vienna they'd get some incomplete schematics and would spend the entire GDP of Moldavia on trying to flesh it all out. Xerox would get involved because, y'know, why not, but although they'd be able to simulate the thing on a computer the size of a refrigerator they'd get distracted by this or that or something else and in the end, although they'd produce a lot of interesting stuff, they wouldn't produce anything useful.

But ultimately there'd have to be a couple of geeks in San Jose who'd start a little company after their employer turned them down, who'd develop the same thing in about 25 years.

2

In addition to the valid points about physical construction, modern computers are too complicated for paper and pencil design to be practical. Designing a generation N computer requires many generation N-1 computers to store and manage the design documents, run simulations, produce layouts, check timings, etc. 1960's computers were simple and small enough to be designed without using computers. Current computers are not.

If you take a typical consumer laptop back, it is very unlikely to have much digital design software on it, so that would have to be developed from scratch.

Similarly, a 1960s computer would not be fast enough or have enough memory to run compilers for a modern object-oriented language.

It might be possible to avoid some dead ends, and so get to current technology a bit faster, but most of the intermediate steps would have to happen, and take almost as long even knowing where one is heading.

1

In 1961, IBM delivered the 7030 Stretch, a fully transistorized supercomputer; it cost about 7 million dollars and drew about 100 kilowatts. You would need roughly 60,000 of them to match a modern laptop, which works out to about 6 gigawatts - roughly three Hoover Dams' worth of generation (the Hoover Dam is about 2 gigawatts), but still within reach of national hydroelectric capacity.

https://en.wikipedia.org/wiki/Instructions_per_second#Timeline_of_instructions_per_second

Move forward a few years and the picture improves quickly: computing efficiency was improving by roughly an order of magnitude every five years, so by 1965 the power needed per laptop-equivalent would already be several times lower, and it keeps falling with every hardware generation. So yes, it's achievable at a cost of 10-20 billion dollars to process at the same speed, and if you process slower, you divide the cost and equipment.

https://en.wikipedia.org/wiki/IBM_7030_Stretch
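
Spelling out the power arithmetic with the figures quoted above (only the numbers already in this answer; nothing else assumed):

    # Power needed for a "laptop made of IBM 7030 Stretches", from the figures above.
    stretch_power_kw = 100       # per machine
    machines_needed = 60_000     # rough laptop-equivalent count
    hoover_dam_gw = 2            # Hoover Dam output
    total_power_gw = machines_needed * stretch_power_kw / 1e6   # kW -> GW
    print(f"{total_power_gw:.0f} GW total, about {total_power_gw / hoover_dam_gw:.0f} Hoover Dams")
    # Processing at half speed halves the machine count and the power bill,
    # which is the "process slower, divide the cost and equipment" point above.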

0

No, as Cadence points out, the fabrication techniques would be beyond them.

However, they could learn a lot by reverse-engineering it. If they’re willing to destroy their one copy of the machine, or if they have more than one, x-ray analysis of the CPU would reveal its fine structure. Large portions of that would still work at a larger size, just slower. (Historically, it’s gone the other way: the first chips being made with a new process are typically an older design, but shrunk to be faster.) Especially if it’s a RISC design, it would be useful well before they could replicate modern feature sizes.

Similarly, if they could figure out what the machine-language instructions do, they could reverse-engineer the binaries on the machine and learn a lot that way. If there’s source code on it, examining that would help them design a more futuristic programming language (perhaps with the help of a reverse-engineered parser-generator). There are a lot of components of the system that they could begin using to improve their own computers well before they could exactly clone a MacBook. And they don’t need to, unless they have a big library of MacBook software and peripherals they need to use.

0

No. At one point during the Apollo program of the mid-1960s, over one half of all ICs in production in the world were being sucked up by the space program. Also, one third of all engineering and math professionals in the USA were working either for the government or for private contractors involved in the space program. That means that we would have zero excess production capacity or human resources to devote to such a project.

11
  • You don't think the discovery of a time-travelled artifact from the future would motivate anyone to work on it? The other 2/3rds of those people are working in industry or academia and many of them would be extremely interested in it! Commented Oct 30, 2019 at 4:00
  • Those other people were needed to keep the rest of the country running. Commented Oct 30, 2019 at 4:05
  • Uh huh. So you don't think anyone working on developing new computers or software would have wanted or been allowed to take time out of their day to have a look at this future tech? Academics working in math, computing, and electrical engineering departments don't need to devote their day-to-day activity to "keeping the rest of the country running". Instead of working on a paper on some idea they had last year, many of them would be spending their time (trying to) get new ideas much faster from playing with this future tech. Especially once anyone builds ethernet to allow remote terminals Commented Oct 30, 2019 at 4:09
  • @PeterCordes No, Paul has a point. The hardest part about making anything is the necessary tools and infrastructure. These are things that are not very obvious when examining a part, especially just one example of it (even having multiple identical examples helps, due to tolerances), and a 60-year gap just obfuscates things further. Examining something and understanding how it works is only half the battle (maybe even less).
    – DKNguyen
    Commented Oct 30, 2019 at 23:01
  • 1
    In 1969 I'd just want to drive the darned DeLorean and to heck with the computer. :-) Commented Oct 30, 2019 at 23:04
0

If you enabled FileVault encryption, then the contents of the hard drive would be unusable to them. They would probably have better luck reverse engineering the keyboard, display, and networking devices. Just knowing what materials the CPU is fabricated from, and its basic layout, would jumpstart their chip design. The project would start off slow - maybe nothing functional for several years - but then accelerate at an exponential rate.

0

No.

Even if you know everything about how a modern integrated chip works (dubious -- you need really advanced X-ray microscopy that's only being invented now), you still have no idea how to build one.

Going from the sort of chip fabs that existed in the 1960s to the sort that make modern laptop CPUs took 60 years of hard work from a ton of people. Having a few chips to study won't help that process along. Nor would government pressure help much.

If you left a complete fab in the past, that would be helpful.

