160

I'm currently developing a web application for government land planning. The application runs mostly in the browser, using ajax to load and save data.

I will do the initial development, and then graduate (it's a student job). After this, the rest of the team will add the occasional feature as needed. They know how to code, but they're mostly land-planning experts.

Considering the pace at which Javascript technologies change, how can I write code that will still work 20 years from now? Specifically, which libraries, technologies, and design ideas should I use (or avoid) to future-proof my code?

  • 96
    I started programming in Fortran in late 1966, so I've had plenty of time to think about exactly that kind of issue. If you ever come across an even-50%-reliable answer, please let me know. Meanwhile, just think of the almost-certain inevitable obsolescence as "job security" :) Commented Oct 13, 2016 at 8:19
  • 11
    Nothing lasts forever in Software Engineering. Only HOST systems at banks, and only because nobody dares to update such critical systems. Well, I guess the program running in the Voyager probes also counts.
    – Laiv
    Commented Oct 13, 2016 at 8:22
  • 9
    @Laiv Some time back, I worked on money transfer applications for Bankers Trust using Swift messaging running on Vax/VMS. A few years later, Swift eol'ed (end-of-life'ed) all VMS support. Boy, did that cause some problems ... and provided me with yet another contract at BTCo. Like I said above, "job security":). Anyway, my point is that even critical financial market applications aren't immune to obsolescence. Commented Oct 13, 2016 at 8:36
  • 102
    How about "Write code that the next developer can understand"? If and when the code becomes obsolete to the point that they will need to find a programmer to update it, the best scenario is that they will understand what your code is doing (and maybe why certain decisions were made). Commented Oct 13, 2016 at 13:30
  • 38
    Just use plain old HTML, no JS, no plugins, nothing fancy. If it works in Lynx, it's good for all time.
    – Gaius
    Commented Oct 13, 2016 at 20:17

8 Answers

137

Planning software for such a lifespan is difficult, because we don't know what the future holds. A bit of context: Java was released in 1995, 21 years ago. XMLHttpRequest first became available as a proprietary extension in Internet Explorer 5, released in 1999, 17 years ago. It took about 5 years until it became available across all major browsers. The 20 years you are trying to look ahead are just about the time rich web applications have even existed.

Some things have certainly stayed the same since then. There has been a strong standardization effort, and most browsers conform well to the various standards involved. A web site that worked across browsers 15 years ago will still work the same, provided that it worked because it targeted the common subset of all browsers, not because it used workarounds for each browser.

Other things came and went – most prominently Flash. Flash had a variety of problems that led to its demise. Most importantly, it was controlled by a single company. Instead of competition inside the Flash platform, there was competition between Flash and HTML5 – and HTML5 won.

From this history, we can gather a couple of clues:

  • Keep it simple: Do what works right now, without having to use any workarounds. This behaviour will likely stay available long into the future for backwards-compatibility reasons.

  • Avoid reliance on proprietary technologies, and prefer open standards.

The JavaScript world today is relatively volatile with a high flux of libraries and frameworks. However, nearly none of them will matter in 20 years – the only “framework” I'm certain that will still be used by then is Vanilla JS.

If you want to use a library or tool because it really makes development a lot easier, first make sure that it's built on today's well-supported standards. You must then download the library or tool and include it with your source code. Your code repository should include everything needed to get the system runnable. Anything external is a dependency that could break in the future. An interesting way to test this is to copy your code to a thumb drive, go to a new computer with a different operating system, disconnect it from the internet, and see whether you can get your frontend to work. As long as your project consists of plain HTML+CSS+JavaScript plus perhaps some libraries, you're likely going to pass.
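To make the "Vanilla JS over dependencies" point concrete: many small library uses can be replaced by a few lines written against long-standardized language features. The helper below is a hypothetical sketch (the function name and use case are not from the original answer); it builds a query string using only `encodeURIComponent`, which has been part of the language since ECMAScript 3 (1999).

```javascript
// Hypothetical sketch: a dependency-free query-string builder that relies
// only on encodeURIComponent (standardized in ECMAScript 3, 1999). A URL
// library would do the same job, but it would be one more dependency that
// has to survive the next 20 years.
function toQueryString(params) {
  return Object.keys(params)
    .map(function (key) {
      return encodeURIComponent(key) + "=" + encodeURIComponent(params[key]);
    })
    .join("&");
}
```

For example, `toQueryString({parcel: "12-B", zone: "rural"})` yields `"parcel=12-B&zone=rural"`, with no third-party code involved.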

  • 4
    Large-scale applications are unmaintainable in vanilla JS, as of now. ES6 already somewhat fixes the issue, but there is a reason why Flow or TypeScript are gaining popularity.
    – Andy
    Commented Oct 13, 2016 at 12:00
  • 35
    @DavidPacker Absolutely, TypeScript etc. are great and make development easier. But as soon as I introduce a build process, all the tools required for the build process become dependencies: NodeJS, Gulp, NPM – who says NPM will still be online in 20 years? I'll have to run my own registry to be certain. This is not impossible. But at some point, it's better to let go of things that make development easier only immediately, but not in the long run.
    – amon
    Commented Oct 13, 2016 at 12:29
  • 32
    @DavidPacker There are many dynamic languages, and surprisingly, many successful systems have been built with Smalltalk, Ruby, Perl, Python, even with PHP and JS. While statically typed languages tend to be more maintainable and dynamic languages tend to be better for rapid prototyping, it's not impossible to write maintainable JS. In the absence of a compiler, high median skill in the team, craftsmanship, and extra emphasis on clear code organization become even more crucial. I personally think types make everything easier, but they're no silver bullet.
    – amon
    Commented Oct 13, 2016 at 13:09
  • 4
    Did I just read "take usb and test on different machine"? Why not just spin up virtualbox or just use incognito mode (with ethX disabled).
    – Kyslik
    Commented Oct 13, 2016 at 15:58
  • 5
    I’m not certain vanilla JS will be a sure thing 20 years from now. Its history was rocky and experimental, and it’s picked up a fair amount of cruft along the way, even as it has emerged as a delightful and effective language (I personally prefer JavaScript or TypeScript myself). It’s not hard to imagine that vendors may well want to ditch some or all of that cruft, whether it means starting to offer a new alternative language—as Google seemed to be proposing with Dart, however much that doesn’t seem to have gone anywhere—or by deprecating and then eliminating portions of JS.
    – KRyan
    Commented Oct 13, 2016 at 17:19
182

What is even more important than your code surviving for 20 years is that your data survives for 20 years. Chances are, that's the thing worth preserving. If your data is easy to work with, building an alternate system on top of it with newer technology will be easy.

  • So start with a clear and well documented data model.
  • Use an established, well supported database system, such as Oracle[1] or SQL Server.
  • Use basic features, don't try to squeeze in flashy new ones.
  • Prefer simple over clever.
  • Accept that future maintainability can come at the expense of aspects like performance. For instance, you might be tempted to use stored procedures, but these might limit future maintainability if they prevent someone from migrating the system to a simpler storage solution.

Once you have that, future-proofing the app itself is simpler, because it's a wrapper around the data model, and can be replaced if, in 10 years, no one uses Javascript anymore, for instance, and you need to migrate the app to WASM or something. Keeping things modular, less interdependent, allows for easier future maintenance.
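The "app as a wrapper around the data model" idea can be sketched like this (the names and the in-memory `Map` are illustrative assumptions, not part of the original answer): route every read and write through one small module, so either side can be replaced later.

```javascript
// Illustrative sketch: a single narrow module owns all access to parcel
// records. The in-memory Map stands in for whatever database is really
// behind it; a future team could swap in a different store, or a different
// UI, without touching the other side.
function createParcelStore() {
  var records = new Map();
  return {
    save: function (id, parcel) { records.set(id, parcel); },
    load: function (id) { return records.get(id); },
    count: function () { return records.size; }
  };
}
```

The point is the narrow surface area: anything that outlives its technology stack does so one small, replaceable piece at a time.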


[1] Most comments to this answer take a strong stance against using Oracle for a DB, citing a lot of perfectly legitimate reasons why Oracle is a pain to work with and has a steep learning curve and installation overhead. These are entirely valid concerns when choosing Oracle as a DB, but in our case, we're not looking for a general-purpose DB, but one where the primary concern is maintainability. Oracle has been around since the late 1970s and will probably be supported for many years to come, and there's a huge ecosystem of consultants and support options that can help you keep it running. Is this an overpriced mess for many companies? Sure. But will it keep your database running for 20 years? Quite likely.

  • 142
    I'm sorry, but I have to say this. If you use Oracle, you're shooting everyone in the foot with regards to "easy to work with." Oracle is not easy to work with in the slightest. A great deal of functionality that SQL Server, PostgreSQL, and probably even MySQL make simple, Oracle either flat out doesn't have or makes overly difficult. I never have as many stupid problems with other DBs as I have with Oracle; even just setting up the client is a huge pain in the butt. Even Googling things is hard. If you want "easy to work with," stay away from Oracle.
    – jpmc26
    Commented Oct 13, 2016 at 22:17
  • 4
    +1 for keeping the data as simple as possible. Use standard SQL for this, e.g. use OUTER JOIN instead of the Oracle-specific + operator. Use simple table layouts. Don't normalize your tables to the absolute maximum level. Decide if some tables can have redundant data or if you really must create a new table so that every value exists only once. Are stored procedures vendor-specific? If yes, then don't use them. Don't use the hottest feature of your current language of choice: I've seen more COBOL programs without OOP features than with them. And that's totally ok.
    – some_coder
    Commented Oct 14, 2016 at 6:46
  • 3
    @jpmc26 I agree with your sentiments about Oracle, but as I said, "easy to work with" isn't necessarily the main requirement here. I prefer a solidly supported platform here, even if it's a pain to work with. Because when amortized over 20 years, it's not too bad. Commented Oct 14, 2016 at 7:15
  • 8
    Indeed avoid Oracle. The only DB in existence today that is likely to not look like a bad choice in 20 years is Postgresql.
    – Joshua
    Commented Oct 14, 2016 at 18:55
  • 3
    I'd like to add that great open source DBMS are preferable because there is a good chance they won't die. If Oracle stops making money in 10 years, then in 11 it will be gone. PostgreSQL seems like the best horse to bet on.
    – Shautieh
    Commented Oct 15, 2016 at 6:32
38

The previous answer by amon is great, but there are two additional points which weren't mentioned:

  • It's not just about browsers; devices matter too.

    amon mentions the fact that a “web site that worked across browsers 15 years ago will still work the same”, which is true. However, look at the websites created not fifteen, but ten years ago, which, when created, worked in most browsers for most users. Today, a large part of users won't be able to use those websites at all, not because browsers changed, but because devices did. Those websites would look terrible on the small screens of mobile devices, and might eventually not work at all if the developers decided to rely on the JavaScript click event without knowing that the tap event is also important.

  • You're focusing on the wrong subject.

    Technology changes are one thing, but a more important one is changes in requirements. The product may need to be scaled, or may need to have additional features, or may need its current features to be changed.

    It doesn't matter what will happen to browsers, or devices, or W3C, or... whatever.

    If you write your code in a way it can be refactored, the product will evolve with technology.

    If you write your code in a way nobody can understand and maintain it, technology doesn't matter: any environmental change will bring your application down anyway, such as a migration to a different operating system, or even a simple thing as natural data growth.

    As an example: I've worked in software development for ten years. Among the dozens and dozens of projects, there were only two I decided to change because of technology, more precisely because PHP evolved a lot over the last ten years. It wasn't even the customer's decision: he couldn't have cared less whether the site uses PHP's namespaces or closures. However, there were plenty of changes related to new requirements and scalability!

  • 4
    Adapting to different screen sizes is a general problem. Mobile is the hyped thing at the moment, but if you are looking at this website in a full-screen browser window on a screen with enough resolution, there's a lot of empty (wasted) space. Changing layouts and how information is presented to best use the available pixels never really happened in a smart way. Mobile made this obvious. But thinking in the other direction might be more important for the question at hand.
    – null
    Commented Oct 13, 2016 at 12:46
  • 9
    @null: while I agree with your comment, StackExchange websites may not be the best illustration of your point. Given the data to display, I believe StackExchange designers/developers did a great job of displaying it as it needs to be displayed, including on large monitors. You can't make the main column wider, because text would become much more difficult to read, and you can't use multiple columns because it won't look nice for short questions and answers. Commented Oct 13, 2016 at 12:56
  • Another good example is the 'hover' event that was often used in menu systems. Many of those menus fail miserably with touch devices.
    – Justas
    Commented Oct 13, 2016 at 15:59
  • You're 110% (or more) right about devices, and I can provide you with decades-older examples. Back in the late 1980s I worked on CICS applications running on IBM mainframes and synchronous 3270 terminals. The CICS region is kind of analogous to server-side apps, sending screen-fulls of data at a time to the synchronous terminals, which are thus analogous to dedicated-device browsers. And CICS programming was maybe 80% COBOL, 20% PL/1. Both those languages are mostly obsolete nowadays, and the appearance of Unix workstations (Sun and Apollo) in the early 1990s pretty much killed CICS entirely. Commented Oct 17, 2016 at 11:06
32

You do not plan to last 20 years. Plain and simple. Instead you shift your goals to compartmentalization.

Is your app database-agnostic? If you had to switch databases right now, could you? Is your logic language-agnostic? If you had to rewrite the app in a totally new language right now, could you? Are you following good design guidelines like SRP and DRY?

I have had projects live for longer than 20 years, and I can tell you that things change. Like pop-ups: 20 years ago you could rely on a pop-up; today you cannot. XSS wasn't a thing 20 years ago; now you have to account for CORS.

So what you do is make sure your logic is nicely separated, and that you avoid using ANY technology that locks you in to a specific vendor.

This can be very tricky at times. .NET, for example, is great at exposing logic and methods for its MSSQL database adapter that don't have equivalents in other adapters. MSSQL might seem like a good plan today, but will it remain so for 20 years? Who knows. An example of how to get around this is to have a data layer totally separate from the other parts of the application. Then, worst case, you only have to rewrite the entire data layer; the rest of your application stays unaffected.
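A totally separate data layer can be sketched like this (all names here are hypothetical, and the in-memory adapter stands in for a SQL-backed one): business logic talks only to a narrow interface, so a database change means rewriting one adapter, not the application.

```javascript
// Hypothetical sketch of a separated data layer. The report service knows
// nothing about the database; it only calls the adapter's interface.
function makeReportService(dataLayer) {
  return {
    parcelSummary: function (id) {
      var parcel = dataLayer.findParcel(id);
      return parcel ? parcel.owner + ": " + parcel.area + " ha" : "not found";
    }
  };
}

// One adapter among many. A SQL-backed adapter would expose the same
// findParcel signature; worst case, only this layer gets rewritten.
function inMemoryAdapter(rows) {
  return {
    findParcel: function (id) { return rows[id] || null; }
  };
}
```

Swapping storage then means writing a new adapter with the same `findParcel` shape and handing it to the same service.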

In other words, think of it like a car. Your car is not going to make it 20 years. But with new tires, a new engine, a new transmission, new windows, new electronics, and so on, that same car can be on the road for a very long time.

  • 2
    "If you had to switch data-bases right now, could you" This is nigh impossible to accomplish if you do anything more than CRUD on one row at a time.
    – jpmc26
    Commented Oct 13, 2016 at 22:26
  • 1
    Plenty of ORMs are database-agnostic. For any one of the projects I am working on, I could guarantee that I could switch from SQLite to MySQL or PostgreSQL with no effort.
    – coteyr
    Commented Oct 13, 2016 at 23:21
  • 5
    And ORMs cease to be very good tools for the job when you do more than simple CRUD on a single record at a time. That's why I qualified it. I've tried. As query complexity grows, even the best ORMs become more trouble than just writing the query, and even if you force your query into them, you pretty quickly find yourself using database specific features or optimizations.
    – jpmc26
    Commented Oct 13, 2016 at 23:22
  • 1
    Define "complex". Was this a bulk operation? Did it include window queries? Subqueries? CTEs? Unions? Complex grouping conditions? Complex math on each row and the aggregates? How many joins in a single query? What kinds of joins? How many rows were processed at once? Admittedly, saying anything over single row CRUD (Mind you, this means one row per query, not per web request or whatever.) is a bit of hyperbole, but the road to when the ORM becomes more trouble than it's worth is much shorter than you think. And the steps to making a query perform well are very frequently database specific.
    – jpmc26
    Commented Oct 13, 2016 at 23:34
  • 4
    "Is your app database agnostic? If you had to switch databases right now, could you? Is your logic language agnostic? If you had to rewrite the app in a totally new language right now, could you?" - This is ABSOLUTELY TERRIBLE advice! Don't constrain yourself artificially to whatever you think the lowest common denominator of programming languages or databases is - this will force you to reinvent the wheel constantly. Instead, try to find the NATURAL way to express the desired behaviour in your programming language and database of choice.
    – fgp
    Commented Oct 14, 2016 at 13:38
12

The answers by @amon and some others are great, but I wanted to suggest you look at this from another perspective.

I've worked with Large Manufacturers and Government Agencies who were relying on programs or code-bases that had been used for well over 20 years, and they all had one thing in common -- the company controlled the hardware. Having something running and extensible for 20+ years isn't difficult when you control what it runs on. The employees at these groups developed code on modern machines that were hundreds of times faster than the deployment machines... but the deployment machines were frozen in time.

Your situation is complicated, because a website means you need to plan for two environments -- the server and the browser.

When it comes to the server, you have two general choices:

  • Rely on the operating system for various support functions which may be much faster, but means the OS may need to be "frozen in time". If that's the case, you'll want to prepare some backups of the OS installation for the server. If something crashes in 10 years, you don't want to make someone go crazy trying to reinstall the OS or rewrite the code to work in a different environment.

  • Use versioned libraries within a given language/framework, which are slower, but can be packaged in a virtual environment and likely run on different operating systems or architectures.

When it comes to the browser, you'll need to host everything on the server (i.e. you can't use a global CDN to host files). We can assume that future browsers will still run HTML and Javascript (at least for compatibility), but that's really a guess/assumption and you can't control that.

  • 11
    You have to consider security too. A 20-year old unsupported OS will probably be full of security holes. I worked for a company and inherited this problem. Government agency, ancient OSs (all long virtualised, fortunately), but this was a huge problem, and upgrading was nigh impossible due to having to completely rewrite the software (hundreds of individual spaghetti-code PHP scripts, each of which had the database calls hardcoded, using deprecated functions that the new driver didn't support /shudder).
    – user203448
    Commented Oct 13, 2016 at 18:37
  • If you go the OS route, at best you can hope that security patches were applied, and that future maintainers will be able to shield stuff at the networking layer. In order to plan for stuff to work like this in the long term (especially in the absence of a large budget, as the OP is a student) you basically need to accept that your application and server will eventually become insecure. For example, in 20 years there will eventually exist known exploits for the SSL version on the server... but that OS may not be compatible with newer OpenSSL versions in 10 years. This is all about minimizing tradeoffs. Commented Oct 13, 2016 at 20:50
  • @FighterJet, you can always run a firewall on a supported OS; then you have few risks apart from SQL injection etc., which you should have coded for anyway.
    – Ian
    Commented Oct 18, 2016 at 11:56
  • @Ian: I wish. There was a firewall. But I didn't write the code, I inherited it. And yes, there were thousands of SQL vulnerabilities that I wish I could have fixed, but the real problem was that the code depended on a particular version of PHP4 (which has been deprecated for forever and is chock-full of security holes) and a particular version of the database driver (which didn't work on newer OSs), which prevented us upgrading to a newer version of the database... the point is, relying on something staying the same doesn't always work. Let's just say I'm glad I don't work there anymore.
    – user203448
    Commented Oct 18, 2016 at 16:43
  • 1
    @FighterJet That's actually a really good example of what I had meant to talk about. You ended up inheriting code that only works on a particular version of PHP4 and a driver that only runs on a particular OS... so you can't upgrade the server. I wouldn't advocate anyone doing that, but it happens. -- a lot. FWIW, I do agree with you but I wanted my answer to foster thinking around those types of scenarios, not make a recommendation. Commented Oct 18, 2016 at 17:23
6

The core of most applications is the data. Data is forever. Code is more expendable, changeable, malleable. The data must be preserved, though. So focus on creating a really solid data model. Keep the schema and the data clean. Anticipate that a fresh application might be built on top of the same database.

Pick a database that is capable of enforcing integrity constraints. Unenforced constraints tend to be violated as time passes. Nobody notices. Make maximum use of facilities such as foreign keys, unique constraints, check constraints and possibly triggers for validation. There are some tricks to abuse indexed views to enforce cross-table uniqueness constraints.

So maybe you need to accept that the application will be rewritten at some time. If the database is clean there will be little migration work. Migrations are extremely expensive in terms of labor and defects caused.

From a technology perspective it might be a good idea to put most of the application on the server and not in a JavaScript form on the client. You'll probably be able to run the same application in the same OS instance for an extremely long time thanks to virtualization. That's not really nice but it's a guarantee the app will work 20 years from now without any expensive maintenance and hardware costs. Doing this you at least have the safe and cheap fallback of continuing to run old, working code.

Also, I find that some technology stacks are more stable than others. I'd say that .NET has the best possible backwards compatibility story currently. Microsoft is dead serious about it. Java and C/C++ are really stable as well. Python has proven that it is very unstable with the Python 3 breaking changes. JavaScript actually seems quite stable to me because breaking the web is not an option for any browser vendor. You probably should not rely on anything experimental or funky, though. ("Funky" being defined as "I know it when I see it").

  • about the .NET backwards-compatibility story: in contrast, I don't think I've seen a Java app that would ask for an older version of Java. That might change with Java 9 or beyond, but I haven't seen it happen yet.
    – eis
    Commented Oct 17, 2016 at 11:29
  • It is amazingly compatible in practice, and installing an older version side by side is not an issue. Also note that the .NET BCL is, in my estimate, 10-100x larger than Java's built-in classes.
    – usr
    Commented Oct 17, 2016 at 12:11
  • backwards compatibility means that there should be no need to install also an older version. But we digress from the original question, this is not really relevant to OP.
    – eis
    Commented Oct 17, 2016 at 12:23
0

The other answers do make sense. However, I feel the comments on the client technology are overcomplicating things. I've been working as a developer for the past 16 years. In my experience, as long as you keep your client code intuitive, you should be fine. So no "hacks" with frames / iframes, etc. Only use well-defined functions in the browsers.
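In the spirit of "only use well-defined functions", explicit feature detection is one durable pattern. The sketch below is illustrative (the function name and the `globals` parameter are assumptions); the parameter stands in for `window`, so the check does not assume it is running in a browser.

```javascript
// Illustrative sketch: detect standardized capabilities instead of assuming
// them, and fall back gracefully. Passing the global object in makes the
// logic testable outside a browser.
function pickTransport(globals) {
  if (typeof globals.fetch === "function") return "fetch";
  if (typeof globals.XMLHttpRequest === "function") return "xhr";
  return "none";
}
```

In a browser you would call `pickTransport(window)`; code written this way keeps working when an API disappears, because the fallback path is explicit rather than accidental.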

You can always use compatibility modes in browsers to keep them working.

To prove my point, only a few months ago I fixed a millennium bug in the javascript code for a customer, who has been running their web app for 17 years. Still works on recent machines, recent database, recent operating system.
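The answer doesn't say what that millennium bug was, but the classic JavaScript version of it is a reasonable guess: `Date.prototype.getYear()` returns the year minus 1900, so code that pasted "19" in front of it broke after 1999. `getFullYear()` (standardized since ES3) is the durable choice.

```javascript
// A guess at the classic JavaScript millennium bug: getYear() returns the
// year minus 1900, so string concatenation produced nonsense after 1999.
var d = new Date(2016, 9, 13);     // 13 Oct 2016
var buggy = "19" + d.getYear();    // "19116", because getYear() is 116
var fixed = d.getFullYear();       // 2016: the fix is a one-word change
```

This is exactly the kind of 17-year-old breakage that survives in otherwise-working code: the language kept the deprecated function for compatibility, so the app kept running, just wrongly.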

Conclusion: keep it simple and clean and you should be fine.

  • 1
    Frames and iframes are very well defined in the HTML spec. What makes them unsuitable? Commented Oct 14, 2016 at 11:55
  • 3
    @curiousdannii: It is not so much the use of iframes (frames are no longer supported in HTML5), as the use of frames and iframes to load content asynchronously through scripting, etc.. It can work great right now, but it will always be subject to security changes. Commented Oct 14, 2016 at 12:30
-2

A few axioms:

  • Truth survives. In this context, it would be algorithms and data models - that which truthfully represents the "what" and the "how" of your problem space. Although, there is always the potential for refinement and improvement, or an evolution of the problem itself.
  • Languages evolve. This is as true for computer languages as it is for natural languages.
  • All technology is vulnerable to obsolescence. It just may take longer for some technologies than others.

The most stable technologies and standards (those least vulnerable to obsolescence) tend to be those which are non-proprietary and have been most widely adopted. The wider the adoption, the greater the inertia against almost any form of change. Proprietary "standards" are always vulnerable to the fortunes and whims of their owner and competitive forces.

Twenty years is a very long time in the computer industry. Five years is a more realistic target. In five years' time, the whole problem your application is meant to solve could be completely redefined.

A few examples to illustrate:

C and C++ have been around for a long time. They have implementations on just about every platform. C++ continues to evolve, but "universal" features (those available on all platforms) are pretty much guaranteed to never be deprecated.

Flash almost became a universal standard, but it is proprietary. Corporate decisions to not support it on popular mobile platforms have basically doomed it everywhere - if you're authoring for the web, you want your content available on all platforms; you don't want to miss the major market mobile has become.

WinTel (Windows/x86), despite being proprietary to Microsoft and Intel, having started out on a less-than-optimal platform (the 16-bit-internal / 8-bit-external 8088, versus the contemporaneous Apple Macintosh's 32-bit-internal / 16-bit-external 68000), and despite erosion to Apple in the consumer market, remains a de facto choice for business platforms. In all that time (25 years), a commitment to backward compatibility has both hobbled future development and inspired considerable confidence that what worked on the old box will still work on the new one.

Final thoughts

JavaScript might not be the best choice for implementing business logic. For reasons of data integrity and security, business logic should be performed on the server, so client-side JavaScript should be limited to UI behavior. Even on the server, JavaScript might not be the best choice. Although easier to work with than other stacks (Java or C#) for small projects, it lacks the formality which can help you write better, more organized solutions when things get more complex.
