Over my years as a software developer I have on occasion stopped and asked myself, "What exactly is the point?" I realize that on the surface that sounds a bit depressing, but you have to recognize that over the last forty years virtually every type of program you can think of has already been written. Sure, there are a handful of exceptions, but generally most of us aren't doing anything particularly new.
If you spend real time thinking about this, you'll end up coming up with all sorts of related questions. For instance, how did we manage to do so much with software in the past when we had so little in the way of hardware resources available to us? We had a pittance of RAM, storage, and processor power, yet despite all this, I was still able to cruise the internet on a 486/33 with Windows 3.11 back in the day. I was still able to type up papers using word processors that ran in MS-DOS. I was still able to email people. I was still able to play awesome and engaging games. I was still able to communicate with people in far-off places.
Decades later, things certainly look nicer. We definitely have a lot more options in terms of which software packages we can choose from to complete a particular task. Of course as a retro gamer, I'm not even going to really mention video games. If you are at all aesthetically concerned, then the games of yesteryear are probably not going to be your cup of tea regardless of how fun they may or may not be. Beyond that, I can't help but wonder: Are we making effective use of the resources we have available to us today? Do today's software packages represent more of a revolutionary advancement when compared to yesterday's packages, or an evolutionary one?
After thinking about this a great deal, I believe the answer is firmly rooted on the evolutionary side of the fence. In this industry we spend a lot of time crowing about innovation, but the honest truth is that most of us aren't innovating at all. We are largely spending most of our time writing code that attempts to automate a manual process or provide a tool that enables a user in some specific way. So if we follow that thought to its logical conclusion, the question arises: Should we just keep using old software? That's where things get a bit tricky. In theory I believe we could use a lot of old software packages and suffer minimal drawbacks. But there are many exceptions to that general rule of thumb. There may even be enough exceptions to render the exercise worthless.
So here's an example. I have a client whose sister company uses an old DOS-based dBASE application to handle most of their accounting. Interestingly enough, it works pretty well for them. That's not to say that it doesn't have its drawbacks, but most of those drawbacks can be addressed by using third-party software designed specifically to address them. The biggest drawback by far involves printing reports. Printing in the DOS days was a far different beast than it is now. Not to mention that modern 64-bit x86 hardware booting a 64-bit operating system cannot run old 16-bit code directly, since the processor's 64-bit long mode drops support for the 16-bit execution environments DOS programs rely on. Nevertheless, it is possible to overcome these hurdles using specifically targeted versions of DOSBox and utilities like DOSPRN.
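To make the approach concrete, here is a minimal sketch of what a DOSBox configuration for wrapping a legacy app like this can look like. The folder and program names below are hypothetical placeholders, not the client's actual setup, and the exact options vary between DOSBox builds:

```ini
; dbapp.conf -- hypothetical DOSBox launcher for a legacy 16-bit application.
; All paths and executable names here are illustrative placeholders.

[autoexec]
; Map a host folder containing the 16-bit application to drive C:
mount c "C:\legacy\dbapp"
c:
; Launch the application; when it quits, exit DOSBox too
DBAPP.EXE
exit
```

Launched from a Windows shortcut pointing at this config, the old application opens and closes like any other program. For printing, forks such as DOSBox-X can emulate a parallel port and capture what the application sends to LPT1, and a utility like DOSPRN can then translate that captured DOS print output into something a modern Windows printer understands; the specifics depend on the build and utility you choose.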
So obviously I made it so that they could keep using this application. It works entirely from within Windows, and it has spared them from spending money on software upgrades, money that in theory could be better spent elsewhere. Could this success be extended to other use cases? Almost certainly yes, but in my experience there are a few factors that will really muddy the waters.
The primary factor you need to consider is whether or not the application will be exposing any sort of service over the network. If it is and it hasn't been updated in a while, then it is almost certainly a liability that you should go out of your way to avoid. The second thing to consider is interoperability. Sure, I could use a classic word processing application like WordPerfect to write up my resume, but if nobody can read the files it produces, would it actually be worth it? Probably not. The third factor is ease of use and maintenance. While it may in fact be possible to use a legacy piece of software to accomplish a specific task, it isn't really worth it if enabling it to do so requires a great deal of maintenance or effort on your part. In addition, if the piece of software was questionable and buggy to begin with, it probably wouldn't be worth the effort to reuse.
That's an interesting word though: Reuse. It reminds me of recycling. The idea of recycling legacy software really appeals to me on some level. I wonder though: How will we feel about the software we are writing today when we look back 20 years from now? Will it still appeal to some? Will anybody even know about it? Those are all great questions. While I continue to mull them over, I believe I'll keep my focus on trying to write more efficient software that works reliably. If I can accomplish that in the short term, then perhaps somebody down the road will have the opportunity to appreciate and ultimately recycle my work in the long term.
As a software developer, at the end of the day, I can't think of a higher compliment than that.