Ten Principles of Good Design Redux | Part 7
Note: This post is technically out-of-order, but if modern microprocessors can do it then so can I.
I very recently designed a system to replace a “legacy” application that has been running for over 25 years. I have no doubt that this is not unusual, and that there are millions, if not billions, of lines of code still running that are well into their twenties. One would imagine that we learned our lesson with the whole Y2K nastiness, or simply by observing just how long some code manages to stay running.
Unfortunately I have seen time and time again how ostensibly competent Software Architects tacitly design software systems for a limited lifespan. And it is the clients of these architects who have to bear the costs, since by the time the software starts to display the symptoms of its accelerated decrepitude they are usually heavily locked into it, and have to perform significant surgery to make critically-needed changes. Typically these changes add significant technical debt to the system and make subsequent changes harder still.
This can all be avoided if one designs the software with the assumption that it might live for a quarter of a century or longer. Of course you may be given a specific lifespan as a formal requirement. It is up to you to decide just how much salt to take those requirements with, given your understanding of the client or domain. The case might also be made that designing relatively short-lived software is a way to guarantee your own job security. That may hold true in some cases, but I would assert that if you are so morally flexible then you should be in politics rather than software development.
This comment was made about this post on Facebook:
ReplyDelete"Agree there is often lack of skills, training and integrity driving bad architecture and implementation, but a lot of the time it is not that sinister - Time pressure to get something out the door by a deadline takes precedent over thinking about impact 25 years down the line. It's always about the triangle -Time, Resources and Features :)"
It was not "time pressure to get something out the door by a deadline" that caused developers to choose to use only two bytes to represent the year portion of a date. It was driven by economics, and naive economics in my opinion. The costs of memory and storage were relatively astronomical in the three decades preceding Y2K; engineers were driven to reduce the memory and storage footprint wherever they could. A lot of very smart software engineers gave this a lot of thought, and they nearly all failed to see the future defect that they were introducing into their systems. If those same software engineers had considered the implications of that design choice over the entire lifetime of the system, they might have made a more appropriate choice. One has to wonder how much money was saved on hardware costs because of this optimization, and how that compares to the over $300 billion spent on fixing the Y2K issue.

The Time-Resources-Features Triangle is not a law of physics. There are no quanta for resources and features! I would assert that we should ban that particular meme; it is one of the many dangerous mythologies that permeate the industry.
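To make the cost of that choice concrete, here is a minimal sketch in C (the record layout and helper below are hypothetical, invented for illustration, not taken from any real system) of how a two-byte year field behaves once the century rolls over:

```c
#include <stdio.h>

/* Hypothetical pre-Y2K record layout: the year is stored as two ASCII
 * digits to save storage, and the century is silently assumed to be 1900. */
struct record {
    char yy[2]; /* e.g. {'6','5'} for 1965 */
};

/* Reconstruct the full year under the fateful "19xx" assumption. */
static int full_year(const struct record *r) {
    return 1900 + (r->yy[0] - '0') * 10 + (r->yy[1] - '0');
}

int main(void) {
    struct record born = {{'6', '5'}}; /* a customer born in 1965    */
    struct record now  = {{'0', '0'}}; /* the year 2000, stored "00" */

    /* 1900 - 1965 = -65: the defect lay dormant for decades,
     * then surfaced the moment the stored digits wrapped around. */
    printf("age: %d\n", full_year(&now) - full_year(&born));
    return 0;
}
```

Two bytes saved per date, multiplied across millions of records, looked like a bargain; the price only came due decades later.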