Old Computers

Around 5 years ago, a trend began across the IT business. Prior to that time, businesses had largely wanted to “own” the IT on which they depended. There was a thriving industry providing them with data centres, filled with servers and storage. However, those businesses had started to question the economics of paying for a private data centre, versus renting space on a shared “cloud”. Basically, they started to trade a “guarantee” of reliability of operation for reduced cost – and particularly capital cost. But there’s a secondary cost to cloud computing and it’ll be starting to hit about now. I am out of the IT business, but I wonder how the impact is being addressed.

When I was helping companies plan to move their business systems to the cloud, there was one recurring issue: old computers. Some were very old, so old that the software they were running could not simply be moved to a newer platform. The most common obstacle to migrating those systems was years (sometimes decades) of modification of the software. Recreating all of those changes on a new system was often prohibitively expensive – so expensive as to wipe out more than a year's benefit from the lower cost of cloud. So those old computers were often left in a private data centre.

But I am not referring to those old computers; I am referring to the challenge that prevented them from being moved, and how that challenge still applies to cloud computing. Many companies could not imagine allowing the IT department to tell the business how to operate; it was always the other way around. So the IT department would dutifully adapt commercial software and write new modules (or pay contractors to do so) to add the functions required by the business – at the lowest cost they could. And it's that behaviour that made the old systems hard to update and then hard to move.

The problem for IT departments is that when they signed their cloud contracts, they agreed to allow the cloud provider to update their systems, even if the business software would no longer work. And that point is now approaching. A great many companies must now be facing a range of issues, such as the withdrawal of support for 32-bit software. Unlike the past, they can't just "dodge" the cost of updating their software and keep the old servers running. And they face a "double whammy", because this cost is coming towards them at the same time as businesses face a slowdown in the global economy. I suspect that many CIOs are having difficult conversations with their boards.

Right now, I suspect that my erstwhile colleagues are trying to help those CIOs to find solutions. One option, of course, is to move the most difficult systems out of the cloud and back to private data centres, but that is just a "sticking plaster" solution: the systems they don't move will still be ageing, and companies will face the same problem again in a further 5 years' time, because that's roughly the period of obsolescence in computer technology.

The only long-term solution is to stop the ageing, but that's easier said than done. Although the methods for doing so are reasonably well understood – and have been around for at least 20 years – applying them adds complexity and cost. And business leaders traditionally press down hard on their IT departments – even where the core of their business depends as much on the IT as on other tools. The additional complexity of the methods also affects the level of skill needed within the IT department – both to create the custom changes required by the business and then to verify that they are "robust" against future obsolescence.

If the IT budget is being squeezed, the only avenue that allows a CIO to "balance the books" with higher development costs is to restrict the amount of new development. Instead of bowing to every change request, the CIO must be prepared – through his or her team – to push back and get the business to change instead. Or at least to expose the costs to the business more openly, so that a rational business decision can be made. Adopting the same solution as other businesses, where there is no great benefit to uniqueness, means less customisation of the IT. But of course that challenges the autocracy of business unit leaders.

So I do wonder … Will companies "bite the bullet" and accept that their IT needs painful surgery? And that even the relationship between the business and its IT needs to change? Or will they continue to hope that the annoying pain just goes away?

Author: sbwheeler

Retired IT consultant.
