Martin Gubov, Infrastructure Business Unit Manager
We’re in a world where we still rely heavily on up-to-date hardware. The Internet of Things alone, we are regularly told, will be a vast tapestry of devices, from toasters and kiosks to business intelligence tools. At the same time, there’s a mountain of server and networking resource that keeps on growing.
Equipment trackers
Technavio recently forecast that the global data centre server market will grow at a compound annual growth rate of 5.73 per cent between now and 2019, while IDC says the world acquired 1,210 petabytes of new storage capacity during the second quarter of this year alone – up 44.8 per cent on the same period a year ago.
Given that a petabyte is 1,000 terabytes, it’s easy to get lost in the jungle of figures here. More and more handhelds, laptops, clusters, blades, routers and everything in between seem to come out of the factories at an ever-increasing rate.
So does a business still need the latest and greatest kit? Even five years ago, the smart CIO would have said yes. For most of the history of commercial data processing, businesses have known that competitive advantage lies in having, if not always the very best tools on the desk or in the glasshouse, then at least ‘par value’. You might let slip a generation or two of Windows to save on licensing costs, but let’s face it, you couldn’t let 286 machines soldier on in a 486 world. And anyone who has ever worked in the City will know that financial services companies would never tolerate working with anything but the fastest, latest and greatest workstations.
Today, though, it’s a lot less clear. While some parts of the market will probably still want the latest out of the catalogue – financial services firms, some parts of government (especially defence and intelligence), R&D teams and university researchers – the reality is that most typical businesses can save on OPEX by using more standardised kit.
That’s because the rise of the cloud means much of the number-crunching gets done on efficiently run server farms that operate at economies of scale, so the functionality you need – the software – can simply be tapped into remotely.
But hardware, as we have seen, isn’t going away. You still need actual electronics around. This is where the combination of the cloud for software and renting or leasing the actual ‘iron’ is starting to help customers, say the experts.
That’s right – we have a very healthy (and useful) market now around the outsourcing of hardware. By working with a supplier that knows what it’s doing, you can contract for ‘good-enough’ kit that you know is well supported and maintained… but which you can switch in and out when it suits you.
Many customers are waking up to the fact that outsourcing hardware can be a great way to minimise cost – the purchase price, yes, but also all the internal overhead of keeping your own in-house engineers – while still getting machines that are easily powerful enough for your specific current purpose.
If that means you don’t get to keep the IT version of a Ferrari in the garage… sorry.
But it does mean you can have a fleet of zippy electric cars capable of doing all you want – at a lot less expense!
Is it time you applied the same logic to your hardware assets as you have to your software licences?
Seems so, doesn’t it?
This article was originally published at http://www.pcr-online.biz/.