Server provisioning, where aspects of your IT infrastructure become a dynamic managed resource, is coming – and it brings compelling cost benefits for manufacturing organisations, writes Dom Pancucci
Manufacturing is one of many business sectors now driving rapid growth in the deployment of computer servers. IT infrastructure investment is rising steadily as companies support new, more complex business processes or refresh existing ones. Yet the tendency is for users to acquire additional server resources tactically, adding servers on a piecemeal basis.
All well and good, but such ad hoc patterns of deployment lead inevitably to a hike in the cost of IT ownership and an overall reduction in efficiency – even though that's not always immediately apparent.
A concept called 'server provisioning' is now emerging to deal with these issues. This is a pragmatic and holistic approach to delivering cost-efficient usage from computing investments. It's about raising the level of automation applied to the infrastructure, ultimately delivering IT along the lines of ITIL-based service management. Computing would then be delivered 'on demand' to users across an enterprise, and even beyond the firewall.
More importantly, server provisioning is a major step towards overall utility computing – the latest enterprise beacon to be identified by IT industry analysts. At this point, IT real estate becomes almost completely virtualised, meaning that server resources are adaptable and can be deployed according to need, as opposed to merely supporting a single application.
Developments in technology are helping to bring server provisioning from concept to reality. Relatively recent innovations, such as blade and grid computing, effectively treat IT infrastructure as a utility. Blade servers concentrate processing power by packing multiple server modules into a shared chassis linked by a high-speed backplane. Grid computing points the combined resources of several systems at a single task, often in areas like design, as the sketch below illustrates. Both make virtual ownership of variable server power entirely realistic.
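To make the grid idea concrete, here is a minimal sketch of the underlying pattern, not drawn from any vendor's product: one large job is split into chunks and farmed out to several workers in parallel. Local processes stand in for the separate machines a real grid would use, and the workload itself is purely illustrative.

```python
# Minimal sketch of the grid pattern: split one large task into chunks
# and point several workers at it. Local processes stand in for the
# separate machines a real grid would harness.
from concurrent.futures import ProcessPoolExecutor

def render_chunk(chunk):
    # Placeholder for real work, e.g. one slice of a design rendering job.
    return sum(x * x for x in chunk)

def run_on_grid(data, workers=4, chunk_size=1000):
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Each chunk runs on whichever worker is free; results are combined.
        return sum(pool.map(render_chunk, chunks))

if __name__ == "__main__":
    print(run_on_grid(list(range(10_000))))
```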
The question is how to take advantage. IT consulting and services providers have spotted an opportunity to help companies consolidate, automate and integrate largely disparate enterprise-level IT resources, and Computacenter is one provider directly addressing outsourced server provisioning. "The issue to deal with is that the explosion in IT infrastructure is based on stovepipe implementation of applications and servers, where capability and processing power have been addressed in piecemeal fashion," insists Simon Gay, consultancy practice leader at Computacenter.
Gay points out that the resulting issues and costs are legion. "It can take a mere two hours to install an Intel server, but rapid deployment can knock on to higher management costs later if that server is not properly integrated," he says by way of example. The point, for him, is that server provisioning doesn't just focus on raw compute power: the service angle pulls together, automates and outsources computing processes, procedures and people in pursuit of maximum efficiency, lowest cost, and the highest availability, robustness and flexibility for a business's mission-critical computing.
"Actually, one of the keys to server provisioning lies in nothing more complex than good governance and compliance with operating policies," observes Gay. "All IT resources can be based around a multi-layer service management model, applying across infrastructure layers and corresponding systems. SLAs [service level agreements] then define the delivery."
Achieving that, though, requires some work. Indeed, for smaller manufacturers, outsourcing the IT base is often going to be the only way to gain the kind of cost reductions and performance improvements we're talking about here, at least according to Gay.
It's a fair point, but there are cases where alternatives have to be found, one being furniture-maker Orangebox. The firm considered outsourcing its servers to a third party as a means of lowering operating costs and achieving higher efficiency. However, according to IT analyst Steve Jenkins, at the time the highly customised nature of its products – and thus the specialised nature of its product configurator application and ERP – militated against using what was then termed an application service provider (ASP).
"We considered the specialised nature of our software, in particular our configurator, and we decided not to add another layer of pain. We rely on daily information exchanges as a vital part of our business and any breakdown in communication with an outsourcer could have had a big impact," says Jenkins. And from that perspective, there's a trust issue to address.
Orangebox operates 10 servers standardised on Windows 2000 Server. Jenkins' priorities for the company are, first, to boost the in-house web environment to better exploit the web-orientated features in its core software, and then, longer term, to move to a server-centric architecture built on network attached storage (NAS) and Citrix as its way of beating the infrastructure complexity trap. The belief is that a thin client solution will help reduce the cost and hassle of infrastructure management.
Compelling approach
Nevertheless, server provisioning is a compelling way forward for many. Veritas is Computacenter's key partner in providing services around server provisioning. The two organisations have already worked together successfully to assist the development of end-user strategies to control the cost of storage and introduce methodologies to maximise usage of expensive IT.
George Homs, director for Europe, IT infrastructure and automation at Veritas, concedes that tackling servers with a similar approach is a major step up in complexity, but believes the lessons learned in the storage management challenge provide the basis for doing so.
In his view this has to be done. "Moving towards utility computing is not an overnight project and existing approaches to managing servers cannot satisfy the demands of business now and into the future, so more automation is key," insists Homs. He reckons success in server provisioning means dealing with the network, servers and storage on an inclusive basis, and he suggests starting by reducing and eventually eliminating the manual work involved in server upgrades, anti-virus patches and other mundane but infrastructure-essential tasks.
That alone can yield significant and visible cost reductions – in some cases reaching 70%. But Homs also points to hidden costs, such as business recovery and downtime caused by outages and interruptions from manual upgrades and restores, where a more holistic approach to infrastructure management could help. Consolidating and standardising on fewer platforms also cuts costs in all sorts of ways – not least by reducing the number of system configurations that have to be maintained.
As Homs points out, manufacturers have mission-critical links to maintain between back-end systems and their real-time manufacturing environments. Any disruption to these connections can severely impact the bottom line. Defined methodologies, governance, automation and flexible deployment of servers help guarantee uptime.
Utility computing, the broader goal towards which server provisioning is a step and a concept finding favour with some analysts, will also come. The Clipper Group, for example, reckons the value here will be in services delivered in multiple tiers.
Infrastructure will be consolidated logically into an effective single pool of computing, it says. A single management console will be able to see the entire state of the IT infrastructure. And a fully automated environment will be based on policies, with little manual intervention required. Cost efficiency will then peak, and IT will become a dynamic resource to be turned up and down as required.
It's a changing world. Get with it.