Utility Computing: For Technology Cost Savings, Switch to Electric Company Practices
By Mark Bregman
Every day, millions of people plug devices and appliances into electrical outlets without giving a second thought as to whether power will be available. Even with spikes in demand, electric utilities consistently deliver reliable service without interruption or exorbitant cost.
Likewise, Information Technology (IT) departments are beginning to operate this way. Utility computing gives citizens, businesses, and agencies access to data and applications at all times, in much the same way that an outlet provides instant access to electricity in the home or office. Utility computing helps IT organizations drive down operational and infrastructure costs, allowing the resulting savings to fund new projects that improve services to citizens.
Utility computing is important to the procurement officer because the process aligns human and capital resources with government priorities and decreases hardware expenditures by allowing users to pool resources and share access to software and data storage. In other words, utility computing means less infrastructure, and thereby reduced costs, through more effective use of resources. Utility computing helps both government procurement officers and IT directors better coordinate current IT resources with changing demands on the agencies.
Part of President Bush's Management Agenda includes 24 e-government initiatives intended to change dramatically the way in which government agencies interact with and serve citizens. Most agencies strive to become more budget conscious, results-focused, and service-oriented. Many have already made improvements, but such a massive undertaking will require time and long-term planning to reach the ultimate goal.
To assist individual agencies with these e-initiatives, the Office of Management and Budget (OMB) has developed the Federal Enterprise Architecture (FEA), a blueprint for how an agency, or a function that spans multiple agencies, should be structured to operate more efficiently.
Using the five FEA reference models as a guide, agencies are now focusing on aligning IT investments with the President's Management Agenda. Implementation of the e-government initiatives will be the first real instantiation of the FEA and the first step toward making federal agencies more accessible to every citizen.
To improve delivery of services to the public, agencies are focusing on three goals. The first is to provide continuous access to information through cross-agency collaboration. The second is to ensure a resilient IT infrastructure through protection and security of mission-critical data. The third is to integrate Web-based applications into the IT infrastructure, enabling collaboration while enhancing system performance. These goals will be realized by maintaining a shared pool of IT resources that can be leveraged across agencies: the model of utility computing.
The utility computing model requires that data and applications always be available, that services be delivered at specified levels, and that IT processes be automated. On the supply side (the IT department side), this means reduced cost of ownership, more efficient use of resources, and the ability to allocate costs to specific agency departments. On the user side, the idea is to create an environment in which IT resources are optimized and aligned with changing needs of each agency. Above all, utility computing is based on the principle of creative cost savings: getting the most out of what is already there.
Power Up Always-On Availability
The first requirement of utility computing is that data and applications always be available. Users should be insulated from disruptive events ranging from server failure to complete site outage. Although eliminating downtime is already an IT preoccupation, always-on computing remains a challenge. According to a report by IDC Research, when disaster strikes, an organization can expect to experience approximately three to seven days of downtime per event. Decreasing hardware costs have made it possible for many agencies to protect data with layers of redundancy, but that redundancy makes some IT structures more difficult to access.
What can IT managers do to maximize availability levels? Consider the following:
Is all mission-critical data regularly backed up?
Data in field and home offices and on desktops and laptops is unquestionably valuable, but because of costs and logistical problems, this data is not usually backed up. The utility computing model calls for centralized, automated, cost-effective backup of these resources.
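As a simple illustration, the sketch below shows the kind of scheduled, centralized backup the model calls for. The paths, source list, and 30-day retention window are assumptions for the example, not a specific product's implementation.

```python
# Minimal sketch of centralized, automated backup of field-office and
# desktop data. Paths, sources, and the retention window are illustrative
# assumptions, not an actual agency configuration.
import tarfile
from datetime import datetime, timedelta
from pathlib import Path

BACKUP_ROOT = Path("/srv/central-backup")                  # assumed central repository
SOURCES = [Path("/data/field-office"), Path("/data/desktops")]
RETENTION_DAYS = 30

def back_up(source: Path) -> Path:
    """Archive one source tree into a timestamped tarball in the central repository."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = BACKUP_ROOT / f"{source.name}-{stamp}.tar.gz"
    BACKUP_ROOT.mkdir(parents=True, exist_ok=True)
    with tarfile.open(dest, "w:gz") as archive:
        archive.add(source, arcname=source.name)
    return dest

def prune_old_backups() -> None:
    """Delete archives older than the retention window."""
    cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
    for archive in BACKUP_ROOT.glob("*.tar.gz"):
        if datetime.fromtimestamp(archive.stat().st_mtime) < cutoff:
            archive.unlink()

if __name__ == "__main__":
    for src in SOURCES:
        if src.exists():
            print("backed up", back_up(src))
    prune_old_backups()
```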
How is data backed up and recovered?
Data volumes mirrored at one or more remote sites can now be reliably replicated over Internet Protocol (IP) networks to reduce the amount of data exposed to loss and to expedite disaster recovery in the event of a national security emergency. Automated server and application provisioning, the process of providing users with access to data and technology resources, eliminates error-prone, manual recovery techniques.
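The sketch below illustrates the replication idea at the file level, assuming the remote site is reachable as a mounted path; production volume replication operates at the block level over dedicated IP links, but the principle of copying only what has changed is the same.

```python
# Minimal sketch of periodic replication to a remote site. The remote volume
# is assumed to be reachable as a mounted path; real volume replication works
# at the block level rather than file by file.
import hashlib
import shutil
from pathlib import Path

PRIMARY = Path("/data/primary")      # assumed local volume
REPLICA = Path("/mnt/remote-site")   # assumed remote mount

def digest(path: Path) -> str:
    """Content checksum used to detect changed files."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def replicate() -> None:
    """Copy only the files whose contents differ from the remote copy."""
    for src in PRIMARY.rglob("*"):
        if not src.is_file():
            continue
        dst = REPLICA / src.relative_to(PRIMARY)
        if not dst.exists() or digest(src) != digest(dst):
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)

if __name__ == "__main__":
    replicate()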
Clustering, connecting two or more computers together so that they operate as one, optimizes availability by automatically detecting application and database performance bottlenecks or server failure and moving critical services to other servers within the cluster.
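A minimal illustration of the detection-and-failover idea behind clustering follows; the host names and service port are hypothetical.

```python
# Minimal sketch of cluster-style failure detection: probe the active node's
# service port and promote a standby when it stops responding.
# Host names and the port number are hypothetical.
import socket

NODES = ["app-node-1", "app-node-2", "app-node-3"]   # assumed cluster members
SERVICE_PORT = 8080

def is_healthy(host: str, port: int = SERVICE_PORT, timeout: float = 2.0) -> bool:
    """True if the node accepts TCP connections on the service port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def elect_active(current: str) -> str:
    """Keep the current node if healthy, otherwise pick the first healthy standby."""
    if is_healthy(current):
        return current
    for node in NODES:
        if node != current and is_healthy(node):
            return node
    raise RuntimeError("no healthy node available")

if __name__ == "__main__":
    active = elect_active(NODES[0])
    print("active node:", active)
```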
Failover, a backup operation that automatically switches to a standby database, server, or network if the primary system fails or is temporarily shut down, should include not only the data and the application, but also the application state, so that a failure has minimal impact on end-users, the agency, and its mission.

Under the heading of data availability, the utility computing model also includes virtualization and pooling of storage resources, which enables IT departments to increase storage utilization rates and reduce costs. Storage virtualization also reduces administrative costs by providing centralized control of heterogeneous resources from a single graphical user interface (GUI).
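The pooling concept can be sketched in a few lines: logical volumes draw capacity from whichever devices have headroom, so utilization rises across the whole pool. Device names and sizes below are made up for illustration.

```python
# Conceptual sketch of storage pooling: a logical volume is spread across the
# physical devices with free capacity. Device names and sizes are illustrative.
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    capacity_gb: int
    used_gb: int = 0

    @property
    def free_gb(self) -> int:
        return self.capacity_gb - self.used_gb

@dataclass
class StoragePool:
    devices: list[Device] = field(default_factory=list)

    def allocate(self, volume: str, size_gb: int) -> dict[str, int]:
        """Place a logical volume across devices, largest free space first."""
        placement, remaining = {}, size_gb
        for dev in sorted(self.devices, key=lambda d: d.free_gb, reverse=True):
            if remaining == 0:
                break
            chunk = min(dev.free_gb, remaining)
            if chunk > 0:
                dev.used_gb += chunk
                placement[dev.name] = chunk
                remaining -= chunk
        if remaining:
            raise RuntimeError(f"pool exhausted; {remaining} GB of {volume} unallocated")
        return placement

if __name__ == "__main__":
    pool = StoragePool([Device("array-a", 500), Device("array-b", 300)])
    print(pool.allocate("mission-data", 600))   # the volume spans both arrays
```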
Effective data life-cycle management further reduces the costs of data availability by automatically migrating data to the most cost-effective storage medium and allowing enterprises to access information selectively for regulatory compliance.
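A simple policy of this kind might look like the following sketch, which assumes a 90-day window and two storage tiers whose paths are named here only for illustration.

```python
# Minimal sketch of policy-driven data life-cycle management: files not
# modified within the policy window migrate from fast (expensive) storage to
# a cheaper archive tier. Tier paths and the 90-day window are assumptions.
import shutil
import time
from pathlib import Path

FAST_TIER = Path("/data/fast")        # assumed primary storage
ARCHIVE_TIER = Path("/data/archive")  # assumed low-cost storage
MAX_AGE_DAYS = 90

def migrate_cold_data() -> None:
    """Move files untouched for MAX_AGE_DAYS to the archive tier."""
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for path in FAST_TIER.rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            dest = ARCHIVE_TIER / path.relative_to(FAST_TIER)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), dest)   # data stays accessible for compliance queries

if __name__ == "__main__":
    migrate_cold_data()
```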
Plug in to Optimize Performance
Utility computing boasts the ability to scale computing resources to the needs of any agency, optimize end-user response times, improve the overall quality of service, and detect and remedy causes of performance degradation, all in real time.
This process requires tools that can manage the entire application stack, from the Web browser or client application to the storage device, even in complex, heterogeneous environments. If end-user response times are lagging, IT staff can work down the stack tier by tier to pinpoint problems. By using a dashboard-type client to send alerts and reports, IT staff can receive early warning of developing problems, along with pointers to appropriate remedial action. Or, if a database is running too slowly, storage management networks can accelerate access for faster overall operation.
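The sketch below illustrates tier-by-tier timing with threshold alerts in the spirit of such a dashboard; the probes and thresholds are placeholders rather than a specific monitoring product.

```python
# Minimal sketch of tier-by-tier response-time monitoring with threshold
# alerts. The probe functions and thresholds are placeholders.
import time
from typing import Callable

# Hypothetical probes: each returns once its tier answers a trivial request.
def probe_web_tier() -> None:      time.sleep(0.05)
def probe_app_tier() -> None:      time.sleep(0.10)
def probe_database_tier() -> None: time.sleep(0.30)

TIERS: dict[str, tuple[Callable[[], None], float]] = {
    "web":      (probe_web_tier, 0.20),      # (probe, threshold in seconds)
    "app":      (probe_app_tier, 0.25),
    "database": (probe_database_tier, 0.25),
}

def check_tiers() -> None:
    """Time each tier and flag the ones exceeding their threshold."""
    for name, (probe, threshold) in TIERS.items():
        start = time.perf_counter()
        probe()
        elapsed = time.perf_counter() - start
        status = "ALERT" if elapsed > threshold else "ok"
        print(f"{name:>8}: {elapsed:.3f}s [{status}]")

if __name__ == "__main__":
    check_tiers()
```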
Networks, applications, and data continue to grow, while government IT teams work to bring down walls between once-disparate agencies. The utility computing model is non-partisan, manages the complexity of mixed-vendor technology environments, and provides performance optimization tools, which will become more significant and valuable to IT departments as agencies begin to align technical resources.
Realign Resources to Give IT Employees an Outlet for Creativity
With the continuous decline in hardware costs, people are now the greatest expense in any IT department. Handling routine tasks manually in today's evolving, heterogeneous environment is costly and unnecessary. Automating processes releases IT personnel from day-to-day tasks, allowing them to focus on more strategic activity and application development. Automation should enable IT resources to adjust to changing priorities without operator intervention.
Automation does more than free up costly staff members for more productive work, however. It also speeds up processes to improve availability, ensures that jobs are done right the first time, and saves costs through more effective management of resources. Here are several examples of how automation technology can bring an entity closer to the utility computing model:
–Virtualize and pool storage devices to drive up storage utilization and reduce hardware costs.
–Simplify storage management by automating common tasks from simple graphical interfaces.
–Virtualize and pool computing capacity. Server utilization is notoriously low, 20 percent at best, and applications' need for processing varies over time. Drawing processing resources from a shared pool of servers drives utilization up and keeps it aligned with agency needs (see the sketch after this list).
–Provision a second server anywhere in the world when a server, operating system, or application fails. Automated migration of the application makes the failover practically unnoticeable to users.
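A minimal sketch of the pooling and provisioning idea referenced above follows; server names, capacity units, and application demands are illustrative only.

```python
# Minimal sketch of drawing compute from a shared pool: applications request
# capacity, and the scheduler places them on the server with the most headroom,
# raising overall utilization. Names and capacity units are illustrative.
from dataclasses import dataclass, field

@dataclass
class Server:
    name: str
    capacity: int            # abstract capacity units
    load: int = 0

@dataclass
class ServerPool:
    servers: list[Server] = field(default_factory=list)

    def provision(self, app: str, demand: int) -> str:
        """Place an application on the least-loaded server with enough headroom."""
        best = max(self.servers, key=lambda s: s.capacity - s.load)
        if best.capacity - best.load < demand:
            raise RuntimeError(f"no capacity available for {app}")
        best.load += demand
        return best.name

    def utilization(self) -> float:
        """Fraction of total pool capacity currently in use."""
        return sum(s.load for s in self.servers) / sum(s.capacity for s in self.servers)

if __name__ == "__main__":
    pool = ServerPool([Server("srv-1", 100), Server("srv-2", 100)])
    for app, demand in [("payroll", 40), ("records", 30), ("web-portal", 50)]:
        print(app, "->", pool.provision(app, demand))
    print(f"pool utilization: {pool.utilization():.0%}")
```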
Regardless of the size or span of their IT infrastructure, agencies at the federal, state, and local levels are already reaping the benefits of utility computing. The U.S. Coast Guard and Coast Guard Reserve, for instance, implemented a centralized data protection system across the entire organization in early 2004. The system provides support and on-demand computing for 38,000 users at 1,100 sites around the world, including all U.S. Coast Guard cutters. IT directors have the ability to remotely schedule and launch data backup and restore processes across the entire enterprise.
As the first line of defense with search and rescue, port security, and maritime law-enforcement responsibilities, the Coast Guard must work within strict mission-critical data requirements set forth by the Department of Homeland Security (DHS) and operate in real-time, all the time.
Charge Forward With Utility Computing
Utility computing offers a new way of harnessing the power of information technology and mission-critical data across the government when that data is needed most. In addition to the e-government initiatives, IT directors also consider the Continuity of Operations Planning (COOP) directive when making decisions about Web-based systems and data backup and recovery.
In today's national security environment, agencies must be able to respond immediately, with or without warning, to any type of disaster, so that essential operations may continue without disruption. Utility computing speaks directly to this need by protecting and securing critical data and providing continuous availability to that data, no matter what.
Editor's Note: Mark Bregman is Chief Technology Officer for VERITAS Software, a provider of software tools that enable utility computing. For more information, visit http://www.govinfo.bz/4356-309