Implementing the Defense Department Cloud Computing Strategy Poses New Challenges

SIGNAL Magazine
December 2012

By Paul A. Strassmann


A few staff experts can formulate new strategies in a short time. Over the years, the U.S. Defense Department has accumulated a large collection of long-range planning documents. However, none of the plans was ever fully implemented, as new administrations kept changing priorities.

The just-announced Defense Department Cloud Computing Strategy presents a long list of radically new directions. Ultimately, it will take hundreds of thousands of person-years to accomplish what has just been outlined. Several points stand out.

In one, individual programs would not design and operate their own infrastructures to deliver computer services. Users would develop only applications. This approach will require tearing apart more than 3,000 existing programs. A pooled environment will be supported by cloud computing that depends on different processing, storage and communications technologies. Small application codes then can be managed separately, relying exclusively on standard interfaces. The challenge will be managing more than 15 years of legacy software, valued at about half a trillion dollars, in completely different configurations. Making such changes will require huge reductions in an infrastructure that currently costs $19 billion per year.
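As a minimal sketch of what "applications only, against standard interfaces" could look like in practice, the Python fragment below codes a hypothetical application against an abstract storage interface rather than a program-specific infrastructure. The interface and class names are illustrative assumptions, not drawn from the strategy itself.

    from abc import ABC, abstractmethod

    class StorageInterface(ABC):
        """Hypothetical standard interface an application would code against."""
        @abstractmethod
        def put(self, key: str, value: bytes) -> None: ...
        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class PooledCloudStorage(StorageInterface):
        """Stand-in for the shared, centrally managed infrastructure."""
        def __init__(self):
            self._data = {}
        def put(self, key, value):
            self._data[key] = value
        def get(self, key):
            return self._data[key]

    def readiness_report(store: StorageInterface) -> bytes:
        # The application knows only the interface, never the infrastructure behind it.
        store.put("report/latest", b"unit readiness summary")
        return store.get("report/latest")

    print(readiness_report(PooledCloudStorage()))

Because the application depends only on the interface, the pooled infrastructure behind it can be replaced or consolidated without rewriting the application code.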

Another point is that cloud computing will reduce the costs of the existing computing infrastructure. The Defense Department will have to virtualize close to 100,000 servers and integrate that construct with 10,000 communication links. The department will end up with a small number of enterprise-level pooled and centrally managed operations. This is a short-term multibillion-dollar effort that can be financed only from rapid savings, because no new funding will be available.

The strategy also would enable components to rapidly construct and then deploy applications that meet mission needs in a matter of days. A small number of shared universal platforms will offer the capacity to support functionally oriented code generated from thousands of software templates. The Defense Department will need to prescribe the detailed technique for creating code that will be portable across global hardware platforms. A large share of existing contractor services will become obsolete.
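One way to picture applications generated from shared templates is a declarative descriptor that a universal platform could deploy on any hardware. The Python sketch below is a hypothetical illustration under that assumption; it is not the department's prescribed technique, and the field names are invented for the example.

    import json

    def application_from_template(name: str, runtime: str, replicas: int) -> str:
        """Fill a hypothetical application template with mission-specific values."""
        descriptor = {
            "name": name,
            "runtime": runtime,          # a standard, platform-provided runtime
            "replicas": replicas,        # scaling handled by the shared platform
            "interfaces": ["messaging", "storage", "identity"],  # standard services only
        }
        return json.dumps(descriptor, indent=2)

    # A component could stand up a new application in days by filling a template.
    print(application_from_template("logistics-tracker", "python3", replicas=3))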

Also, global data and cloud services will be available regardless of access point or the device being used. The Defense Department will operate with software-defined standard networks so that the protocols of any individual computing device — inclusive of sensor inputs — will remain isolated by means of a layer of code. The department will dictate interface standards. Much of the Global Information Grid, which is structured to operate with stand-alone browsers and switches, will be replaced.
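The "layer of code" that isolates device protocols is, in effect, an adapter. The sketch below shows one hypothetical form such a layer could take, translating a device's native output into an assumed standard message format; the class names and format are illustrative only.

    class StandardMessage:
        """Hypothetical department-wide message format."""
        def __init__(self, source: str, payload: dict):
            self.source, self.payload = source, payload

    class SensorAdapter:
        """Isolation layer: translates a device's native protocol into the standard form."""
        def __init__(self, device_id: str):
            self.device_id = device_id
        def translate(self, raw: bytes) -> StandardMessage:
            # Native protocol details stay behind this layer of code.
            reading = float(raw.decode())
            return StandardMessage(self.device_id, {"reading": reading})

    msg = SensorAdapter("sensor-42").translate(b"98.6")
    print(msg.source, msg.payload)

Applications would see only StandardMessage objects, so individual devices and sensors could change without touching the services that consume their data.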

The Defense Department chief information officer (CIO) would be responsible for the enterprise architecture that will define how the defense cloud is designed, operated and consumed. Program managers, contractors and service providers will be audited for compliance with comprehensive directives, which define how every defense program is constructed to achieve real-time interoperability. The new directives will require a massive reorientation of existing practices. The power of the department CIO will grow.

The Defense Department also will implement enterprise file storage to enable global access to data by any authorized user, from anywhere and from any device. The total separation of data storage from data processing will require the creation of new datasets for combined access that will allow cross-system data search and exploitation. Some projects have more than $30 billion of annual spending, and they will have to migrate exabytes of files into a new setup. How such a massive conversion can be accomplished is not clear.
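A small sketch can make the storage-processing separation concrete: data lives in a shared enterprise store, and any authorized user reaches it through an access check rather than through a program-owned server. The store, paths and roles below are hypothetical illustrations.

    class EnterpriseFileStore:
        """Hypothetical shared store: data lives apart from any one program's servers."""
        def __init__(self):
            self._files = {"ops/plan.txt": b"..."}
            self._acl = {"ops/plan.txt": {"analyst", "commander"}}
        def read(self, path: str, role: str) -> bytes:
            if role not in self._acl.get(path, set()):
                raise PermissionError(f"{role} may not read {path}")
            return self._files[path]

    store = EnterpriseFileStore()
    print(store.read("ops/plan.txt", role="analyst"))   # any device, any location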

The strategy has the Defense Department taking a data-centric approach to cloud services to ensure enterprise real-time interoperability. This will require a large increase in the quality, availability, accessibility and usability of data, at levels that approach 100 percent uptime as well as sub-second latency. Standards will dictate machine-readable formats from Web services and sensor inputs. Enterprisewide shared metadata tagging will apply, which will be a huge effort to eliminate redundant data definitions.
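As a rough illustration of shared metadata tagging, the sketch below attaches a few enterprise-wide tags to a dataset record in a machine-readable format. The tag names and values are assumptions made for the example, not the department's actual metadata schema.

    import json

    def tag_dataset(dataset_id: str, values: dict) -> str:
        """Attach hypothetical enterprise-wide metadata tags to a dataset record."""
        record = {
            "dataset": dataset_id,
            "format": "machine-readable/json",   # mandated machine-readable format
            "tags": {
                "classification": values.get("classification", "UNCLASSIFIED"),
                "source": values.get("source"),
                "updated": values.get("updated"),
            },
        }
        return json.dumps(record, indent=2)

    print(tag_dataset("fuel-consumption-2012",
                      {"source": "logistics", "updated": "2012-12-01"}))

Once every system tags data against the same shared vocabulary, cross-system search and exploitation become a matter of querying the tags rather than reconciling redundant definitions.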

The department would apply a pay-as-you-go pricing model for services on demand rather than procuring entire solutions. Utility metering of computing and communication requires a complete overhaul of the operating software for every application as well as the construction of new administrative control systems. A standard cost accounting methodology would be dictated by the Office of the Secretary of Defense.
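A minimal sketch of utility metering, under assumed rates, shows how a pay-as-you-go charge could be computed from measured consumption rather than from the purchase of an entire solution. The rate table and usage figures are invented for illustration; the real cost methodology would come from the Office of the Secretary of Defense.

    # Hypothetical utility rates per unit of consumption.
    RATES = {"cpu_hours": 0.05, "storage_gb_months": 0.02, "network_gb": 0.01}

    def monthly_charge(usage: dict) -> float:
        """Price metered consumption instead of procuring an entire solution."""
        return round(sum(RATES[item] * quantity for item, quantity in usage.items()), 2)

    print(monthly_charge({"cpu_hours": 1200, "storage_gb_months": 500, "network_gb": 300}))
    # 0.05*1200 + 0.02*500 + 0.01*300 = 73.0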

Defensewide computing will not be limited to components but instead will be open to others, such as mission partners, commercial vendors and users from throughout the federal government. Such an arrangement will require the acceptance of open source standards for customer identification, transmission assurance and resource architecture. This necessitates an overhaul of the existing protocols that have been installed over decades.

All told, this latest cloud computing strategy mandates implementation tasks for the new way of organizing Defense Department computing. It not only spells out what needs to be done but also suggests the ideal timing for its execution. Without a rapid extraction of billions of dollars, primarily from the current infrastructure, progress will be throttled. From now on, the department will have to make progress at an unprecedented pace that is measured in days and weeks, not in years or decades.

Evolution will be paced by the ability of the Defense Department CIO to steer the redeployments from a fractured infrastructure into a pooled enterprise. Unfortunately, the existing defense CIO charter is not yet sufficient for controlling the allocation of program funding.

Paul A. Strassmann is the distinguished professor of information sciences at George Mason University. The views expressed are his own and not necessarily those of SIGNAL Magazine.