If you think competition in the cloud computing market is getting crazy, then you’d better brace yourself, because it’s set to get crazier. The White House has outlined its approach to cloud computing for the next year that will see dozens of legacy systems go, as well as a new set of uniform security requirements that contractors will have to meet.
This has been in the works for a long time, and many of the bigger cloud computing bruisers have been waiting to see what the administration would produce before launching a full-scale assault on Washington.
But it’s not just the White House or all the bureaucrats up there in Washington that are being targeted; these guidelines affect the entire federal IT infrastructure across all federal agencies, and if you can’t march in step, you’re going to be out of the running for federal contracts.
Cloud Computing in Washington
Microsoft, Oracle and IBM have all been making goo-goo eyes at federal government IT for a while, and they were joined last week by HP, which signed an agreement with Microsoft to provide applications like Office 365 on HP private clouds.
There are a lot of smaller deals happening, too, that are bringing smaller companies into the federal orbit through alliances with larger players such as HP, which also signed a new cloud deal with Box last week.
As a result, the new regulations are going to trickle right down through the cloud space and become a benchmark for security and cloud development, in much the way that DoD 5015.2 has been the standard in records management.
So what exactly is Washington talking about? In a White House blog post by Federal Chief Information Officer Steven VanRoekel, we get some idea; we also get an idea of what the Feds think about the cloud and what exactly they hope to get out of it.
Federal Cloud Computing Strategy
In fact, in its first two paragraphs the post outlines what the government wants out of the cloud, and what it believes is wrong with current legacy systems — systems that, by the way, are going to get dumped as cloud use evolves, but more on that later.
VanRoekel begins:
“In a lean fiscal environment, organizations look for ways to take existing resources and use the latest advances and tools to do the seemingly impossible: improve and expand services while cutting costs. It is no different with the Federal Government.”
In fact, his “more with less” theme — more cloud functionality for fewer taxpayer dollars — just about sums up the whole federal policy on IT for the future.
He says that by being more rigorous about IT spending this year, the government saved taxpayers around US$ 1 billion through TechStat accountability sessions, which gave agencies the power to terminate failing projects at the agency level.
And this kind of “efficiency” is to be carried into the cloud computing arena.
Data centers also took a hammering this year, and are likely to get more of the same in the coming year:
“We set our sights on data centers — the energy hogs of federal real estate that spread rapidly (and inefficiently) over the last decade. At the end of 2012, the Federal Government will have closed over 472 data centers. And this year we expanded the initiative to include data centers of any size and plan to close nearly 1,000 data centers by the end of 2015.”
The Shared First scheme also implemented some ruthless cuts over the past year and will be looking for more scalps in the coming year.
Instead of operating like hundreds of small businesses, federal agencies will be required to buy services that many agencies can share, so the government operates more like a large enterprise than an SMB.
Over the past year, this resulted in 40 services moving to the cloud out of 79 identified, with the elimination of 50 legacy systems, which can’t be good for the big systems vendors like IBM and Oracle.
And it’s going to continue. This is not random cost-cutting the way governments do in many other areas; there is a method to this:
“With the ability to expand capacity at a moment’s notice without having to procure new servers, add new data centers, and hire new staff, the cloud is key to the Federal government’s ability to be flexible as demands change.”
To make this feasible for government departments, security has to be addressed, and VanRoekel does just that.
Cloud Computing, Government Security
Leaving aside the broader problems of data security, the lack of a consistent security policy across agencies has meant that each agency must assess the security of its own cloud IT procurements.
On average, VanRoekel says, it takes six to 18 months and countless person-hours to properly assess and authorize the security of a system before an agency grants authority to move forward on a transition to the cloud.
Handling each of these transitions separately wastes millions of taxpayer dollars. Enter the new Federal Risk and Authorization Management Program (FedRAMP), which will fundamentally change the way cloud services are secured and procured within the Federal Government.
Developed over the past two years, FedRAMP received input from the CIO Council and bodies like the Information Security and Identity Management Committee (ISIMC), as well as a wide range of academic, private-sector and public-sector IT experts.
He describes the approach they developed as a “do once, use many times” framework that will cut the need to conduct security analysis for each and every project.
Under FedRAMP, the government expects to save 30-40% of these assessment costs whenever it adopts a solution that has already been through the program.
The program will be rolled out in phases and become operational in six months, according to the Office of Management and Budget.
Source: CMSWire