Estimating that as much as $20 billion of the $80 billion federal IT budget is spent essentially on preserving old and aging technologies, Va. Rep. Gerry Connolly joked that the upside is that “the Chinese don’t know how to hack into those legacy systems.”
But the topic at hand during Tuesday afternoon’s hearing of the House Committee on Oversight and Government Reform was no joke: the fate of FDCCI, or the Federal Data Center Consolidation Initiative.
“We have something like 3,100 data centers in the federal government. That’s an astounding number,” Connolly said. “If we’re not seized with a sense of urgency about that mission, fed agencies are going to have to do less with less and that won’t service the public very well.”
From 1998 to 2009, the federal government went from 432 to more than 1,100 data centers, according to Bernard Mazer, CIO of the Department of the Interior. In 2010, OMB launched FDCCI to reverse that trend and rein in the associated costs.
[See also: Will it actually cost more to close datacenters than fed agencies ultimately save?]
“Has OMB dropped the ball?” asked Fla. Rep. John Mica, chairman of the committee. Neither OMB nor GSA was in attendance at the hearing, and Mica said he intends to hold a subsequent session that will involve them, whether agency representatives come willingly or by other means.
“OMB has actually set the goals well, [but] they’re not driving it to closure,” answered David Powner, director of information technology management issues at the Government Accountability Office. He added that GAO’s work looking into OMB, GSA and the Department of the Interior found that they are doing well, but “all three organizations need to do more from a leadership perspective.”
What’s more, federal agency CIOs need to be empowered for FDCCI to succeed, according to Steve O’Keeffe, founder of MeriTalk.
“It doesn’t seem there’s real support for CIOs to stand up against mission owners,” O’Keeffe said, explaining that, given what happened to Richard Spires at the Department of Homeland Security, other CIOs are unlikely to feel particularly empowered.
And FDCCI matters, Mica added, because “we’re talking about saving billions and operating more efficiently.”
O’Keeffe outlined five recommendations during the hearing. From the written testimony he submitted:
- Don’t Hide – Gerrymandering FDCCI to make it look like a success by combining it with PortfolioStat is not the way to go. Let’s put the cards on the table – set realistic goals in the open and publish the real status on success and failure. OMB has a data center TCO model. In this era of open government, why is this model kept secret? Publish the model so that government and industry understand how we’re keeping score.
- No Money – Recognize that there is no new money for data center optimization. Empower the CIO to rationalize applications – it’s the only way to fund the path to make things better. Agency leadership and Congress need to support the CIO in the inevitable clashes with the mission owners and components.
- Application Rationalization – Touched on this above, but it bears another mention. We do not need 600 HR systems. Prioritization is the key to changing the value and financial equation.
- Marry IT and Facilities – One data center executive needs to understand and own the budget for total data center cost. GSA owns most of the facilities and pays the electricity bill. Why not publish the energy bills for each data center? How do we pick which data centers to close if we don’t know what they cost to operate? Interesting to note that, according to Uptime Institute, 12 percent of data center operational cost is electricity.
- Public-Private Partnership – We need more and deeper public-private collaboration. Why don’t we recognize that government is not the only organization that operates data centers? There are existing definitions of what is a data center – why not embrace these rather than keep creating our own in government? Why not utilize industry standards, data center efficiency measurements – like Power Usage Effectiveness (PUE)? Leaders like Jake Wooley at the Department of Energy can help data center leads all over government spark new energy efficiencies. How long did it take NASDAQ to do its data center optimization? What steps did it take? How much money did it save? Is the mission exactly the same? No. But, can we learn a huge amount from industry? Absolutely yes.
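For readers unfamiliar with the metric O’Keeffe cites: PUE is the ratio of a facility’s total energy use to the energy actually delivered to its IT equipment, so a perfectly efficient data center would score 1.0. To take an illustrative example (these figures are hypothetical, not from the hearing), a facility drawing 2 megawatts overall while its servers consume 1.25 megawatts has a PUE of 2 ÷ 1.25 = 1.6, meaning every watt of computing carries another 0.6 watts of cooling, power distribution and other overhead.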
Indeed, Teresa Carlson, vice president of worldwide public sector at Amazon Web Services, pointed to the Department of Health and Human Services as a first mover toward cloud services, citing its work on FEMA.gov, Ready.gov and the CDC’s BioSense.
[See also: 5 tenets of OMB's 'open and machine readable' federal data policy.]
“They’re looking for ways to provide citizens services that are effective, reduce costs, and are able to scale when they need it,” Carlson added.
It is difficult, however, for agencies to consolidate data centers and move to cloud services when only two cloud services have been certified under the FedRAMP program – and, as Rep. Connolly noted, Congress is unlikely to continue funding the program if cost savings are not a significant result.
“Federal IT reform is like a bad reality TV show,” O’Keeffe cracked. “There’s no budget, the actors are powerless, but somehow we keep watching.”