US Federal Government and Big Data Storage: A Big Challenge

US federal agencies are a prime example of an ever-growing source of data. One of the largest purchasers of outsourced IT, the federal government proposed an IT budget of $78.9 billion for fiscal year 2013.

Federal IT budgets have been more or less flat for the last four years, but big data storage has seen a substantial increase. According to one report, federal data storage requirements have been growing 30 to 40 percent a year over the same period.

Several agencies’ data centers are saturated; together they currently hold nearly 1.61 petabytes of data, an amount expected to double over the next two years.
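As a back-of-the-envelope check, the cited growth rates are consistent with the doubling projection. The sketch below compounds the reported 1.61 PB holdings at the reported 30 and 40 percent annual rates; the two-year horizon and the calculation itself are illustrative, not from any agency model.

```python
# Back-of-the-envelope projection of federal data storage growth.
# The starting figure (~1.61 PB) and annual growth rates (30-40%)
# are the numbers cited above; the rest is illustrative.

START_PB = 1.61        # current holdings, in petabytes
YEARS = 2              # projection horizon, in years

for rate in (0.30, 0.40):
    projected = START_PB * (1 + rate) ** YEARS
    print(f"{rate:.0%}/year -> {projected:.2f} PB after {YEARS} years")
```

At 40 percent annual growth the total roughly doubles in two years, which matches the projection above; even the low end of 30 percent lands near 2.7 PB.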

Most federal agencies’ data storage models are based on a tiered storage architecture that includes physical media (NAS, SAN, and others), along with policies and services to govern the storage environment.
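A tiered architecture like this is often expressed as a simple policy map that routes data to media by access pattern. The sketch below is a minimal illustration only; the tier names, media assignments, and age thresholds are assumptions for the example, not any agency’s actual configuration.

```python
# Minimal sketch of a tiered-storage policy: data is assigned to a
# tier based on how recently it was accessed. Tier names, media, and
# age thresholds are illustrative assumptions only.

TIERS = [
    # (max days since last access, tier name, typical media)
    (7,    "hot",     "SSD"),
    (90,   "warm",    "SAN"),
    (365,  "cold",    "NAS"),
    (None, "archive", "tape"),   # None = catch-all final tier
]

def assign_tier(days_since_access: int) -> str:
    """Return the storage tier for data last touched N days ago."""
    for max_age, name, media in TIERS:
        if max_age is None or days_since_access <= max_age:
            return f"{name} ({media})"
    return "archive (tape)"

print(assign_tier(2))     # recently accessed data lands on fast media
print(assign_tier(400))   # stale data falls through to archival media
```

In practice the governing policies also cover replication, retention, and migration between tiers, but the core idea is this kind of rule-driven placement.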

According to a report by Deltek, Inc., US federal spending on cloud computing will grow to $3.2B in 2017 from $724M in 2012. Cost-cutting initiatives such as “Cloud First” and data center consolidation have driven greater use of NAS, SANs, and magnetic hard disks, while solid-state drives are reserved for workloads that require very fast, real-time access.

As the federal government aims to make use of the enormous volume of digital data generated daily, the Obama Administration announced a “Big Data Research and Development Initiative,” supported by more than $200 million in initial commitments.

Through the new big data initiative and the associated investments, the Obama Administration is committed to greatly improving the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data.

“Government has a gold mine of data at its fingertips. The key is turning that data into high-quality information that can increase efficiencies and inform decisions. Agencies need to look at big data solutions that can help them efficiently process, analyze, manage, and access data, enabling them to more effectively execute their missions,” said Mark Weber, president of U.S. Public Sector for NetApp.

A big bet on big data storage will be the next major challenge for the Pentagon. The agency will invest about $250 million annually (including $60 million in new research projects) in initiatives aimed at using large amounts of data in new ways, and at uniting the recording and interpretation of data to create truly autonomous systems that can maneuver and make decisions independently.
