Report: Big Data Needs Broad Strategies for Public Sector Adoption

The TechAmerica Foundation's Big Data Commission has released a report, “Demystifying Big Data: A Practical Guide To Transforming The Business of Government,” which analyzes the major trends in systems for processing large data sets.

The report offers a definition of big data and its implications for business, along with practical examples of working with large volumes of data, the underlying technologies, and related policies. It characterizes big data by the volume, velocity, variety, and veracity of the data.

The report describes ten major projects that apply large-scale data analysis at U.S. federal agencies, educational institutions, and private businesses, as well as several projects carried out in Sweden, Denmark, and Canada. According to the commission's experts, the most widely used big data technologies will be those based on MapReduce (such as Hadoop), data warehousing, and OLAP (OnLine Analytical Processing). The commission's leadership and members hail from big data and business intelligence companies such as Cloudera, Splunk, and MicroStrategy; IBM, SAP, and Microsoft; Amazon Web Services, Dell, and HP; storage giants EMC and NetApp; and software services firms Grant Thornton and CSC.
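To make the MapReduce pattern mentioned above concrete, here is a minimal sketch in plain Python that simulates the map and reduce phases of a Hadoop-style count job. The dataset, field names, and agencies are invented for illustration and are not taken from the report; a real deployment would run the same logic across a cluster.

```python
# Minimal sketch of the MapReduce pattern (Hadoop-style) simulated in plain Python.
# The records and field names below are illustrative only.
from collections import defaultdict
from itertools import chain

# Toy "log" of citizen service requests: (agency, request_type)
records = [
    ("IRS", "refund_status"),
    ("IRS", "refund_status"),
    ("FEMA", "disaster_claim"),
    ("IRS", "tax_question"),
    ("FEMA", "disaster_claim"),
]

def map_phase(record):
    """Emit (key, 1) pairs, as a Hadoop mapper would for a counting job."""
    agency, request_type = record
    yield (agency, request_type), 1

def reduce_phase(mapped):
    """Group the mapped pairs by key and sum the counts, as a reducer would."""
    grouped = defaultdict(int)
    for key, value in mapped:
        grouped[key] += value
    return dict(grouped)

if __name__ == "__main__":
    mapped = chain.from_iterable(map_phase(r) for r in records)
    counts = reduce_phase(mapped)
    for (agency, request_type), n in sorted(counts.items()):
        print(f"{agency}\t{request_type}\t{n}")
```

The same split between a stateless map step and an aggregating reduce step is what lets frameworks like Hadoop distribute the work across many machines.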

Government agencies at the federal, state, and local levels are confronting the same challenge that commercial organizations have been struggling with in recent years: how best to capture and utilize the increasing amount of data coming from more sources than ever before. Making sense of that growing volume of data and using it to drive informed policies and offer better services to citizens demands new tools and new approaches from government IT staff and analysts, says Mike Olson, CEO of Cloudera and TechAmerica commissioner.

For example, data captured from smart grids can be used to better understand energy production and demand to help drive policies that let utilities and consumers utilize energy more efficiently.
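As a hedged illustration of that smart-grid example, the sketch below aggregates meter readings to find hourly demand peaks. The meter IDs and readings are invented for the sketch; a real utility pipeline would ingest readings from a streaming source rather than an in-memory list.

```python
# Illustrative aggregation of smart-meter readings to estimate hourly demand.
# Data values and meter IDs are invented for this sketch.
from collections import defaultdict

# Toy readings: (meter_id, hour_of_day, kilowatt_hours)
readings = [
    ("meter-001", 17, 2.4),
    ("meter-002", 17, 3.1),
    ("meter-001", 18, 2.9),
    ("meter-002", 18, 3.5),
    ("meter-003", 18, 1.8),
    ("meter-003", 3, 0.6),
]

def hourly_demand(rows):
    """Sum consumption per hour across all meters."""
    totals = defaultdict(float)
    for _, hour, kwh in rows:
        totals[hour] += kwh
    return dict(totals)

if __name__ == "__main__":
    demand = hourly_demand(readings)
    peak_hour = max(demand, key=demand.get)
    for hour in sorted(demand):
        print(f"{hour:02d}:00  {demand[hour]:.1f} kWh")
    print(f"Peak demand hour: {peak_hour:02d}:00")
```

Aggregations like this, run at utility scale, are the kind of analysis that can inform pricing and efficiency policies for both utilities and consumers.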

The report suggests that training the federal agencies' workforce will be key to harnessing the potential of big data. Among the commission's recommendations are agency-specific big-data projects, internship programs focused on data analytics for college students, and the creation of a leadership academy to provide big-data training and certification.

Cloud computing and big data projects are already under way as part of federal agencies' cloud adoption programs, using new technologies to reduce fraud, improve health care, respond to natural disasters, and strengthen public safety. A recent MeriTalk report, “Mission-Critical Cloud: Ready for the Heavy Lift?,” predicted that government agencies could save $16 billion annually by moving their infrastructure to cloud computing.
