Electricity providers will soon face many challenges as consumers learn to use "smart meters" and upgrade their utility set-up. Providers need to be ready for an influx of information far beyond what they are currently equipped to handle.
Managers at utility companies need to confront a strange imbalance: a typical large electric utility pushes enormous amounts of energy across its network yet collects very little data about it. The result is poor visibility into the electrical grid. In some cases a power outage leaves no trace in the monitoring system; the utility learns of it only when customers call to complain that their neighborhood's power has gone out.
This scarcity of real-time data has prompted utility companies to develop "smart grid" software to address the problem. The software gives users a detailed history of their power consumption, lets them compare available power supply against their electricity demand, and puts them in control of managing their electricity use efficiently. In practice, it allows consumers to strategically shift their electricity usage between peak and off-peak hours, reducing the need to fire up costly power plants during periods of maximum load.
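The savings from shifting load can be sketched with a simple calculation. The rates and usage figures below are hypothetical, chosen only to illustrate a two-tier (peak/off-peak) tariff, not drawn from any actual utility:

```python
# Illustrative sketch with assumed numbers: how shifting flexible load
# from peak to off-peak hours lowers a household's daily bill.

PEAK_RATE = 0.30      # $/kWh during peak hours (assumed)
OFF_PEAK_RATE = 0.12  # $/kWh during off-peak hours (assumed)

def daily_cost(peak_kwh, off_peak_kwh):
    """Cost of one day's usage under a simple two-tier tariff."""
    return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE

# Before: 10 kWh used on-peak, 10 kWh off-peak.
before = daily_cost(10, 10)

# After: 6 kWh of flexible load (laundry, dishwashing, EV charging)
# is moved into off-peak hours; total consumption is unchanged.
after = daily_cost(4, 16)

print(f"before: ${before:.2f}, after: ${after:.2f}")
# → before: $4.20, after: $3.12
```

The same total consumption costs about a quarter less, and the utility avoids some of the peak demand that would otherwise require extra generation.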
This shift in technology will create an IT challenge for utility providers precisely because consumers are demanding it. Meeting that demand requires upgrading IT infrastructure; providers that fail to do so will be unprepared for the onslaught of data. Many service providers are not yet able to handle such a system, which is why the gap must be addressed before smart grids can be deployed on a widespread basis. Donald Kintner, Jr., spokesperson for the utility-sponsored Electric Power Research Institute, said the smart grid is "in its infancy."
Smart-grid developer Jeff Taft of Cisco Systems said, "What energy companies are about to experience isn't simply a doubling or tripling of the amount of data they will be getting." He added, "Instead, it's going to be an increase of multiple orders of magnitude. The industry knows this and is slowly making the transition. But energy is one of those areas where you can't just rip everything out and start all over."
Another reason the utility industry's IT systems cannot handle big data is that central monitoring operations run on a patchwork of aging computer systems, and those systems often do not even communicate with one another.
Glenn Booth, head of marketing at Green Energy, which develops grid-management software, said, "I've been in control rooms where operators need to sit in chairs and swivel between six different monitors to keep everything running." He added, "We're talking about our national power grid here, and that's just not the way we should be running things."
For the end user, however, a "smart meter" is simple to use: the device automatically generates real-time, second-by-second data on a household's power usage. The problem is that utility companies do not yet have the software and computing systems to process that stream. Even utilities that have already installed smart meters load the data only once a day, to avoid being overwhelmed with more data than their storage can manage.
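The gap between one reading a day and one reading a second can be made concrete with back-of-the-envelope arithmetic. The household count and per-reading size below are assumptions for illustration, not figures from the article:

```python
# Back-of-the-envelope sketch (assumed figures): why second-by-second
# meter readings dwarf the once-a-day loads utilities perform today.

HOUSEHOLDS = 5_000_000   # assumed size of a utility's service territory
READING_BYTES = 64       # assumed size of one stored meter reading

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 readings per meter per day

# Daily data volume in gigabytes under each scheme.
once_daily_gb = HOUSEHOLDS * 1 * READING_BYTES / 1e9
per_second_gb = HOUSEHOLDS * SECONDS_PER_DAY * READING_BYTES / 1e9

print(f"once a day:   {once_daily_gb:.2f} GB/day")
print(f"every second: {per_second_gb:,.0f} GB/day")
print(f"increase:     {SECONDS_PER_DAY:,}x")
```

Whatever the exact figures, moving from daily to per-second reads multiplies data volume by 86,400, consistent with Taft's "multiple orders of magnitude" rather than a mere doubling or tripling.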
In California, the primary utility provider, Pacific Gas and Electric Company, has installed smart meters for 80% of its customers and expects the remaining homes to have them by next year. Greg Snapper, spokesperson for PG&E, said that making the system fully operational will require "complex event-processing engines," which are still in development.