Utilities: Why data acceleration is needed to support the energy industry

By David Trossell, CEO and CTO of Bridgeworks.


When most people talk about big data, they are usually talking about how it can be used to predict weather patterns or to improve marketing strategies. In reality, increasing volumes of data are being used by all of the world’s industries – including the utilities sector. Quite often this data is needed fast: without speed, the data may become inaccurate and unusable. This data can come from a variety of sources – sensors, seismic data and smart meters feeding analysis in the cloud are just some of them – and the data touchpoints are often innumerable.

“Increases in computing speed and capacity have enabled collection, analysis and storage of increasing volumes of information. Big data, the Internet of Things, wireless mesh networks and cloud computing all contribute to [changes in the industry and advances in IT],” says journalist Penny Hitchin in her Power Engineering International article ‘IT Advances Open New Scope For Power Plant Condition Monitoring’.

She adds, for example: “Brüel & Kjær Vibro monitors over 7000 operational wind turbines from its data centres in Denmark, the US and China. Around 120 measurements are collected from a typical offshore 6 MW wind turbine. Each reading has a time signal which gives a substantial amount of data, which can be measured remotely on demand by the user or the diagnostics centre. In addition to serving the wind farm operators, collating and analysing this data enables the company to refine its own monitoring techniques and also provide feedback to equipment designers.”

Device proliferation

Alyssa Farrell, Manager of Global Industry Marketing at SAS, writes in her article ‘What Utilities Expect From The Internet Of Things’ for Energy Central: “Across all industries, Gartner forecasts that 6.4 billion connected devices will be in use worldwide in 2016 – up 30 percent from 2015 – with the number reaching 20.8 billion by 2020. In 2016 alone, 5.5 million new devices will be connected each day.”

She then comments: “For utilities, the smart grid era unleashed not only millions of these new Internet of Things (IoT) devices, but also more data that utilities need to analyse and understand to make better decisions about their networks. In fact, 63 percent of utility respondents in a recent SAS survey indicated that IoT was critical to their companies’ future success.”

Sensor investment

Laura Winig talks about ‘GE’s Big Bet On Data and Analytics’ in her article for the Massachusetts Institute of Technology’s Sloan Review. She writes: “[General Electric] has bet big on the Industrial Internet — the convergence of industrial machines, data, and the Internet (also referred to as the Internet of Things) — committing $1 billion to put sensors on gas turbines, jet engines, and other machines; connect them to the cloud; and analyse the resulting flow of data to identify ways to improve machine productivity and reliability.”

She cites Matthias Heilmann, Chief Digital Officer of GE Oil & Gas Digital Solutions, who reportedly says: “GE has made significant investment in the Industrial Internet.” With Steve Collier writing in Electric Energy Online that big data is key to an intelligent and sustainable grid, one could argue that utility companies should invest more and more in this new technology landscape too. However, what none of these writers mentions is that utility network data can be held back by latency issues. This can lead to poor and inaccurate data analysis, poor maintenance of utility networks, and an inability to access critical data and IT services. Latency also increases the risk to business continuity, potentially preventing utility firms from restoring their operations in the face of a disaster.

Addressing cyber-security

With SAS claiming that cyber-security is one of the key issues concerning organisations, utilities need to minimise business risk: they must be able to rapidly access data from a remote disaster recovery site when a man-made or natural disaster occurs, and to analyse that data in a timely manner. For this reason, utilities organisations should consider investing in data acceleration solutions. Yet there is also a need to collate and analyse historical data in order to provide a clear picture of past, current and future trends, and this has to be part of a continual monitoring process.

My advice with regard to cyber-security is simple: wherever sensitive data traverses the Internet, make sure it is encrypted.
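As a minimal sketch of that advice, the snippet below encrypts a meter reading before it leaves the site, using the symmetric Fernet scheme from Python’s third-party cryptography package. The field names and the in-memory key handling are illustrative assumptions only; in practice the key would live in a secure key store and the transport would usually be wrapped in TLS as well.

```python
# Minimal sketch: encrypt a sensor reading before sending it over the Internet.
# Uses the Fernet symmetric scheme from the third-party "cryptography" package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # illustrative only; load from a secure key store in production
cipher = Fernet(key)

# Hypothetical reading; field names are not any real utility's schema.
reading = b'{"meter_id": "M-1042", "kwh": 3.7, "ts": "2016-06-01T10:15:00Z"}'

token = cipher.encrypt(reading)          # safe to transmit across the WAN
assert cipher.decrypt(token) == reading  # recovered intact at the remote site
```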

Improving efficiencies

At this juncture, it’s worth noting that internet-connected sensors have enabled industries to monitor processes and conditions remotely to a level that seemed impossible just a few years ago. With the advancement in the range of these sensors and the reduction in their cost, it is now technically and financially viable to monitor more than core industrial functionality – the health and efficiency of production and distribution. Utilities can now extend their reach out to the premises of their customers, both industrial and residential, with devices such as smart meters. Combining this sensor data with artificial intelligence (AI) can lead to greater energy efficiencies, but to achieve the required analysis the data often has to be moved around the cloud.

With increasing sensor usage and big data volumes, the computing power required to compute the potential outcomes of certain scenarios can be vast. Whilst the larger energy companies may have the resources in-house, many of the smaller independent companies use the cloud – the ability to supply computational power and storage on demand is a perfect scenario for cloud usage. The data can be used to analyse consumption and network distribution requirements on an hourly, daily, weekly, monthly or annual basis, and it can all be tied in with weather patterns, which may stimulate or reduce demand. For example, more electricity will often be used during the winter months of the year.
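To make that kind of aggregation concrete, here is a small sketch that buckets smart-meter readings into hourly and daily totals, ready to be joined against weather data. The schema and the figures are invented for illustration, not drawn from any real utility’s data.

```python
# Sketch: roll raw smart-meter readings up into hourly and daily totals.
from collections import defaultdict
from datetime import datetime

# Hypothetical readings: (ISO timestamp, kWh consumed since last reading).
readings = [
    ("2016-01-12T09:15:00", 0.42),
    ("2016-01-12T09:45:00", 0.38),
    ("2016-01-12T10:05:00", 0.51),
]

hourly = defaultdict(float)
daily = defaultdict(float)
for ts, kwh in readings:
    t = datetime.fromisoformat(ts)
    hourly[t.strftime("%Y-%m-%d %H:00")] += kwh   # hour bucket
    daily[t.strftime("%Y-%m-%d")] += kwh          # day bucket

# These buckets can then be joined with weather data (e.g. daily mean
# temperature) to model how cold winter days drive up demand.
print(dict(hourly))
print(dict(daily))
```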

Better maintenance

One of the key areas that can provide massive cost benefits and customer satisfaction is equipment maintenance. The ability to identify unusual or outside-normal conditions from patterns that a human operator could easily miss can prevent equipment outages. Equally, much equipment is taken out of commission for preventative maintenance on a periodic basis; with detailed analysis, it may be possible to extend these periods, or to wait until a subtle condition occurs before maintenance is performed. In all of these cases of putting big data to use, there are four key elements that need to be in place for big data projects to succeed: volume, variety, velocity and veracity. It is velocity that can derail the quality and performance of big data analysis if the required data is not transported efficiently across the network due to latency.
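As a toy illustration of spotting those outside-normal conditions – far simpler than any production condition-monitoring system – the sketch below flags samples that sit several standard deviations away from a trailing window of recent readings. The vibration figures and the threshold are assumptions chosen purely for the example.

```python
# Sketch: flag readings that deviate sharply from recent behaviour.
from statistics import mean, stdev

def anomalies(samples, window=20, threshold=3.0):
    """Yield (index, value) for samples more than `threshold` standard
    deviations away from the mean of the trailing `window` samples."""
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            yield i, samples[i]

# Hypothetical vibration signal with a sudden spike at the end.
vibration = [1.0, 1.1, 0.9, 1.0] * 6 + [4.2]
for i, v in anomalies(vibration):
    print(f"sample {i}: {v} looks abnormal -- schedule an inspection")
```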

Mitigating latency

Combining IoT big data with AI can add another level of sophistication to big data analysis, but modern networks can impede its movement and prevent the data from being transferred within sensible timescales. Resolving issues like this has the potential to save energy companies millions of pounds, particularly as many IoT sensors are installed in remote locations and are therefore subject to latency, which can dramatically affect network and operational performance. There are many techniques for improving performance over these high-latency networks, such as compression, but compression is ineffective when used with data that is already compressed or encrypted. To move data effectively and efficiently, other techniques have to be embraced, and this will require the deployment of solutions such as PORTrockIT.
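A quick experiment shows why compression falls flat in those two cases: re-compressing already-compressed data yields no further gain, and random bytes (a reasonable stand-in for well-encrypted ciphertext) barely shrink at all. The exact byte counts will vary from run to run.

```python
# Sketch: compression helps repetitive telemetry, but not compressed
# or encrypted (random-looking) payloads.
import os
import zlib

text = b"meter_id=M-1042 kwh=3.7 " * 1000    # repetitive telemetry compresses well
once = zlib.compress(text)
twice = zlib.compress(once)                  # second pass: negligible gain, may even grow
ciphertext_like = os.urandom(len(text))      # stands in for encrypted data
random_compressed = zlib.compress(ciphertext_like)

print(f"raw: {len(text)}  compressed: {len(once)}  "
      f"re-compressed: {len(twice)}  'ciphertext' compressed: {len(random_compressed)}")
```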

It is also more effective to consolidate data into larger blocks and transport it in bulk using an Acceleration Engine than it is to use traditional WAN optimisation techniques. With the right solution, data can be accelerated, latency can be mitigated, financial savings can be achieved, and it becomes possible to analyse data in real time in a way that improves data accuracy. The benefits of data acceleration therefore offer invaluable support for the energy and utilities industry.
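The consolidation principle can be sketched in a few lines. Note that this illustrates the batching idea only, not how PORTrockIT itself is implemented, and send_block() is a hypothetical stand-in for whatever transport is actually in use.

```python
# Sketch: batch many small records into large blocks so each WAN round
# trip carries more payload, instead of paying latency per tiny record.
BLOCK_SIZE = 1 << 20   # 1 MiB blocks; tune to the link in question

def send_block(block: bytes) -> None:
    """Hypothetical transport call; prints instead of sending."""
    print(f"shipping {len(block)} bytes in one transfer")

def ship_in_blocks(records, block_size=BLOCK_SIZE):
    buffer = bytearray()
    for record in records:
        buffer += record
        if len(buffer) >= block_size:
            send_block(bytes(buffer))
            buffer.clear()
    if buffer:
        send_block(bytes(buffer))   # flush the final partial block

# Example: 200,000 small sensor records become a handful of bulk transfers.
ship_in_blocks(b"sensor-reading\n" for _ in range(200_000))
```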

