Ashden Awards 2016: Open Energi Case Study

To keep the lights on, National Grid has to keep electricity demand and supply exactly in balance, and when faults occur a response is needed within two seconds. Traditionally this was provided by gas and coal power plants running below full power so that they could adjust output quickly, but this approach is inefficient, expensive and increases CO2 emissions. Open Energi has developed an alternative: cutting-edge software that can automatically switch energy-hungry equipment on or off when required, without disrupting business operations.
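
The switching decision in a service like this is typically driven by grid frequency, which falls when demand outstrips supply and rises when there is a surplus. The sketch below is a minimal, illustrative Python loop showing that kind of frequency-triggered rule; the thresholds, the read_grid_frequency() call and the Asset class are assumptions for the sake of example, not Open Energi's actual control logic.

```python
# Minimal sketch of frequency-triggered demand response (illustrative only).
# read_grid_frequency() and the Asset class are hypothetical; Open Energi's
# real control logic is considerably more sophisticated.

import time

LOW_TRIGGER = 49.7    # Hz: below this, the grid needs demand to fall
HIGH_TRIGGER = 50.3   # Hz: above this, the grid can absorb more demand

class Asset:
    """A flexible load (e.g. a water pump) that can defer its consumption."""
    def __init__(self, name, can_flex):
        self.name = name
        self.can_flex = can_flex   # callback: is it safe to flex right now?
        self.running = True

    def switch(self, on):
        if self.can_flex():        # never disrupt the business process
            self.running = on

def control_loop(assets, read_grid_frequency, poll_seconds=1.0):
    """Poll the grid frequency and respond within a couple of seconds."""
    while True:
        hz = read_grid_frequency()
        if hz < LOW_TRIGGER:
            for asset in assets:
                asset.switch(on=False)   # shed load to support the grid
        elif hz > HIGH_TRIGGER:
            for asset in assets:
                asset.switch(on=True)    # absorb surplus generation
        time.sleep(poll_seconds)
```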

Large energy users such as water companies identify which items of equipment are not time-sensitive in their operation; this equipment can then increase or decrease its consumption within agreed parameters to provide a rapid response service to National Grid.
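
As a rough illustration of what those agreed parameters might look like, the sketch below models a per-asset flexibility envelope and a simple safety check; the field names, units and thresholds are hypothetical, not Open Energi's actual schema.

```python
# Illustrative model of the "agreed parameters" an energy user might set per
# asset. All names and values here are assumptions for the sake of example.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class FlexibilityEnvelope:
    max_off_duration: timedelta   # longest the asset may be paused per event
    min_recovery_time: timedelta  # rest period required between interruptions
    min_level: float              # process level below which flexing is unsafe
    max_level: float              # process level above which the asset must run

def may_flex(envelope, last_interruption_end, current_level, now=None):
    """Return True if the asset can safely reduce its consumption right now."""
    now = now or datetime.utcnow()
    rested = now - last_interruption_end >= envelope.min_recovery_time
    within_limits = envelope.min_level < current_level < envelope.max_level
    return rested and within_limits

# Example: a water pump that may pause for up to 30 minutes, at most once an
# hour, while the reservoir sits between 20% and 90% of capacity.
pump_envelope = FlexibilityEnvelope(
    max_off_duration=timedelta(minutes=30),   # enforced by the controller
    min_recovery_time=timedelta(hours=1),
    min_level=0.2,
    max_level=0.9,
)
```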

Open Energi: Harnessing the Power of IoT for Cleaner, More Efficient and Affordable Energy

David Hill, Business Development Director at Open Energi, speaks to theCUBE about how Open Energi is harnessing the benefits of connectivity to bring customers more efficient, more affordable and, ultimately, cleaner energy.

“We were an IoT company before we even knew what IoT was,” said David, describing how Open Energi was founded pre-Hadoop. Becoming Hadoop customers was a “huge leap,” and Hortonworks DataFlow is enabling much more cost-effective integration, which has Open Energi extremely excited about the future.

David spoke to theCUBE whilst attending Hadoop Summit 2016, Dublin – http://bit.ly/1qIrgEN

Powering a Virtual Power Station With Big Data

Michael Bironneau, Data Scientist at Open Energi, discusses powering a virtual power station with big data.

At Open Energi, in order to prove that we’ve delivered our Dynamic Demand service to National Grid and kept it running optimally, we need to analyse large amounts of data relatively quickly. We’re also making our service smarter, so that more assets than before will be able to participate in Dynamic Demand. This is where Big Data and the Hortonworks Data Platform come in.
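
To give a flavour of the kind of analysis involved, the sketch below uses PySpark on telemetry assumed to have landed in HDFS as Parquet, flagging the periods when grid frequency left its dead band and summarising portfolio power draw around them; the path, column names and thresholds are illustrative assumptions, not Open Energi's actual pipeline.

```python
# Illustrative PySpark job summarising portfolio behaviour around frequency
# events. The HDFS path and column names are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dynamic-demand-audit").getOrCreate()

# Per-second readings of asset power (kW) and grid frequency (Hz).
readings = spark.read.parquet("hdfs:///telemetry/asset_readings")

events = (
    readings
    # Flag seconds where frequency left its dead band and a response was due.
    .withColumn("event", (F.col("frequency_hz") < 49.7) | (F.col("frequency_hz") > 50.3))
    # Compare average power draw inside and outside events, per half hour.
    .groupBy(F.window("timestamp", "30 minutes"), "event")
    .agg(F.avg("power_kw").alias("avg_power_kw"))
)

events.show()
```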

Big Data is a phrase that has been floating around companies like Google for the last two decades. It has never had a precise definition, but when used casually it usually means that someone, somewhere, is running out of space to store your data and/or computing power to analyse it; it can also mean that your data is so unorganised that it is difficult, if not impossible, to analyse. Data is the most important asset when it comes to Dynamic Demand: it tells us when to flex certain assets, it proves we are providing a service, and it allows us to better understand our portfolio.

Michael was speaking at the 2016 Hadoop Summit, Dublin – http://hadoopsummit.org/dublin/