Powering a Virtual Power Station with Big Data

Michael Bironneau, Data Scientist, Open Energi

At Open Energi, we think of our service as an automated, virtual power station. Whenever the electric grid experiences sudden, unforeseen surges in supply or demand, assets under the control of our Dynamic Demand algorithm automatically pick up the slack – just like a power station would, but more cheaply and cleanly.

In order to prove that we’ve delivered this service and keep it running at its optimum, we need to analyse large amounts of data relatively quickly. We’re also making our service smarter so that more assets than ever will be able to participate in Dynamic Demand. This is where Big Data and Hortonworks Data Platform come in.

Big Data is a phrase that has been floating around companies like Google for the last two decades. It has never really had a precise definition, but when used colloquially it usually means that someone somewhere is running out of space for your data and/or computing power to analyse it. Sometimes it also means that your data is such a mess that it will take a sentient super-robot to make sense of it (thankfully, that’s not our case).

In the context of Dynamic Demand, data is our greatest asset: it tells us when we’ll be able to turn a certain asset on or off without disrupting its primary function, which could be critical to an industrial process. It also allows us to prove to National Grid that an asset we claimed participated in Dynamic Demand actually modified its power consumption to help balance the grid. We want to do more with our data: understand our portfolio of assets better, reduce integration difficulties with assets and accept new assets into our portfolio that don’t meet certain technical requirements. This means running more analyses on live streams of data and integrating many additional datasets together – for us, these problems are Big Data because we can’t cost-effectively tackle them at scale with our current systems.

Hortonworks Data Platform (HDP) is a collection of open source software built on and around Apache Hadoop and designed to deal with Big Data, developed partly by large companies like Yahoo and eBay, and partly by the community. Hortonworks’ added value lies in the way the tools are seamlessly configured to work as one, and in the support they provide. They employ core contributors to the various projects as well as technical experts, so help is rarely more than a couple of emails away.

One of the most widely known tools in the HDP toolbox is Apache Hive. This tool excels at integrating different types of data and allowing them to be queried as one, spreading the computational cost of the query across as many machines as we can get our hands on. We’re planning on using this for most of our ad-hoc analysis and some batch jobs. Because it is easy to extend with custom logic, we can program Open Energi analytics straight into it. For example, we can call into our Python code, which contains functions we may want to evaluate over various pieces of data, without worrying about how that data is stored or whether it even comes from a single source. Because we deal with a lot of timeseries data we also need to perform resampling operations on a regular basis – these can be painful in regular query languages such as SQL, but Hive’s extensibility makes them a breeze.
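
As a minimal sketch of what that extensibility looks like (the table and column names here are hypothetical, not our actual schema), Hive can stream rows through an external script using its TRANSFORM clause, so a resampling step becomes a few lines of Python:

# resample.py - a toy Hive TRANSFORM script that averages power readings per minute.
# Hive streams tab-separated rows to the script's stdin and reads tab-separated rows back, e.g.:
#   ADD FILE resample.py;
#   SELECT TRANSFORM(device_id, ts, power_kw) USING 'python resample.py'
#   AS (device_id, minute, mean_power_kw) FROM readings;
import sys
from collections import defaultdict

totals = defaultdict(lambda: [0.0, 0])      # (device, minute) -> [sum, count]
for line in sys.stdin:
    device_id, ts, power = line.rstrip("\n").split("\t")
    minute = ts[:16]                        # e.g. "2016-01-31 14:07" from "2016-01-31 14:07:02"
    bucket = totals[(device_id, minute)]
    bucket[0] += float(power)
    bucket[1] += 1

for (device_id, minute), (total, count) in sorted(totals.items()):
    print("\t".join([device_id, minute, str(total / count)]))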

For our low-latency applications, Hortonworks package a piece of software called Apache Storm. This software is designed to run an entire graph of computations on a live stream of data, adding reliability in case parts of the graph fail. For example, when a device sends us a power reading, we can correlate it with Dynamic Demand state, train machine learning models and update a cache powering a live dashboard – all without leaving Storm.
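
To make that concrete, here is a purely illustrative Python sketch of the per-reading logic a bolt in such a topology might perform; the helper functions and field names are hypothetical placeholders, not our actual components:

# Illustrative only: the kind of per-tuple work a Storm bolt could do for each power reading.
dashboard_cache = {}   # in a real topology this would be an external store such as Redis

def lookup_dd_state(device_id, timestamp):
    # Placeholder: in practice this would look up the device's Dynamic Demand state.
    return "responding"

def update_model(device_id, power_kw, dd_state):
    # Placeholder: in practice this would feed an online machine learning model.
    pass

def process_reading(device_id, timestamp, power_kw):
    """Handle one power reading as it arrives on the stream."""
    dd_state = lookup_dd_state(device_id, timestamp)   # correlate with Dynamic Demand state
    update_model(device_id, power_kw, dd_state)        # incrementally train a model
    dashboard_cache[device_id] = {                     # refresh a cache powering a live dashboard
        "timestamp": timestamp,
        "power_kw": power_kw,
        "dd_state": dd_state,
    }

process_reading("tank-42", "2016-01-31T14:07:02Z", 11.5)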

Hortonworks take security very seriously and so do we. HDP comes with tools such as Apache Knox and Apache Ranger that deal with questions of who should have access to which piece of data. Even though enterprise-grade security is a relatively new concept in the world of Big Data, HDP has fully caught up: the core systems now support transparently encrypting data in transit and at rest, and central management in Ranger lets us define security policies that comply with our business requirements and the high standards that are expected of us.

For future use, we’re excited about Apache Kylin, a brand new piece of software originally created by eBay. While not yet part of HDP, it is built on the same software ecosystem and can easily be integrated. Kylin allows for a different type of data modelling using metrics (or KPIs) and dimensions (e.g. “client”, “date” or “type of load”). Roughly speaking, the engine stores pre-computed aggregates of the metrics over the space spanned by the various dimensions. For example, suppose a metric is “power consumption” while the dimensions of interest are “time” and “asset type”. Kylin could return answers almost immediately to questions such as “what was the mean power consumption last month for all assets of type ‘bitumen tank’?” At Open Energi we have many KPIs we want to keep track of and drill into – this is something Kylin should be excellent at managing for us.
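
Conceptually, the pre-computation Kylin performs looks something like the pandas sketch below (illustrative only: pandas stands in for Kylin’s cube engine, and the data is made up):

# Illustrative sketch of a Kylin-style cube: aggregate a metric over the dimension space up front.
import pandas as pd

readings = pd.DataFrame({
    "month":      ["2015-12", "2015-12", "2015-12", "2016-01"],
    "asset_type": ["bitumen tank", "bitumen tank", "water pump", "bitumen tank"],
    "power_kw":   [12.0, 14.0, 55.0, 13.0],
})

# Pre-compute the mean of the metric for every (month, asset type) combination.
cube = readings.groupby(["month", "asset_type"])["power_kw"].mean()

# "What was the mean power consumption last month for all assets of type 'bitumen tank'?"
# then becomes a lookup into the pre-computed aggregate rather than a scan of the raw data.
print(cube.loc[("2015-12", "bitumen tank")])   # 13.0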

Finally, we’re looking into Hortonworks’ new product called Hortonworks DataFlow (HDF). Originally created by the NSA, then donated to the Apache Software Foundation under the name Apache NiFi, this project does what it says on the tin: it helps create and manage flows of data. It solves many technical problems, such as what to do when one component in the dataflow can’t keep up with the volume of data it’s receiving, or how to prioritise which data to send at a given time. While our bespoke systems already solve many of these issues, HDF can do more, such as querying data that lives on individual devices without them ever having to send the raw data back. We’re always looking for ways to get more out of ephemeral data that lives on assets in the field but never gets sent back to our database, so we’re looking forward to trying this out.

Ever ready: will batteries power up in 2016?

David Hill, Business Development Director, Open Energi

Open Energi tends to extol the virtues of Demand Side Response as a solution to the energy storage challenge.  It provides a no-build, sharing economy approach which is cheap, sustainable, scalable and secure.

By harnessing flexible demand and tapping into the thermal inertia of bitumen tanks or the pumped energy stored in a reservoir for example, we have created a distributed storage network able to provide flexible capacity to the grid in real-time without any impact on our customers.

But flexibility comes in many forms, and as the cost of energy storage systems tumbles, it looks like 2016 might be the year when commercial batteries become a viable part of the UK’s electricity infrastructure, with recent analysis suggesting they could deliver 1.6GW of capacity by 2020, up from just 24MW today.

The price of energy storage systems is expected to fall sharply over the next three decades, with Bloomberg New Energy Finance predicting the average cost of residential energy storage systems will fall from $1,600 per kWh in 2015 to below $1,000 per kWh in 2020, and $260 per kWh in 2040.

As costs have fallen, we have seen increasing interest from industrial and commercial customers keen to explore the benefits of installing batteries on-site, looking at systems capable of meeting 50%-100% of their peak demand depending on their connection agreement (although it is worth noting that an export licence is not a prerequisite).

In addition to providing security in the event of power outages, battery systems can help companies to reduce their demand during peak price periods, enabling them to seamlessly slash the astronomical costs – and forecasting difficulties – associated with Triads, and minimise their DUoS Red Band charges.

When they aren’t supporting peak price avoidance – which may be only 10% of the time – batteries can help to balance the grid, earning revenue for participating in National Grid’s frequency response markets: for example, discharging power to the system when the frequency drops below 50 Hertz and charging when it rises above 50 Hertz.
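
For illustration only, a crude version of that charge/discharge logic is sketched below: a simple proportional rule with hypothetical deadband and ratings, not the specification of any particular National Grid service.

# Illustrative proportional frequency response for a battery (all parameters hypothetical).
NOMINAL_HZ = 50.0
DEADBAND_HZ = 0.015      # ignore tiny deviations around 50 Hz
FULL_RESPONSE_HZ = 0.5   # deviation at which the full rating is delivered
RATED_KW = 250.0         # assumed battery power rating

def battery_setpoint_kw(frequency_hz):
    """Positive = discharge to the grid, negative = charge from the grid."""
    deviation = NOMINAL_HZ - frequency_hz
    if abs(deviation) <= DEADBAND_HZ:
        return 0.0
    fraction = max(-1.0, min(1.0, deviation / FULL_RESPONSE_HZ))
    return fraction * RATED_KW

print(battery_setpoint_kw(49.8))   # low frequency -> discharge (+100 kW)
print(battery_setpoint_kw(50.2))   # high frequency -> charge (-100 kW)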

National Grid’s new Enhanced Frequency Response market has been developed with battery systems in mind – requiring full response within 1 second – but isn’t expected to be up and running for a year or more.

In the meantime battery systems can generate significant revenues today via National Grid’s Dynamic Firm Frequency Response market, tendering alongside loads from companies like Sainsbury’s, United Utilities and Aggregate Industries, to help balance the grid, 24/7, 365 days a year.  And in the longer term the opportunity exists for companies to trade their batteries’ capacity in wholesale electricity markets.

With these savings and revenue opportunities in mind, we’re now at a point where battery systems can be installed behind-the-meter and deliver a return on investment within 3-5 years for industrial and commercial sites. The ROI will be subject to certain factors, such as geographic location, connection size and of course the cost of the battery system itself, but these figures would have been unthinkable only a few years ago.

There are important technical factors to consider, including the battery’s sizing in terms of its kW power rating and kWh energy storage capacity, as well as the underlying battery chemistry. By taking into account the physical location of the battery along with models of the different markets it will operate in, it is possible to narrow down the most appropriate technical parameters. Another consideration is the gradual effect of wear and tear on the battery with continuous usage. By analysing these effects it is possible to reduce some of the uncertainty around battery lifecycles (likely to be in the region of 10 years) and get better predictions of the likely revenue in each year of operation.
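
As a simple illustration of that kind of analysis (every number below is a hypothetical assumption rather than a forecast), one might project usable capacity and annual revenue over a battery’s life:

# Illustrative projection of usable capacity and revenue over a battery's lifecycle.
# All figures are hypothetical assumptions, chosen only to show the shape of the analysis.
INITIAL_KWH = 500.0          # assumed energy storage capacity
ANNUAL_FADE = 0.03           # assumed 3% capacity fade per year of continuous service
REVENUE_PER_KWH_YEAR = 60.0  # assumed annual revenue per usable kWh (GBP)

capacity_kwh = INITIAL_KWH
for year in range(1, 11):    # roughly a 10-year lifecycle
    revenue = capacity_kwh * REVENUE_PER_KWH_YEAR
    print("Year %2d: %.0f kWh usable, ~GBP %.0f revenue" % (year, capacity_kwh, revenue))
    capacity_kwh *= 1.0 - ANNUAL_FADE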

But whilst a payback of five years seems reasonable from an energy infrastructure perspective (where 15-20 years is more typical), financing battery systems is not easy for most companies, which are used to an ROI within 2-3 years on energy projects.

Some larger, capital-rich companies may have the appetite and money to finance these projects themselves, but the majority of the companies we are talking to are keen to take these assets off balance sheet and finance installations via banks and other investors under third-party ownership.

In these circumstances, managing the performance of battery systems – so that they meet their warranty conditions and their lifecycle is maximised – whilst optimising their potential as a flexible resource able to cut energy costs, earn revenue and deliver a vital uninterruptible power supply during outages, will be key to their commercial success and scale of deployment.

The Challenges of Device Management in Energy

1248 CEO Pilgrim Beart chats with David Hill, Business Development Director at Open Energi.

P: “David, tell us a bit about what you’re trying to achieve at Open Energi”

D: Open Energi is deploying ‘Internet of Things’ software within electricity consuming assets and paving the way for a new energy system; a system that is cleaner, cheaper, more efficient and more secure than the system we have today.

Our software is able to measure and monitor machine and appliance behaviour in real-time and subtly adjust electricity consumption in response to signals from the market, preventing fossil fuelled power stations from coming online and maximising our use of renewable energy, without any impact on consumer living standards or business productivity.

Every appliance or machine that we connect to using our Dynamic Demand software is another step towards removing power stations from the electricity system altogether – and the money that would have been paid to those power stations is instead paid to British businesses.

P: “That sounds like a classic IoT application to me – making better use of a resource, with benefit to all the parties concerned – and helping address climate change too. What stage have you reached in this ambitious goal?”

D: To date we have signed over 25 customers and deployed our software within over 1230 individual assets across 275 sites. Appliances range from heating, ventilation and refrigeration appliances in Sainsbury’s, to water pumps and waste management equipment in United Utilities. The total power of all the devices that Open Energi interacts with is about 45MW, of which we currently bid about 15MW – around 30% – into energy markets operated by National Grid.

30% represents what is typically flexible at any given moment in time. The remaining power is being used for its primary purpose, e.g. heating supermarkets or pumping water. It is only by being able to identify flexibility on a second by second basis that we are able to provide this service.

P: “And can you give us some idea of the sort of challenges that you’ve encountered as you’ve grown in scale?”

D: One of the key challenges we face is being able to interface with different industrial and commercial processes – ranging from water pumps to bitumen storage tanks to refrigeration systems. These systems operate in very different ways and generate different kinds of data, so a key challenge for us is being able to collect and view this data in a coherent way, and then understand how all these different processes are consuming energy.

Overcoming this problem is an essential part of how we can provide a consistent and reliable service to National Grid from many thousands of individual assets.  Today many aspects of our interface with each of these processes inevitably become bespoke, which then leads to challenges in how we maintain and operate the software on all of our devices, particularly as we scale up.

P: “Your growing need for device management rings very true for us – it’s a common story as a connected service starts to achieve scale. We encountered something similar at AlertMe as we scaled from 1,000 to 10,000 devices, which is why we decided to focus on that challenge at 1248. Can you put some more meat on the bones for us?”

D: A key area for us is being able to simplify how we connect and exchange information with so many different control systems and processes. Standards could play a key part here – both in terms of how we connect to other devices and what information we exchange. This would ease the process of connecting and maintaining devices as we scale up and allow us to focus on getting the maximum value out of the energy flexibility that we’re tapping into.

For example, in the future we believe that all appliances with the ability to provide flexibility to the electricity market will come with those principles built in off-the-shelf, so that an air conditioning controller designed to maintain room temperature is every bit as used to responding to electricity market signals as it is to temperature; likewise with water pumps and bitumen tanks.

Device management is very important when trying to operate a service based on very large numbers of connected devices; we can only understand and quantify energy flexibility if we have an accurate understanding of which devices are online, and whether they are functioning correctly. Moving from the world we are in now to the future I just outlined will need collaboration with companies like 1248.

P: “David, thanks so much for sharing this with us.”

This blog post was originally featured on www.1248.io

The IoT technology meeting the UK’s grid balancing needs faster than a power station

Chris Kimmett, Commercial Manager, Open Energi

It’s a well-reported fact that electricity margins are tighter than they have been for a number of years and, as we move towards winter, talk will increasingly turn to the need to balance energy supply and demand in order to mitigate the risk of power blackouts in the UK.

Almost all of the UK’s grid balancing has traditionally been done by coal and gas. But the EU’s Large Combustion Plant Directive has limited running hours at a number of plants and in the past twelve months both Longannet and Eggborough power stations, which currently provide around 5% of the UK’s capacity, have announced they will be closing their doors in 2016.

Add to this the solar and wind explosion: experts predict the UK will have 10GW of solar power by the end of 2015, a benchmark most thought wouldn’t be reached until 2020, and National Grid’s Future Energy Scenarios indicate that by 2020 small-scale, distributed generation will represent a third of total capacity in the UK. Taken together, this means tomorrow’s electricity landscape will look very different to today’s.

The transformation of the energy system away from centralised generation to small-scale, distributed power means that speed of response to changes in energy supply and demand will be more important than ever.

Indeed, while most people are focusing on the tight capacity margin between supply and demand, the real blackout threat could come from generators being unable to respond within the required window to balance instantaneous shifts on the grid.

For more than 12 months, energy data analysts at EnAppSys have been monitoring grid frequency and analysing large deviations which, if not managed, can lead to instability. EnAppSys’ director Paul Verrill says that while we need to ensure the system has sufficient supply to meet demand, the real risk of blackouts could come from this second issue that often falls under the radar: a lack of capacity able to deliver additional power within the required timeframe.

Grid agility and flexibility will be essential as we move away from models of centrally dispatched generation, and National Grid, via its Power Responsive campaign, has already asserted that demand side response (DSR) will play an increasingly vital role in building a resilient, sustainable and affordable electricity system for the future.

This is especially pertinent given the results of new research by Open Energi, National Grid and Cardiff University, which suggests that smart DSR technology can meet the UK’s crucial grid balancing requirements faster than a conventional power station.

The latest research paper, titled Power System Frequency Response from the Control of Bitumen Tanks, forms part of the ongoing collaborative research programme between Open Energi, National Grid and Cardiff University, and looks at the feasibility of DSR providing a significant share of frequency balancing services.

To test the scale of the opportunity for industrial heating loads to provide frequency response to the power system, bitumen tanks (which contain the glue that binds our roads together) equipped with Dynamic Demand technology were tested in combination with National Grid’s model of the GB transmission system.

Dynamic Demand deployed at scale is able to contribute to grid frequency control in a manner similar to, and crucially faster than, that provided by traditional peaking power generation – not to mention more cleanly and cheaply.

Field tests showed that full response could be provided in less than two seconds, compared with 5-10 seconds for a thermal generator. Large-scale deployment of Dynamic Demand will reduce the reliance on frequency-sensitive generators and ensure that the grid stays balanced in a cost-effective, sustainable and secure manner.
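
As a purely illustrative sketch (hypothetical thresholds and temperature limits, not Open Energi’s actual algorithm), the kind of decision logic a Dynamic Demand-equipped tank applies looks something like this:

# Illustrative frequency-triggered switching for a heating load (all limits hypothetical).
# A real Dynamic Demand device also staggers responses and respects many more process constraints.
LOW_HZ, HIGH_HZ = 49.95, 50.05   # assumed trigger thresholds either side of 50 Hz
MIN_C, MAX_C = 150.0, 180.0      # assumed temperature band protecting the bitumen process

def heater_should_run(frequency_hz, temperature_c, currently_on):
    if temperature_c <= MIN_C:    # the primary function always wins: too cold, must heat
        return True
    if temperature_c >= MAX_C:    # too hot, must stop heating
        return False
    if frequency_hz < LOW_HZ:     # grid is short of power: defer heating while it is safe to do so
        return False
    if frequency_hz > HIGH_HZ:    # surplus power on the grid: heat now
        return True
    return currently_on           # otherwise leave the tank doing what it was doing

print(heater_should_run(49.90, 165.0, True))   # False: helps the grid, temperature still safe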

The research simulations help to shape National Grid’s understanding of DSR as a replacement for frequency-sensitive generation and will be used when planning its requirements for grid network operation in the future – with huge implications for our energy mix.

When launching Power Responsive, National Grid CEO Steve Holliday said: “The move to a low carbon economy coupled with rapid advances in technology and innovation are transforming our electricity supply. But supply is only half the story. The challenge now is to exploit new opportunities to radically evolve our energy system by changing the way we use electricity.”

And this is why the research findings are so significant.

With more renewables and less thermal generation, ‘inertia’ on the grid will decrease, making frequency more unstable. To counteract this effect we need faster response, so by rolling out Dynamic Demand today we are future-proofing the grid.

With its new Power Responsive campaign, National Grid has recognised the need for a new source of flexibility and has stated it is committed to scaling the smart DSR industry.

Demand side response is intelligent energy usage. By knowing when to increase, decrease or shift their electricity consumption, businesses and consumers will save on total energy costs and can reduce their carbon footprints. It is the smart way to create new and efficient patterns of demand.

Future-proofing London: Regeneration in the age of IoT

July 2015: David Hill, Business Development Director, Open Energi

Planning for the redevelopment of London’s Old Oak Common is now in full swing with the appointment of the Old Oak and Park Royal Development Corporation (OPDC) board. What lessons can the team behind the project learn to ensure the scheme is futureproofed and can meet the needs of Londoners for generations to come?

In February 2015, London’s population reached a new high of 8.6 million people, exceeding the previous record set back in 1939. The city’s population is set to continue to expand, with current estimates predicting it will reach 11 million by 2050.

There is an urgent need for new housing in the UK capital to help manage this growth. The Greater London Authority (GLA) has outlined ambitious investment plans to improve the capital’s infrastructure which could require £1.3tn of spending from now until 2050, most of which needs to go on housing and transport.  As part of one of the largest regeneration schemes in London for decades, plans are now fully afoot to transform brownfield land in Old Oak Common and Park Royal into a sustainable New Town close to the heart of the city.

At present, the Mayor of London’s office suggests that development in north-west London will create up to 24,000 homes and more than 55,000 jobs. According to the GLA, the scheme will be an exemplar in accessible, high quality and ‘smart’ regeneration which, over the next 20 years, will strengthen London’s role as a global city.

Within this wider regeneration project in the currently underutilised region of west London, plans are also being drawn up by the London Sustainable Development Commission (LSDC) to create a world-leading clean tech hub. LSDC, which advises the Mayor on the city’s low carbon economy, hopes the hub will attract forward-thinking start-ups and large green companies from across Europe, especially once major planned train lines open, including Crossrail and HS2.

Accordingly, the GLA’s Draft Old Oak and Park Royal Opportunity Area Planning Framework (OAPF), which was produced with contributions from Transport for London (TfL) and the London Boroughs of Brent, Ealing and Hammersmith & Fulham, sets out an ambitious vision to ensure that the Old Oak and Park Royal area is an exemplar of low carbon development.

The GLA has already committed to achieving the highest standards of energy efficiency and low carbon technology and, to this end, has pledged to produce an Energy Strategy and subsequent Energy Masterplan for the area.

The Mayor has set a target for London to self-generate 25% of all electricity consumption by 2025 to improve system resilience and reduce the cost of transmission. Local energy in London includes solar power and heating networks supplied by plants which are close to where energy is used and which generate heat and power at the same time.

The problem with these approaches is that they require space, which is already at a premium in London. Added to this, not only is gas for combined heat and power (CHP) tied in to volatile global energy prices, but it is also carbon emitting – a particularly problematic scenario for a city which is already struggling with an air pollution crisis. The city is in urgent need of a high-tech energy solution and, as this swathe of London begins its transformation, it is essential that the GLA fully embraces the huge opportunity for system change to ensure the scheme is futureproofed and can meet the needs of Londoners for generations to come.

Cutting-edge software and an Internet of Things approach to energy-consuming assets are enabling advanced forms of demand response technology to be rolled out across a range of equipment – including heaters, pumps, chillers, refrigerators and air conditioning units – turning them into smart, automated and autonomous devices that can react instantly to changes in electricity supply and demand across the network. This frees up capacity while delivering new revenues for consumers in return for improved grid resilience.

The UK has historically tried to deal with capacity issues by increasing supply rather than addressing the root of the problem. To illustrate the potential scale of success, we should look to the US, where demand response technology has already shaved ten per cent off the country’s peak energy demand.

In the UK, National Grid urgently needs more flexibility from the demand side to support intermittent renewable use and meet rising energy demand, and has already announced targets to increase demand side balancing capacity from 700MW to 3GW by 2020. In London alone, there is around 250MW (equivalent to five per cent of peak demand) of flexibility in our energy system that could easily be utilised through demand response. This would effectively remove one whole peaking power station from the grid. Of the £1.3 trillion OPDC infrastructure plan, £150 billion of spending is slated for energy. If we apply the five per cent flexibility logic above, this equates to instant savings of £7.5 billion.

Demand flexibility resides in a range of areas across the city. For example, eighteen per cent of London‘s energy consumption comes from commercial buildings, of which at least twenty per cent is flexible. Two per cent of power consumption comes from the water sector, of which eighty per cent is flexible. In aggregate, this flexibility can provide London with a ‘Virtual Power Plant’, meeting the needs of the growing population without the need for any new infrastructure.
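
Treating those shares as rough illustrations, they are broadly consistent with the five per cent figure quoted above:

# Rough arithmetic behind the ~5% flexibility estimate, using the shares quoted above.
commercial = 0.18 * 0.20   # 18% of consumption, of which about 20% is flexible -> 3.6%
water      = 0.02 * 0.80   # 2% of consumption, of which about 80% is flexible  -> 1.6%
print("Flexible share of London's consumption: about %.1f%%" % ((commercial + water) * 100))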

The business case for demand response already exists without any need for intervention or support – and is already being applied effectively by organisations from National Grid to energy intensive corporates, such as Sainsbury’s. From a sustainability perspective, too, demand response makes sense in enabling businesses to move beyond their own footprint and supply chains to help deliver system-wide change.

As development progresses, the Old Oak Common and Park Royal project is a prime candidate for smart grids and demand side response at both building (new and retrofitting existing) and aggregate levels to optimise capacity investment, reduce energy demand, balance local energy supply and demand, including peak energy across the site, and reduce the need for network reinforcement.

HyperCat City’s work in promoting IoT standards, and then involving these in planning and design phases, already provides OPDC with some of the crucial tools needed to deliver real cost reduction benefits.

As London expands there is a huge opportunity to capitalise on power demand flexibility to drive major cost and carbon efficiency benefits for the city. To achieve that we must first create a comprehensive map of where flexibility currently resides in the system, which will show the level of generation actually required to power new-build projects such as Old Oak Common and Park Royal. Those new-build projects present the opportunity to map demand flexibility at a highly granular level, i.e. by building, which will create a true picture of where capacity lies, as well as building in resilience from the ground up.