Mayday Under the Big Top: Inside the Grid’s Day of Crisis

The calm before the storm. That's perhaps the best way to describe the atmosphere at the National Energy System Operator's control room in Wokingham, Berkshire, on the morning of May 29. As staff began their 7am shift, a seemingly ordinary day quickly transformed into a desperate scramble to prevent widespread blackouts across the UK - an unrehearsed tightrope walk where a single misstep threatened to bring down the entire performance.
This wasn't a show for public consumption, though, but a high-stakes, behind-the-scenes performance to restore grid stability. What unfolded was a masterclass in grid balancing under duress, laying bare the critical fragility of our evolving energy infrastructure.
That day alone, the control room issued a staggering 24,742 balancing actions - instructions to power stations to increase or decrease output - thousands more than on a typical day. This frenetic activity underscores the immense pressure faced by operators tasked with maintaining the delicate equilibrium between electricity supply and demand.
This is the story of the day the ringmaster fought to keep the entire energy circus from imploding.
What went wrong?
The primary cause of this operational chaos was a combination of high wind generation and critically flawed forecasting by the system operator.
Initial day-ahead predictions had optimistically projected wind output to consistently exceed 14 GW, even reaching over 21 GW during the morning. This outlook pushed down British day-ahead power prices, incentivizing significant energy exports to more expensive European markets via undersea interconnectors.
However, reality diverged sharply from these projections. As morning demand surged, it became clear that the wind forecasts were materially inaccurate. More concerning still, the same-day forecasts proved even less reliable. While wind generation was indeed high, it fell considerably short of anticipated levels.
This significant gap between expected and actual wind output created an enormous management challenge for the control room. Such discrepancies typically necessitate extensive "re-dispatching," triggering a flurry of new instructions for power stations to adjust their operational plans.
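To make the scale of that challenge concrete, the sketch below uses purely illustrative figures (not NESO's actual numbers for May 29) to show how a wind forecast miss turns into megawatts that must be found elsewhere at short notice.

```python
# Illustrative only: how a wind forecast miss becomes a re-dispatch requirement.
# Figures are hypothetical, not NESO's actual data for May 29.
forecast_wind_gw = 21.0      # day-ahead expectation for the morning peak
actual_wind_gw = 16.5        # hypothetical out-turn, well short of forecast
committed_exports_gw = 3.0   # hypothetical volume already sold abroad day-ahead

shortfall_gw = forecast_wind_gw - actual_wind_gw
replacement_mw = shortfall_gw * 1000

print(f"Wind shortfall: {shortfall_gw:.1f} GW")
print(f"Generation to re-dispatch: {replacement_mw:,.0f} MW, "
      f"while still honouring {committed_exports_gw:.0f} GW of exports")
```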
The ripple effect of these inaccuracies was evident in the erratic behaviour of Britain's undersea electricity cables connecting it with continental Europe. The interconnector flows became almost contradictory at times. Britain found itself simultaneously importing power from France through IFA2 while exporting through its sister link, IFA1. Similar paradoxical patterns emerged with the North Sea Link (importing from Norway) and the Viking Link (exporting to Denmark).
Up until 2pm on May 29th, NESO executed multiple interconnector trades, all of which were exports and, remarkably, all at negative prices. In other words, the UK was effectively paying other nations to take its surplus power, a clear indication of the system's stress points.
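To put a negative-priced export into cash terms, consider a hypothetical trade (the volume and price below are illustrative, not the actual deals struck that morning):

```python
# Hypothetical example of what a negative-priced export costs.
volume_mwh = 500            # illustrative export volume for one trade window
price_gbp_per_mwh = -10.0   # a negative price means the seller pays the buyer

revenue_gbp = volume_mwh * price_gbp_per_mwh
print(f"Trade revenue: £{revenue_gbp:,.0f}")   # -£5,000: the UK side pays to export
```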
Control room staff were left to juggle these cascading consequences. Gas power stations such as Marchwood and Grain received a stream of conflicting instructions to ramp up and down, mirroring the fluctuations in wind output and interconnector flows.
Regionally, gas stations in the Midlands and North of England, like Staythorpe, West Burton, and Keadby, were largely commanded to power down, consistent with robust local generation from wind and solar. On the flip side, both Deeside and Rocksavage gas stations in Wales and the North West experienced more consistent "bid up" commands.
The overarching narrative of May 29th was one of intricate, continuous system balancing. Inaccurate wind forecasts, coupled with pre-existing export commitments, created periodic supply shortfalls. Add in the uneven geographical spread of wind farms and the constraints of the transmission network, and the result was grid instability. Despite falling short of forecasts, the high wind output still generated localised surpluses that proved challenging to manage.
By 2pm, the control room had already executed nearly 6,500 balancing actions - roughly 15 per minute since the 7am shift began. It was a relentless pace, akin to a juggler with an endless stream of clubs, constantly adjusting to keep them all aloft. Nor did the pace slacken as the day went on: across the full 24 hours, actions averaged more than 17 per minute.
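Those rates are simple division, and a quick back-of-envelope check (assuming the morning count runs from the 7am shift start) bears them out:

```python
# Back-of-envelope check on the balancing-action rates quoted above.
actions_by_2pm = 6_500
minutes_7am_to_2pm = 7 * 60              # assumes the count starts at the 7am shift
print(actions_by_2pm / minutes_7am_to_2pm)    # ~15.5 actions per minute

actions_full_day = 24_742
minutes_in_day = 24 * 60
print(actions_full_day / minutes_in_day)      # ~17.2 actions per minute over the day
```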
How should a modern power grid behave?
To truly grasp the implications of such a day, it's crucial to understand the foundational principles of our power grid.
Our system primarily operates on alternating current (AC), generated by large turbines in conventional power stations, providing a steady 50 hertz frequency (i.e. 50 cycles per second).
The energy transition, however, has ushered in a rapid deployment of intermittent renewable generation. Solar panels produce direct current (DC), and modern wind turbines likewise connect through electronic converters rather than directly as heavy spinning machines; their output must be converted and synchronised before it can flow onto the AC grid. This introduces a fundamental challenge: an ever-growing share of converter-connected generation attempting to function seamlessly on an AC-centric infrastructure.
A critical characteristic that wind and solar largely lack is "inertia." Traditional generators are heavy, rotating machines that resist changes to their rotational speed. That stored rotational energy acts as a natural shock absorber, dampening sudden fluctuations in grid frequency.
As we progressively replace conventional generation with renewables, we inevitably reduce the overall inertia on the grid, diminishing its inherent resilience to system faults. This reduction in inertia was a contributing factor to the April system-wide blackout in Spain, triggered by an as-yet-unidentified grid fault.
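For readers who want the underlying physics, the textbook rule of thumb links the size of a sudden supply loss, the amount of inertia on the system, and how quickly the frequency starts to fall. The sketch below applies that standard rate-of-change-of-frequency relationship with illustrative numbers, not actual GB or Spanish figures.

```python
# Textbook rate-of-change-of-frequency (RoCoF) estimate; numbers are illustrative only.
f0_hz = 50.0           # nominal grid frequency
s_base_mw = 35_000     # hypothetical total system demand used as the base
delta_p_mw = 1_000     # hypothetical sudden loss of generation

def initial_rocof(h_seconds: float) -> float:
    """Initial rate of frequency fall (Hz/s) for a system inertia constant H."""
    return delta_p_mw * f0_hz / (2 * h_seconds * s_base_mw)

for h in (6.0, 3.0):   # a high-inertia grid versus a low-inertia one
    print(f"H = {h:.0f} s -> frequency initially falls at {initial_rocof(h):.3f} Hz/s")
# Halving the inertia doubles how fast frequency falls after the same fault.
```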
Grid operators are perpetually tasked with "balancing" the grid - ensuring that supply precisely matches demand at every point, at all times, to maintain the standard 50 hertz frequency. This is achieved through the constant re-dispatching of orders to power stations. It was precisely this "balancing mechanism" that was pushed to its absolute limits on May 29th.
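In vastly simplified terms, the decision the balancing mechanism makes over and over again looks something like the toy sketch below - a deliberate simplification, not NESO's actual dispatch logic.

```python
# Toy model of the balancing decision: keep supply matched to demand at 50 Hz.
# A deliberate simplification, not NESO's actual dispatch software.

def balance(supply_mw: float, demand_mw: float) -> str:
    imbalance_mw = supply_mw - demand_mw
    if imbalance_mw > 0:
        # Surplus: frequency drifts above 50 Hz; accept "bids" to turn plant down.
        return f"Surplus of {imbalance_mw:.0f} MW - instruct generators to reduce output"
    if imbalance_mw < 0:
        # Shortfall: frequency sags below 50 Hz; accept "offers" to turn plant up.
        return f"Shortfall of {-imbalance_mw:.0f} MW - instruct generators to increase output"
    return "Balanced - no action needed"

print(balance(supply_mw=32_500, demand_mw=33_100))   # a shortfall
print(balance(supply_mw=34_200, demand_mw=33_100))   # a surplus
```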
What are the takeaways?
The show cannot go on like this; the events of May 29th starkly highlighted the complexities of integrating intermittent, converter-connected renewable energy into our AC grid. The current regulatory focus on rapid wind farm deployment has outpaced the grid's ability to carry the power, with the result that wind farms are frequently paid to halt generation because existing infrastructure cannot transmit their output.
A recent report commissioned by Drax from academics at Imperial College London found that grid congestion led to the curtailment of 8.3 TWh of wind power, representing about 10% of total wind generation. This lost energy, enough to power more than two million homes, cost consumers nearly £400 million because the existing grid infrastructure couldn't transport or store it.
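Those figures stand up to a rough sanity check, assuming a typical household uses about 3,500 kWh a year (an assumption for illustration, not a number taken from the report):

```python
# Rough sanity check on the curtailment figures quoted above.
curtailed_twh = 8.3
curtailment_cost_gbp = 400e6
household_kwh_per_year = 3_500   # assumed typical annual consumption

homes_equivalent = curtailed_twh * 1e9 / household_kwh_per_year   # TWh -> kWh
cost_per_mwh = curtailment_cost_gbp / (curtailed_twh * 1e6)       # TWh -> MWh

print(f"Homes' worth of electricity: {homes_equivalent / 1e6:.1f} million")   # ~2.4 million
print(f"Implied cost of lost wind: £{cost_per_mwh:.0f} per MWh")              # ~£48/MWh
```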
Grid management is further compounded by unreliable weather forecasting and insufficient investment in both physical infrastructure and advanced tools. The grid-balancing software dates from the 1980s and is prone to failure; had it crashed on May 29th, operators would have been forced to make adjustments manually - a near-impossible task at more than 17 actions per minute - threatening grid collapse and widespread blackouts. In such a scenario, operators would likely have reverted to traditional methods, curtailing wind and ramping up gas to restore stability.
The dedicated control room staff perform an essential service, navigating these challenges with inadequate resources - like acrobats balancing on a fraying rope. May 29th unequivocally underscores the urgent need for strategic investment in modern grid infrastructure, enhanced forecasting, and robust management software to secure Britain's evolving energy landscape.
About Big Energy Group
Big Energy Group is a privately held, British-owned energy brokerage with an established track record of helping clients successfully navigate the energy market. The company has offices in Harrogate and the Tees Valley and serves more than 500 businesses across the UK. For more information, please visit bigenergygroup.co.uk.