Part of the value SVB brings to its clients is access to experts, potential partners, customers and investors, increasing its clients’ probability of long-term success. On April 8-9, 2010, SVB hosted its first Cleantech Leadership Summit – "Crossing the Cleantech Divide" – at Stanford University, drawing more than 100 cleantech insiders. The Summit brought together a select group of leaders from the venture capital, entrepreneurial, public policy, energy, academic and NGO communities to focus attention, insight, and energy on a single question: how best to promote, over the coming decade, the development of high-growth, innovative technology companies in the energy generation, energy storage and energy efficiency sectors.
The program featured conversations with successful cleantech entrepreneurs, customers, and energy industry executives, and facilitated small group break-out sessions in which participants interacted directly with each other to explore and develop new ideas. One of the resulting conversations is captured here:
The increasing integration of intermittent renewable energy sources such as wind and solar onto the grid has the potential to negatively impact overall electric reliability. Utilities and regulators at the state and federal level have promoted a number of pilots to explore new grid storage technologies, including flow batteries, compressed air energy storage, and flywheels. However, the benefits to the grid straddle jurisdictional boundaries between generation, transmission, and distribution. How should storage be paid for and by whom? What will be the public reaction in terms of siting, permitting, and local NIMBY resistance? How big a role can storage ultimately play in achieving renewable energy mandates, such as California’s 2020 RPS?
The need for affordable bulk storage of electricity is becoming glaringly evident as the grid comes under increasing stress from intermittent renewable resources, evolving customer demands for power quality and reliability, and the emergence of new end uses, such as plug-in vehicles (large mobile loads with as yet unknown charging habits).
As one participant noted in our session, “It’s like we’re driving [the grid] at 100 miles an hour with no shock absorber.”
The aggressive Renewable Portfolio Standards (RPS) in states like California have catapulted storage into the national energy debate. Highly intermittent wind generation, in particular, is already complicating grid operations in states like Texas and New Mexico (not to mention markets like Denmark and Germany) where wind has surpassed 10% of the electric mix. Storage appears to be the only viable solution. While a gas-fired peaker might provide a similar buffering benefit to the grid, under an RPS it becomes a liability to hitting the target: natural gas is a non-renewable fuel that adds to the denominator of the renewable fraction, even when it allows more effective integration of a highly intermittent wind resource. The addition of storage, however, presents a thorny jurisdictional conundrum for utility regulators, as its costs and benefits may straddle the traditional state/federal and distribution/transmission divides that govern ratemaking (and therefore cost recovery for utilities) in the U.S.
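The RPS accounting point above can be sketched with hypothetical round numbers: firming gas generation grows only the denominator of the renewable fraction, so it pushes a utility away from its target even as it enables more wind.

```python
# Illustrative sketch (hypothetical generation figures, not any utility's
# actual mix): how adding gas peaker output to firm intermittent wind
# lowers the renewable fraction counted toward an RPS target.

def renewable_fraction(renewable_mwh: float, total_mwh: float) -> float:
    """RPS accounting: renewable generation over total generation served."""
    return renewable_mwh / total_mwh

# Baseline: 20,000 GWh of wind in a 100,000 GWh mix -> 20% renewable.
base = renewable_fraction(20_000, 100_000)

# Firming that wind with 5,000 GWh of gas peaker output grows only the
# denominator: 20,000 / 105,000, or about 19%.
with_peaker = renewable_fraction(20_000, 105_000)

print(f"without peaker: {base:.1%}")       # 20.0%
print(f"with gas peaker: {with_peaker:.1%}")  # 19.0%
```

Storage avoids this penalty because the buffered energy it returns to the grid was generated by the renewable resource in the first place.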
While the pain is already being felt, it is still too early in the development and deployment of grid storage technology to have clarity on the different applications, their technical requirements, and the value proposition to either the utility or the end customer. The consensus of the group was that it is therefore premature for regulators to focus on how to structure incentives and tariffs for the pricing, interconnection, and siting of new bulk storage facilities. “We need to walk before we can run,” noted Dan Rastler, EPRI’s Program Manager for Energy Storage.
The good news is that the federal stimulus will be funding some 100 different grid storage projects, including a wide array of technologies, over the next several years. We will no doubt learn a great deal as these experiments go forward about what systems work best in different applications and locations, and how best to design rules of the road that correctly balance the desired goals of reliability, increasing renewable energy penetration, and affordability.
There are unique technical challenges associated with storage that complicate our understanding of deployment strategy. While there is a plethora of technologies (some “28 different battery chemistries currently commercially available” by one participant’s count, not to mention pumped storage, thermal storage, flywheels, and compressed air), each one has unique constraints with regard to, for example, charging cycle (both time and symmetry of charge/discharge), deep discharge capability, cost, weight, toxicity of materials, maintenance, and lifetime. In short, there is not likely to be a “one size fits all” approach: a given location may even require a combination of technologies to meet its requirements, including cycling patterns that vary from one usage mode to another at the same site.
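The “no one size fits all” point can be illustrated with a toy screening sketch. The attribute values below are rough, illustrative generalizations (not vendor specifications), and the screening criteria are deliberately simplified to two dimensions:

```python
# Toy technology screen (illustrative attributes only): filter candidate
# storage technologies by an application's minimum discharge duration and
# whether it needs sub-second/seconds-scale response.

TECH = {
    # name: (typical discharge duration in hours, fast-responding?)
    "pumped_hydro":   (8.0,  False),
    "compressed_air": (6.0,  False),
    "flow_battery":   (4.0,  True),
    "NaS_battery":    (6.0,  True),
    "flywheel":       (0.25, True),
}

def candidates(min_duration_h: float, need_fast_response: bool) -> list:
    """Return technologies meeting both constraints. Different applications
    yield different shortlists, and a single site serving multiple usage
    modes may need a combination of technologies."""
    return [name for name, (duration, fast) in TECH.items()
            if duration >= min_duration_h and (fast or not need_fast_response)]

print(candidates(4.0, False))  # bulk day/night shifting
print(candidates(0.1, True))   # frequency regulation
```

Real screening would add the other constraints named above (cost, weight, deep-discharge capability, materials toxicity, maintenance, lifetime), which is precisely why the shortlists diverge so sharply by application.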
One participant cited the example of utility AEP’s widely reported program to test NaS batteries. The original project specification called for daily cycling to support a wind farm, under the assumption that the primary benefit would be day/night load “firming.” But in initial field trials, operators were cycling the batteries as many as 20 times a day, using the system for frequency and voltage support on the local grid. As a result, the batteries started degrading well ahead of their 10-year design life and had to be swapped out after a little more than two years of service, with obvious negative consequences for the project economics.
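A back-of-envelope sketch shows how such a cycling mismatch compresses calendar life. The simplifying assumption that battery life is a roughly fixed budget of equivalent full cycles, and the equivalent-cycle figure used below, are ours for illustration, not AEP data:

```python
# Rough estimate (illustrative assumptions, not AEP data): if a battery's
# usable life is approximately a fixed budget of equivalent full cycles,
# calendar life shrinks in proportion to how fast those cycles are consumed.

def calendar_life_years(design_life_years: float,
                        design_cycles_per_day: float,
                        actual_cycles_per_day: float) -> float:
    """Scale design life by the ratio of planned to actual cycling rate."""
    cycle_budget = design_life_years * 365 * design_cycles_per_day
    return cycle_budget / (actual_cycles_per_day * 365)

# Spec: 10-year life at one full cycle per day for wind firming.
# Field use: up to 20 shallower cycles a day for frequency/voltage support.
# If those averaged out to ~4-5 equivalent full cycles per day (our
# assumption), calendar life falls to roughly two years, in line with the
# early swap-out described above.
print(calendar_life_years(10, 1, 4.5))  # ≈ 2.2 years
```

Real degradation also depends on depth of discharge, temperature, and chemistry, so this linear model is only a first approximation.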
One consensus from the expert group was that the best place to start improving our understanding of storage economics is through investment in more and better grid measurement. Currently, electric networks are only lightly instrumented downstream of the distribution substation. Little is known about actual conditions in real time, and therefore about the locational value of ancillary services (such as reactive power compensation) on the medium- and low-voltage network. Traditional analog revenue meters on customer premises (and even most mass-market electronic meters) are of little use in evaluating momentary fluctuations such as sags, spikes, and harmonics. While the cost of advanced monitoring is high (a figure of $5,000 per measurement point was cited), given the cost and complexity of the storage challenge, this will almost certainly turn out to be a small price to pay as we climb the learning curve.
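To put the cited per-point figure in rough context, a simple sizing sketch follows; the feeder and measurement-point counts are hypothetical, chosen only to show the order of magnitude involved:

```python
# Rough sizing sketch (hypothetical network counts): total cost of advanced
# monitoring at the cited ~$5,000 per measurement point.

COST_PER_POINT = 5_000  # dollars per advanced monitoring point (figure cited above)

def monitoring_cost(feeders: int, points_per_feeder: int) -> int:
    """Total instrumentation cost for a distribution network."""
    return feeders * points_per_feeder * COST_PER_POINT

# e.g., a mid-size utility with 1,000 distribution feeders and 3 monitoring
# points per feeder (both counts assumed):
print(monitoring_cost(1_000, 3))  # 15000000 -> $15M
```

Even at tens of millions of dollars, such a program would be small next to the capital at risk in mis-specified or mis-sited bulk storage deployments.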