July 16, 2018

Nine Questions with Devon Williams, Risk Manager at Grant County Public Utility District.

Recently, I had the opportunity to sit down with Mr. Williams. In the following interview, he provides sage advice on risk analytics and how to grow awareness of risk-based thinking within an organizational culture.

Q1:  Mr. Williams, as the Risk Manager for Grant County Public Utility District, what are your primary responsibilities?

Chiefly I am tasked with developing our enterprise risk platform and integrating it with company operations.  That includes developing staff both in my group and outside, and sponsoring processes to improve risk-based thinking.  We are focused on doing so right now with our audit function, our capital budgeting and operational budgeting functions. Also, we review deals proposed to hedge our power portfolio and customer rates.

Q2:  Establishing a culture that is aware of risk methodologies and risk-based decision making is difficult.  How are you doing this?

From a high level we are sharing and sponsoring the concepts of probabilistic thinking in the planning and evaluation process both on the front end and back end of our efforts.  Sometimes this happens formally through our job duties.  A lot of it happens informally.  Luckily, we have the confidence of our executives.  We get asked to advise committees and groups or projects as well.

Q3:  Can you provide an example of how this way of thinking impacts a risk management process, at Grant County PUD?

For example, we are doing test evaluations of capital projects.  Instead of using an IRR-type criterion, we are proposing a risk-adjusted return on capital.  That measure enters the discussion long before the project would ever be done.  It serves to flesh out the project risks and encourages the proposers, who know the project best, to attempt to quantify those risks and concerns, not only of the project itself but of its outcomes.  Not all avoided costs are certain, but let’s go ahead and put what we think the avoided costs are into the return.  This is actually very beneficial if we are willing to step into the risk space rather than keep things at a purely engineering-economics level.

Q4:  You seem to be advocating for a more dynamic approach to risk management, rather than a static use of standard risk metrics.

Yes.  There is the concept that, just as the outcomes can always change, so too can the inputs.  It’s a little bit like an agile software release, where we need to continually refresh the information we are getting and the information we are bringing to our meetings in different ways as we learn more and make more decisions.  Keep the discussion at a level that keeps people engaged, because that is how you continuously get better and better information going into your work product.

Q5:  You mentioned an agile approach.  When you think about analytic models or an analytic process, what do you look for?

I like to have very clear access to the inputs, controls, and settings in the model.  I think that what sometimes gets lost with analytic models, powerful wrenches that they are, is the communication of the sensitivities and central assumptions that govern the work product.  The better we do that, and the better we teach other people to use what we created, the more usable and accessible our work becomes for everyone across the business, and people are a lot more willing to apply it.  So, instead of a dashboard, have a control panel on the model.  For example, we are coming up with a different rate for a different class of customers, and my group was asked to develop some risk-based components for that.  We came up with a model that is more of an analytic framework.  It explains the inputs and key assumptions and suggests how these need to be refined.  We highlighted and placed comments explaining what these inputs and assumptions are and how they could be applied.  This really builds engagement and greatly enhances the likelihood that the work product will be used.  Then you have a chance to impact the company culture.

Q6:  These are important lessons for communication and utilization.  When talking to senior management about risk, how do you communicate risk in a way that is actionable? 

I think it’s a two-step process.  The first is to share some conclusions in plain English, such as “our analysis is indicating X & Y, and it is based on these sensitivities and root causes.”  Then we dig into the foundation of why we think that.  Some more technical calculations may come up, but having the context of what conclusions we are drawing makes it easier to understand the calculations and appreciate them.  I like to focus on the simple first and get into the underlying complexities after we have engagement at the higher level, which is the reason we are there.

Q7:  Do you make use of risk metrics like VaR or CFaR/GMaR, and how do they influence concrete actions? 

There is a project where we have done a portfolio VaR and a VaR stress that turns into a cash-flow test.  At no point in the discussion, writeup, or conception did I use the words VaR or cash-flow-at-risk.  Those are words that mean plenty to me, but they don’t necessarily mean much to many of our stakeholders.  I see it as part of my job and my team’s job to demonstrate interpretations of the results and share the specifics of how those results were achieved in a later discussion.  I think that is probably the key.  If results are presented as concrete concepts upfront, you have a chance of getting to concrete actions.  If you weigh people down with technical concepts up front, they are likely to disengage.  It’s on me to make my work product work for everyone else.  That makes me no different than a chef or a contractor.  We all have to please our customers.

Q8:  Earlier you mentioned improving access to capital.  What are primary lessons that you have learned to make sure access to capital is maintained?

Investors can and do read financial statements, but the greatest challenge to improving the company’s access to capital is quantifying the potential underlying outcomes and what likely and concerning events could cost, and then having those figures, and the value of their limits, understood.  Explaining to people why we might impair a deal’s value or a forward projection with probabilistic outcomes can be difficult, but it helps the overall product considerably.

Nothing is really Tab-A into Slot-B in our business.  There are a lot of things that can go right or wrong or different.  But having an appreciation for the likely path and some potential surprises serves to make a more understandable operating envelope.  By including the exposure to both the upside and downside our projections are more believable and this improves access to capital. 

Q9:  Finally, what is a relatively new risk factor that has emerged for you in the last few years?

While not entirely new, one would be the claims and litigation culture.  This continues to build, and its degree and weight are greater than what we would have thought 10 years ago.  Consistently low gas volatility is another.  That is a new world for electric utilities and their cost models.  I would also add that, with the internet of things, the cost and security of technology integration is a major opportunity and risk for utilities, one that is surprising people and that we continue to monitor.

This interview was conducted by David Leevan. David is the Managing Director of, a SaaS platform for energy analytics. 

Need a solution for your energy analytics? We can help! Request your free demo today.


May 16, 2018

Small-Scale LNG in the Caribbean

Natural Gas demand in the Caribbean is growing.  Yet natural gas supply has not caught up to this demand.  There are structural inefficiencies in the market that have prevented demand from being fully met.  These inefficiencies represent a market opportunity.  This article seeks to identify the underlying reason for the lack of natural gas supply as well as some recent developments to address this problem.    

Turning natural gas (NG) into liquefied natural gas (LNG) is the primary method for international transportation of this commodity.  The advantage of condensing natural gas is that the liquefied state occupies 1/600 of the volume of the gaseous state.  Within the United States and Canada, where a broad network of pipelines transports fuels across the country, natural gas can easily be moved from where it is produced to where it is consumed without worrying about volume.  For international demand, however, installing pipelines through the ocean is economically unfeasible.  Instead, NG is transformed into LNG, loaded onto large ships, and transported to serve international demand.  This solution satisfies large-scale distribution requirements.  However, smaller-scale demand in regions like the Caribbean remains unmet.

Global vs Caribbean Market 

LNG exports from the United States to regions like Asia and Europe have seen incredible growth in recent years. Based on figures published by the EIA, exports to Asia grew seven-fold from 2016 to 2017 and exports to Europe grew five-fold over the same period.


In Latin America the story is much different from these flourishing LNG global markets. Only Mexico has seen comparable growth in LNG imports, while the remainder of the region has seen little to no increase. The primary issue is one of scale:

  1. The majority of new LNG shipping vessels are of 120,000–140,000 m3 (4,200,000–4,900,000 cu ft) capacity.  Smaller vessels are not proportionally cheaper to build or fuel, so small shipments are actually more expensive per MMBtu.
  2. Demand is made up of many, small, independent market participants. These dynamics make for reduced buying power and efficiency losses. Efficiency is crucial for a transportation process that is so complex and sensitive to costs.
  3. Some of the participants do not have the credit ratings required to land favorable interest rates. Large contract volumes and contract lengths with take-or-pay conditions create large risk burdens.
  4. Vessel and terminal availability is limited.  The current infrastructure is operating at capacity, commissioning a new ship has to be done up to two years in advance, and building new terminals is expensive.

Small Scale Market Developments

Nonetheless, yesterday’s problems may not be tomorrow’s problems. As in any open market, whenever there is an unfulfilled demand, forward-thinking and innovative organizations will figure out how to create more supply. In the case of the Caribbean LNG market, many major energy players are currently looking to invest in the infrastructure needed to increase supply. Wärtsilä, traditionally a Finnish engine and power plant manufacturer, has recently entered the LNG market in the Caribbean looking to facilitate the many stages of distribution. AES is finishing its second LNG storage facility in the Dominican Republic. New terminals are being constructed in Florida by Eagle LNG Partners, while SPEC LNG has built a terminal in Colombia. These terminals can service smaller vessels and expand the shipping capacity of the entire region.

Not only is the infrastructure improving, but other financial barriers are also falling. Ryan Lawrence, VP of Shell NA LNG, noted at a recent conference that the length and volume of LNG contracts are beginning to decrease. Lawrence presented figures showing that the length of contracts has already decreased by almost 50% from 2014 to 2017. Credit rating requirements have also loosened in the last two years. This is significant.  In the past, only market players with great credit and deep pockets were able to procure LNG contracts, which typically spanned twelve or more years and involved large volumes. These developments indicate that barriers to market entry are decreasing on both the supply and demand sides, allowing smaller players to enter the LNG market.

Caribbean LNG Demand is Here to Stay

LNG is coming of age in the Caribbean for the same reasons as throughout the rest of the world. Natural gas burns cleaner than other fossil fuels, thereby pleasing environmentally concerned citizens. Natural gas is also a better fuel in terms of energy delivered per dollar: at a price of $2.75/MMBtu, a dollar of natural gas buys roughly 365,760 Btu of energy, while at a price of $74.80/barrel, a dollar of oil buys roughly 77,537 Btu.
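The comparison above works out to energy delivered per dollar. A quick sketch of the arithmetic, assuming the standard conversions of 1 MMBtu = 1,000,000 Btu and roughly 5.8 MMBtu of energy per barrel of crude oil (both conversion factors are assumptions; the results land close to the figures quoted above):

```python
# Energy purchased per dollar at the quoted prices.
GAS_PRICE = 2.75            # $/MMBtu
OIL_PRICE = 74.80           # $/barrel
BTU_PER_MMBTU = 1_000_000   # by definition
BTU_PER_BARREL = 5_800_000  # ~5.8 MMBtu per barrel of crude (assumed)

gas_btu_per_dollar = BTU_PER_MMBTU / GAS_PRICE
oil_btu_per_dollar = BTU_PER_BARREL / OIL_PRICE

print(f"Natural gas: {gas_btu_per_dollar:,.0f} Btu per dollar")
print(f"Crude oil:   {oil_btu_per_dollar:,.0f} Btu per dollar")
```

Any small differences from the figures in the text come down to the exact heat-content conversions assumed.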

The growth of renewable energy is also increasing natural gas demand. Although many communities hope to be 100% renewable, the intermittent nature of these energy sources requires that grid operators have quick-starting, fast-ramping resources to turn on as the sun sets or the wind falters. Often, these flexible generation resources are fueled with natural gas.

Need for Market Analytics

In the graph below, Brent crude oil is projected to go from around $12/MMBtu to $16.25/MMBtu over the next 15 years. Over that same period, the price of natural gas is projected to go from around $3/MMBtu to $4.10/MMBtu. This makes LNG a more economic option for electricity generation, assuming the required infrastructure is in place. Does this mean that all thermal asset owners should start investing to convert their assets to run on natural gas? Should new assets be built to be dual-fuel from the get-go?

The price graph below may make it seem like adapting an asset is a sure bet, but asset owners should carefully consider risk around price projections. What happens if there is another “polar vortex,” like the severe winter storm of 2014, or if it snows in Texas again, like it did in 2017? Since the price of natural gas is firmly tied to demand in the United States, events like these can cause spikes in the price of natural gas that could easily make LNG unaffordable.

In the Caribbean, being mindful of risk around LNG investments will be important. The region is just now seeing the economics turn in its favor, but there are many sources of risk and the margins are still slim. To properly assess that risk, gas prices need to be simulated stochastically. Unlike deterministic models, which evaluate a single price path or perhaps a few scenarios, stochastic models produce a distribution of prices that captures a large range of market conditions. This allows prospective investors to consider returns even in the worst-case scenario. Furthermore, a stochastic price simulation model can lead to better contracting strategies that hedge against price swings. In short, no business decision will be truly sound unless it accounts for volatility and uncertainty in commodity prices.
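As a minimal sketch of the stochastic approach, the snippet below simulates many gas price paths with geometric Brownian motion and reports a percentile range of year-end outcomes. The starting price, drift, and volatility are illustrative assumptions, not calibrated market parameters:

```python
import random
import math

random.seed(42)  # reproducible illustration

S0 = 3.00      # starting gas price, $/MMBtu (assumed)
mu = 0.02      # annual drift (assumed)
sigma = 0.35   # annual volatility (assumed; gas is volatile)
steps = 252    # daily steps over one year
dt = 1.0 / steps
n_paths = 2000

# Simulate each path to year-end with geometric Brownian motion.
final_prices = []
for _ in range(n_paths):
    price = S0
    for _ in range(steps):
        z = random.gauss(0.0, 1.0)
        price *= math.exp((mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
    final_prices.append(price)

final_prices.sort()
p5 = final_prices[int(0.05 * n_paths)]
p95 = final_prices[int(0.95 * n_paths)]
print(f"5th-95th percentile of year-end price: ${p5:.2f} - ${p95:.2f}/MMBtu")
```

A deterministic model would hand back a single number; the percentile band is what lets an investor price the downside, not just the expected case.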


The viability of a robust LNG market comes down to economics. It seems that the economics favor LNG demand growth over the long term. Small-scale LNG in the Caribbean is starting to take off as well. This cleaner-burning fuel is being paired with a new wave of renewable energy and storage.  The result could be a more cost-effective, reliable, and environmentally friendly electricity grid.

As the present infrastructure issues are solved, the stage is set for LNG to become the fuel of choice in the Caribbean. Although there are many new opportunities in the LNG market, those considering them should make sure to do careful, risk-aware analysis first.

Sebastian Kadamany is the Manager of Latin America at, a SaaS energy analytics platform that helps companies understand their physical and financial exposure in today’s energy markets.

Are you ready to improve your energy portfolio analytics? Request your free demo today.


April 24, 2018

Moneyball for Energy Portfolio Analytics

Energy portfolio managers have an on-going dilemma.  When it comes to market and portfolio analytics, when is your analysis good enough?  Unfortunately, financial constraints often drive this decision.  I hear this all the time: “My management will not allow us to add another expensive solution, so we are doing that work on a spreadsheet.”

Does this describe your situation?

  • Energy portfolio is increasing in size or complexity
  • Current analytic tools do not capture the portfolio’s full value or risk
  • Need better portfolio analytic tools, but
  • Cannot add more staff
  • Cannot spend more money

Conclusion: I cannot improve my portfolio analytics.

Can the principles of Moneyball help?

Moneyball shot to attention in 2003 with Michael Lewis’s book “Moneyball: The Art of Winning an Unfair Game”, telling the tale of the Oakland Athletics baseball team and their incredible success against the odds, spearheaded by general manager Billy Beane.  Beane took a team on a limited budget to an American League record of 20 consecutive wins on the way to finishing top of the American League West in 2002.

The basic principle of Moneyball is to correctly value your own assets and other available assets in the market to construct the best outcome on a limited budget.

Like baseball, the energy market is an uneven landscape with big, rich companies and smaller resource constrained companies. Managers of smaller energy companies need to find better analytics to compete with larger energy companies.

The traditional methods of improving energy portfolio analytics are to (1) hire your own Ph.D. quants, then build and maintain in-house tools, (2) hire expensive consultants for analytic projects, or (3) buy expensive ‘enterprise’ software. These options require significant upfront costs, time, and a lot of optimism that they will work.

The most popular alternative to the options above is to maintain a simple spreadsheet and hope that you do not make a mistake.  The problem is that eventually you probably will overlook something important, and other market participants will be there to take advantage of you.  In fact, your counterparties may be taking advantage of you today.  Our platform offers the energy manager a new option to consider.  Our sophisticated analytics are available on-demand in our web application.  We offer analytic models for thermal & renewable assets, storage, load, market analysis, hedge valuation and risk management.  There is no waiting, no IT projects, and no hidden fees.

That’s why we like to say: we eliminate 100% of the waiting, 100% of the risk and 80% of the cost of adding sophisticated analytics into your management process.

Maybe it’s time to add more power to your portfolio analytics.  We can help you start winning big today.  Want to see examples of customer-driven analysis?  Read our Use Cases.

David Leevan is the Managing Director of, a SaaS platform for energy analytics.    

Are you ready to improve your energy portfolio analytics? Request your free demo today.


March 20, 2018

Duck with a Side of Energy Storage: Why batteries pair perfectly with high-penetration solar

In 2013, the California Independent System Operator (CAISO) first published the iconic “Duck Curve“, forecasting what would happen to the shape of hourly net electricity demand under high solar photovoltaic (PV) penetration scenarios. Since this initial publication, the situation in California has unfolded in remarkably similar fashion to what was forecasted. Mid-day solar generation has dramatically reduced the net electricity demand that must be supplied by fossil-fueled power plants. However, the large amount of mid-day solar generation begins to ramp down exactly as the evening electricity peak is ramping up. This results in a more pronounced afternoon-to-evening ramp and puts added strain on the more traditional power plants that must be relied on to pick up solar’s slack after the sun sets. By helping to reduce the strain on these generators, adding batteries and other flexible generation resources to the grid can help support the integration of high levels of solar PV while maintaining reliable electricity supply. What is a bit less obvious at first glance is that this trend also holds in reverse: high levels of solar PV on the grid actually make it easier to integrate high levels of battery storage. 

A recent report from the National Renewable Energy Laboratory (NREL) outlines exactly why this is the case. Essentially, batteries are great at providing power over short timeframes but less ideal for maintaining high power output over long periods of time. Put another way, batteries are well-suited to deliver power but are less adept at delivering energy. As increased solar penetration levels reduce system net demand on the front-end of the evening peak, the overall system peak in net demand becomes “sharper”. That is, the highest levels of electricity demand persist for a shorter amount of time than in the absence of solar generation. This allows batteries to more easily provide peak reduction benefits to the grid because they can discharge for a shorter period of time while achieving the same reduction in overall system peak demand.
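The peak-sharpening effect can be sketched numerically. The hourly load and solar profiles below are toy numbers, not CAISO data; the point is that subtracting mid-day solar shortens the stretch of hours near the daily peak, and with it the discharge window a battery must cover:

```python
# Toy 24-hour profiles in MW (purely illustrative, not actual system data).
load  = [28, 26, 25, 25, 26, 28, 32, 36, 38, 39, 40, 41,
         41, 41, 42, 43, 45, 48, 50, 49, 46, 40, 34, 30]
solar = [0, 0, 0, 0, 0, 1, 4, 8, 12, 15, 17, 18,
         18, 17, 15, 12, 8, 4, 1, 0, 0, 0, 0, 0]

# Net load is what conventional generators (or batteries) must actually serve.
net_load = [l - s for l, s in zip(load, solar)]

def hours_within(profile, margin):
    """Count hours with demand within `margin` MW of the profile's peak."""
    peak = max(profile)
    return sum(1 for x in profile if x >= peak - margin)

print("Hours within 5 MW of peak, gross load:", hours_within(load, 5))
print("Hours within 5 MW of peak, net load:  ", hours_within(net_load, 5))
```

With these assumed profiles the near-peak period shrinks from five hours to four: a battery sized to shave the net-load peak can discharge for fewer hours while achieving the same reduction in system peak.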


The overall dynamic between solar and batteries, as the authors note, is one of synergism. In the language of system dynamics, it’s a reinforcing feedback loop. Increases in battery storage and flexible generation on the grid allow more solar to be integrated, which in turn increases the value proposition of battery storage and flexible generation. That is, batteries are good for renewable energy is good for batteries–and round and round we go. 

There’s also a third component to this dynamic that the NREL report doesn’t really address: market prices. To date, the Duck Curve has already led to decreases in the mid-day price of electricity in California, with marginal wholesale electricity prices frequently dipping below zero. This means that generators (regardless of type) that put electricity onto the grid may actually have to pay to do so. In two recent articles, we discussed how the net load dynamic that results from high levels of renewable generation is killing the economics of the baseload operational paradigm where large centralized generators run at constant output for long periods of time. However, battery storage has the unique ability to increase both electricity production and electricity consumption on the grid, albeit at different times. If properly scheduled, the increased electricity demand from batteries could become a powerful stabilizing force for both the electricity demand curve and, correspondingly, for market prices.
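A minimal sketch of that stabilizing role, using assumed hourly prices with a negative mid-day trough: a battery charges in the cheapest hours, adding demand exactly when prices sag, and discharges into the evening peak. The greedy schedule below ignores state-of-charge sequencing and is purely illustrative:

```python
# Illustrative hourly prices ($/MWh); the negative mid-day values mimic the Duck Curve.
prices = [35, 30, 28, 27, 29, 33, 40, 30, 10, -2, -5, -4,
          -3, 0, 12, 25, 45, 70, 95, 90, 65, 50, 42, 38]

CAPACITY_MWH = 4  # hours of storage at 1 MW
POWER_MW = 1

# Greedy schedule: charge in the 4 cheapest hours, discharge in the 4 priciest.
# (A real dispatch must also respect charge-before-discharge ordering.)
order = sorted(range(24), key=lambda h: prices[h])
charge_hours = set(order[:CAPACITY_MWH])
discharge_hours = set(order[-CAPACITY_MWH:])

cost = sum(prices[h] * POWER_MW for h in charge_hours)
revenue = sum(prices[h] * POWER_MW for h in discharge_hours)
print(f"Charging cost:     ${cost} (negative means being paid to consume)")
print(f"Discharge revenue: ${revenue}")
print(f"Arbitrage margin:  ${revenue - cost}")
```

In this toy case the battery is paid to absorb the negative-price hours, which is precisely the demand-side pressure that would pull mid-day prices back toward zero as storage capacity grows.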

As the grid transforms and market expectations evolve, we are likely to see dramatic shifts in what becomes “normal” for wholesale electricity market prices. Stabilization of the hourly shape of electricity demand could mean reduced energy prices and less market volatility. Organizations with physical assets or long-term power purchase agreements could see the erosion of value in their existing assets and contracts. Load serving entities could find it more economically optimal to purchase power on the spot market than to fire up aging generators, forcing these units into early retirement.

Whatever these fundamental shifts in electricity supply will bring, businesses with a stake in the game should carefully monitor the value of their assets and actively pursue downside protection through targeted risk mitigation strategies. Those that fail to leverage today’s ever-more-accessible data analytics to monitor their portfolios amid a changing market may find themselves eating crow while the rest of the industry sits down to duck.

Brock Mosovsky, Ph.D., is Director of Operations and Analytics at, a SaaS energy analytics platform that helps companies understand their physical and financial exposure in today’s energy markets.

Need a solution for your energy analytics? We can help! Request your free demo today.


March 6, 2018

Nine Questions with Ivo Steijn, Head of Model Risk Management at Silicon Valley Bank

This past week, I had the opportunity to sit down with Dr. Ivo Steijn. I have known Dr. Steijn for several years and always found his perspectives to be invaluable. In the following interview, he provides sage advice on energy analytics, model validation and how to see major market changes coming.

Dr. Steijn, you have had an impressive career in quantitative analysis & risk management for several firms in both Energy and Finance.  You have worked in senior roles at TXU, Southern California Edison, State Street and now most recently at Silicon Valley Bank (SVB).  Can you tell us what you are doing at SVB?

Sure, I head the Model Risk Management department.  We are responsible for the validation of all quantitative models in the company, together with all of the administrative superstructure that goes with that: our model inventory, change logs, etc.

What is your view of the relative sophistication in portfolio analytics and risk management between the energy & finance sectors?

I think the Finance world is a little ahead.  They got the whole portfolio approach to risk management served up to them in the 1950s by Markowitz, and it has been a standard paradigm there ever since.

For the Energy industry it’s a lot newer.  I also think the financial world throws more money at the problem.  They develop more systems; they hire more developers.  In the energy world it’s still fairly new.  There is not a lot of really good software, although that has been changing over the last couple of years.  So, energy is a little behind finance.

Do you think there are untapped competitive advantages in the energy sector for companies that want to do analytics better?

Oh absolutely!  In the first place, there are not a lot of good choices in the energy world.  That is strike number one.  In the second place, the choices that are available are fairly inflexible.  You buy a giant software package, and then you spend weeks installing it, more weeks to customize it and then it might not give you what you want.  There is certainly a need for a solution that is more tailored to a wide variety of customers, that people can use in different ways, that provides people with more options. 

What is a common oversight that you encounter in businesses which use quantitative models? 

The oversight that I most commonly encounter is thinking that once your model is okay, there is no model risk.  A lot of our work consists of hunting down bad models and fixing flaws in them.  But a major oversight is that even after the flaws are fixed, the model still has significant uncertainty around it.  It may contain estimated parameters, which is a best guess at an uncertain number. 

Your model is not something that is handed down to you on stone tablets on a mountain top.  It is a result of a messy statistical process which gives a fuzzy result at the end of it.  This has consequences for how we should be looking at models and using the output.      

Can you give us an example of model uncertainty?

My favorite example: let’s say you have a standard two-commodity portfolio of power & gas.  Your Monte Carlo price simulation process uses a correlation structure between power prices & gas prices.  The correlation structure is calibrated from historical data or from options.  That correlation parameter, used by the model, is often viewed from that point forward as a constant of nature.  It is not!  There is a lot of uncertainty around it.  Once you start taking the uncertainty of critical model parameters into account, the results of your models can change.

How should analysts address model uncertainty? 

The easiest and fastest way is to play with sensitivity analysis by wiggling the parameters around to see if the model results change significantly.  This is easy to do, and we think that sensitivity analysis is something that everyone should do for every model.   The gold star solution is to start your Monte Carlo approach by doing Monte Carlo analysis on the model parameters themselves.  Generate a Monte Carlo parameter set and do your price simulations based on those simulated parameters.  Repeat this for each simulation. This is a much more comprehensive approach.
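That “gold star” approach can be sketched as a two-level Monte Carlo. All numbers below are illustrative assumptions: the outer loop draws the power-gas correlation from an assumed uncertainty distribution, and the inner loop simulates correlated return shocks and reads off a 95% loss quantile for a toy two-commodity portfolio:

```python
import random
import math

random.seed(7)  # reproducible illustration

def simulate_portfolio_loss(corr, n_sims=2000):
    """Simulate correlated power & gas return shocks; return the 95% loss quantile."""
    losses = []
    for _ in range(n_sims):
        z1 = random.gauss(0, 1)
        z2 = corr * z1 + math.sqrt(1 - corr**2) * random.gauss(0, 1)
        # Toy book: long power, short gas, so losses partially offset
        # when the two commodities move together.
        pnl = 0.10 * z1 - 0.08 * z2
        losses.append(-pnl)
    losses.sort()
    return losses[int(0.95 * n_sims)]

# Outer level: treat the estimated correlation itself as uncertain.
var_estimates = []
for _ in range(200):
    corr = min(0.99, max(-0.99, random.gauss(0.6, 0.15)))  # assumed estimate +/- error
    var_estimates.append(simulate_portfolio_loss(corr))

var_estimates.sort()
print(f"VaR with correlation fixed at 0.6:  {simulate_portfolio_loss(0.6):.3f}")
print(f"VaR range across uncertain corr:    {var_estimates[0]:.3f} - {var_estimates[-1]:.3f}")
```

The spread between the fixed-parameter answer and the range across sampled correlations is exactly the model uncertainty Dr. Steijn describes: a single VaR number hides it, while the two-level approach surfaces it.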

Should companies be worried about structural changes in energy markets? 

If they are not worried about it, they are not reading the newspapers.  Anyone working in energy, particularly in CA where I worked for a long time, knows that gas fired power plants used to be the marginal asset.  You worried about gas and power prices, and that was it.  These days, for many hours each day, renewables are the marginal asset.  That completely changes your price process.

This structural change snuck into our portfolios gradually.  Other structural changes can come up faster, but this one was more gradual and it is spreading to the rest of the country.  This is a major regime change that has consequences. You basically have to redo your whole portfolio analysis.   

How can we see these regime changes coming?

The standard truth about regime changes is that nobody sees them coming.  I think that, for some regime changes, you actually can see them coming, you just don’t know when they will arrive.  My advice is to do your modeling work in advance.  Model the new world where renewable power is the marginal asset and then play with different scenarios around when the regime change occurs.  That will give you a more realistic view of what the long-term outlook for your portfolio may be.  You don’t know if it’s coming this year or the next, but you know it’s going to happen.  Well, start running your scenario analysis now. 

Finally, what is your view of how technology changes such as cloud computing is affecting the analytics landscape?

The old paradigm of a giant piece of software sitting somewhere in your building is changing.  Increasingly, we are working with very thin client solutions and accessing all of our computational machinery via the cloud.  Why would you keep all that computational machinery at your company?  It’s just not efficient.  I don’t see a lot of this kind of development happening yet, but it’s happening more and more.  The old paradigm of locally installed software, I think that is going away.

Ivo Steijn is Senior Director, Model Risk Management for Silicon Valley Bank where he is responsible for all model validations and chairs the Model Risk Management Committee. Prior to joining Silicon Valley Bank he was a VP in Model Risk Management at State Street in Boston, and he headed the Model Validation department at Southern California Edison for 12 years. He holds a MA and PhD in Econometrics from the Free University in Amsterdam, the Netherlands.

This interview was conducted by David Leevan. David is the Managing Director of, a SaaS platform for energy analytics. 

February 15, 2018

Renewables Have Killed Baseload…Now What?

***This article is the second in a two-part series on the effects of high renewable penetration on thermal generation operational paradigms. The first article discussed how intermittent renewable generation has transformed the grid to the point where many combined cycle plants are unable to operate profitably.***

So baseload power is on its way out and intermittent renewable generation is on its way in. There’s been a lot of buzz about how this represents a “tremendous transformation” or a “historic transition“, and the trend has even received attention in the national political arena with the controversial Grid Resiliency Pricing Rule proposal of late 2017, since defeated by the Federal Energy Regulatory Commission. Suffice it to say there’s been a lot of rhetoric around the declining relevance of baseload electricity supply. However, there has been little practical insight or guidance offered to organizations with a stake in the energy game—precisely those that will be most affected by the transformation. For example:

  • What does the end of the baseload supply paradigm mean for energy companies and purchasers of long-term renewable PPAs?
  • How can companies participate in the renewable energy revolution without being destroyed by it?
  • And what role should data analytics and energy risk management play throughout the impending transition?

In this article, we offer some practical perspectives on these questions and discuss what the death of baseload power really means both today and in the near future. For more on how and why baseload is becoming increasingly irrelevant, be sure to check out the first article in this two-part series.

Long-standing structural energy market dynamics will be challenged

You can’t talk about the future of electricity supply without addressing the future of electricity demand, and we’ve observed an important trend over the better part of the past decade that is becoming harder and harder to dispute: demand growth has stagnated. Continued growth in electricity demand is a long-standing assumption that has been baked into electric utility supply planning and budget forecasting for decades. A late 2016 report by Lawrence Berkeley National Laboratory includes a striking figure that illustrates just how resistant supply planners are to giving up the load growth assumption (see Figure 1 below). Challenging this assumption will provide an important perspective that can ultimately inform an outlook on future wholesale electricity prices.



Figure 1. Comparison of forecasted and actual customer demand for a utility in the western U.S. Image credit: LBNL Report

When stagnant electricity demand is viewed in the context of the evolving electricity supply stack that is undergoing rapid expansion of renewable generation capacity and significant retirement of fossil-fueled generation, additional long-held market assumptions come into question. One of these is the assumption that natural gas and electricity prices will tend to move in the same direction at the same time. The ratio of the electricity price in $/MWh to the natural gas price in $/MMBtu is referred to as the “implied market heat rate”. Measured in MMBtu per MWh, the market heat rate represents a measure of generation efficiency akin to a miles-per-gallon rating for a car. As market prices fluctuate, the real-time market heat rate loosely reflects the efficiency of the generator called on to provide the “next megawatt”, or the marginal unit of power to satisfy electricity demand. Since this marginal unit often runs on natural gas, the price of natural gas determines the cost of production, and correspondingly influences the marginal price of electricity. 
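The implied market heat rate described above is a simple ratio; a minimal sketch (with illustrative prices, not market data) looks like this:

```python
def implied_market_heat_rate(power_price_mwh, gas_price_mmbtu):
    """Implied market heat rate in MMBtu/MWh:
    electricity price ($/MWh) divided by natural gas price ($/MMBtu)."""
    return power_price_mwh / gas_price_mmbtu

# e.g., $35/MWh power against $3.50/MMBtu gas implies a 10 MMBtu/MWh heat rate,
# roughly the efficiency of an older gas-fired unit on the margin
print(implied_market_heat_rate(35.0, 3.50))  # → 10.0
```

Tracking this ratio hour by hour is one way to spot the decoupling discussed next: hours where the implied heat rate collapses suggest a non-gas resource is setting the marginal price.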

However, as the electricity supply stack evolves, there is increasing likelihood that the marginal power needed to satisfy demand could be provided by a generation unit that does not run on natural gas, e.g., by renewables, batteries, or hydroelectric generation. In this case, the price of natural gas may have little to do with the marginal cost of power, causing electricity and natural gas prices to decouple. While it’s not likely the decoupling effect will persist across all hours of the day anytime soon, even its occurrence across a few hours when renewable generation is peaking could produce significant financial consequences for organizations with natural gas or power positions on the books. This includes power producers, load serving entities, and counter-parties to physical or financial power purchase agreements (PPAs). Such a break-down of market heat rates would directly contradict fundamental assumptions that underpin many long-term energy procurement and physical/financial hedging strategies. As such, these strategies should be reevaluated within the context of a market where baseload generation has become obsolete.

Wholesale electricity market prices over the coming decades are extremely uncertain

Demand stagnation and decoupling of market heat rates are virtually unprecedented dynamics in wholesale energy markets, and the market’s response as these trends become more pronounced is anyone’s guess. In particular, long-held assumptions that electricity prices will continue to rise into the future should be viewed as highly suspect. In fact, it is possible that wholesale electricity prices could actually fall in the coming years. The combination of demand stagnation, added renewable generation with little to no production cost, and continued retirement of outdated and inefficient fossil generation implies that the average cost of generating electricity will likely decrease.

As the grid evolves in response to the changing supply paradigm, new transmission and distribution costs, or other costs associated with managing grid reliability, may partially offset declines in the cost of electricity itself. However, long-term agreements that reference wholesale electricity prices may not capture these added grid-based costs, resulting in significant downside risk to counterparties. For example, many corporations today are signing long-term renewable PPAs that are financially settled against wholesale electricity prices at a particular grid location. In a scenario of falling electricity prices, these PPA buyers could see themselves making significant monthly payments for the losses incurred on their contracts. As a result, any purchase decisions that hinge on a long-term forecast of electricity prices should incorporate analysis that challenges the conventional notion that these prices will increase over time.
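The downside mechanics for a financially settled PPA buyer can be sketched in a few lines. This is a simplified illustration (the hourly prices and volumes are hypothetical, and real contracts add shaping, basis, and settlement-interval detail): each interval the buyer receives the wholesale price minus the contract strike on the delivered volume, so falling prices turn the contract into a recurring payment obligation.

```python
def ppa_settlement(strike, wholesale_prices, volumes):
    """Net settlement to the PPA buyer: (wholesale - strike) per MWh.
    A negative total means the buyer pays out."""
    return sum((p - strike) * v for p, v in zip(wholesale_prices, volumes))

# Illustrative 30-day month: $30/MWh strike, wholesale averaging $22/MWh,
# 100 MWh delivered each hour
prices = [22.0] * 720
volumes = [100.0] * 720
print(ppa_settlement(30.0, prices, volumes))  # → -576000.0 (buyer owes $576k)
```

Running the same calculation across many simulated price paths, rather than one flat scenario, is what turns this into a genuine risk assessment.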

Generation flexibility will be highly valuable

Because renewable generation facilities can exhibit dramatic fluctuations in production output on short timescales, there will be a serious need for dispatchable generation that is responsive enough to make up the difference. These generation facilities must be able to ramp up and down quickly and to cycle on and off frequently in response to highly variable renewable production output. Such desirable operational traits have become synonymous with the term “generation flexibility”, and they will be highly valued in an era without traditional baseload generation. The increased emphasis on flexibility means that generators with high fixed startup costs incurred each time the unit turns on will be at a disadvantage. Additionally, generators whose efficiency significantly degrades as they ramp down will have a difficult time remaining profitable. Essentially, the same dynamic that is hurting baseload units today will be taken to its logical conclusion, and will eventually elbow out all but the most responsive technologies for delivering power to the grid.

The elephant in the room here is–you guessed it–batteries, or more broadly, energy storage. Batteries, flywheels, and other forms of energy storage can be extremely responsive, providing almost instantaneous power when needed. However, there is an important interplay between a storage technology’s ability to provide power and its ability to provide energy. For example, a battery may be able to provide a large amount of power to fill a short-lived gap in renewable energy generation caused by a large cloud passing over a solar facility or a momentary lapse in wind. However, using batteries to supply the total energy needed to fill a gap in solar production from an entire day of rain showers or snowfall is much more difficult. Batteries and other storage technologies have come a long way in recent years, but their ability to provide sustained power output over long periods of time still has a long way to go before they represent viable solutions for completely replacing flexible fossil generation. At least for the foreseeable future, there will be a significant need for highly-flexible fossil-fueled generation to ensure electricity demand in a baseload-free era is always met with adequate supply.
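The power-versus-energy distinction above reduces to two separate constraints, and a toy check makes the difference concrete. The ratings below are hypothetical, chosen only to illustrate why a battery that easily handles a cloud transient fails on a rainy day:

```python
def can_cover_gap(power_mw, energy_mwh, gap_mw, gap_hours):
    """True if a storage unit can fill a generation gap of gap_mw
    lasting gap_hours: it must satisfy both the power constraint
    (instantaneous output) and the energy constraint (total MWh)."""
    return power_mw >= gap_mw and energy_mwh >= gap_mw * gap_hours

# A 100 MW / 200 MWh battery covers a 30-minute, 80 MW cloud transient...
print(can_cover_gap(100, 200, 80, 0.5))   # → True
# ...but not an 80 MW solar shortfall lasting a 10-hour rainy day (800 MWh needed)
print(can_cover_gap(100, 200, 80, 10.0))  # → False
```

The energy constraint, not the power constraint, is what keeps flexible fossil generation in the picture for multi-hour and multi-day gaps.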

Data-driven energy risk management will be more critical than ever before

The impending fundamental shifts in electricity supply and demand and the corresponding impacts on market prices all mean that uncertainty and risk will permeate every aspect of the energy sector, from physical operation management to financial decision making. An intimate relationship with risk is nothing new for participants in electricity markets where wholesale prices that normally hover around $30/MWh can spike as high as $9000/MWh in the blink of an eye. However, in response to aggressive corporate sustainability targets and pressure to display environmental stewardship, there has been a growing trend of new businesses directly procuring long-term energy through PPAs. In many cases, the core business function of these companies is not directly tied to the energy markets, and they often do not have the in-house expertise to properly evaluate and continually monitor risk associated with their purchases. These organizations are particularly exposed to downside risk resulting from wrong-way market moves or breakdowns in long-standing structural market relationships. As more and more companies begin interacting directly with volatile energy markets, they will experience a growing need for sophisticated energy risk management and targeted hedging strategies that protect them from downside exposure.

Even for seasoned veterans of the energy industry, the fundamental shifts resulting from the end of baseload generation as we know it mean that many of the risk management strategies and assumptions they are familiar with will break down. Developing and testing new risk management strategies that leverage all data available will require access to sophisticated modeling capabilities and sound analytical methods. Models must be flexible enough to accommodate rapid scenario analysis custom-tailored to an organization’s particular portfolio. Just as flexible generation will be essential to reliable electricity supply, flexible analytics will be essential to maintaining long-term profitability and solvency throughout the energy transformation. 

So renewables have killed baseload…now what? Now we adapt. At, we’re helping organizations adapt to the ever-changing energy landscape with our industry-leading cloud-based energy analytics platform. From renewable PPA valuation, risk assessment, and ongoing value monitoring to thermal asset valuation, cash flow reporting, and hedge analysis, we provide access to the advanced analytics your organization needs in an era without baseload generation.

Need a solution for your energy analytics? We can help! Request your free demo today.

Request Your Demo Today                                                                                                                 

January 29, 2018

Energy Risk Management: Five Steps to Improve Your Process

Choose the right risk metric.

Energy portfolios vary greatly from company to company and location to location.  No two energy portfolios are the same, and risk management strategies must be tailored to the unique risk factors of each portfolio.

Many companies use a traditional Value-at-Risk (VaR) metric to report risk to their Board and to highlight risk to the portfolio in quarterly financial reports.   But what action does your organization actually take if your VaR goes up by 25% in one week?  If you don’t have a great answer, VaR may not be right for your portfolio.  At, we use both VaR and Cash Flow at Risk (CFaR) methodologies to assess and report portfolio risk, depending on the type of portfolio and the goal of the risk reporting.  Here is a simple comparison of the two:

Which is more appropriate, VaR or CFaR?

  • VaR is a risk metric that provides not only the value the portfolio could lose over a given time interval in a “worst case scenario”, but also a measure of the statistical confidence around that estimate. For example, computing the VaR for a large portfolio of financial contracts may suggest there is a 95% chance that portfolio will not lose more than $1M over the next 5-day period.  VaR is most appropriate for assessing risk over short time periods (less than one month), particularly for positions that can be unwound in a few days.
  • CFaR is a more appropriate risk metric for companies with physical assets, customer load or complex structured transactions.  These types of portfolio elements cannot typically be sold or significantly modified in response to short-term market moves. CFaR provides a detailed analysis of cash flows through time based on many simulations of possible future states (more on simulations in item #3). The result is a distribution of future value and costs for any time bucket (day, week, month or year) on any portfolio item (gen asset, storage, customer, deal, commodity, counter-party), or for the portfolio as a whole.  The results can be proactively used to reduce risk, lower cost and improve portfolio performance.
  • Why use both?  Many energy portfolios contain both long-term and short-term assets.  Using both CFaR and VaR can provide advantages for more diverse portfolios.  Additionally, more sophisticated VaR models allow users to slice through portfolio dimensionality to report net position by commodity, value at risk by trader or counter-party, and other useful metrics that can be used to surgically target and mitigate risk from specific portfolio components.  Taken together, combined VaR and CFaR analysis can enhance active risk and portfolio management.
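The VaR figure in the first bullet can be reproduced from a simulated P&L distribution. This is a minimal historical/Monte Carlo-style sketch with made-up parameters (a normal 5-day P&L with a $600k standard deviation, chosen so the 95% VaR lands near the $1M example above); production VaR models use far richer distributions:

```python
import random

def value_at_risk(pnl_samples, confidence=0.95):
    """VaR at the given confidence level: the loss that is not exceeded
    with that probability, reported as a positive number."""
    losses = sorted(-p for p in pnl_samples)        # losses, ascending
    idx = int(confidence * len(losses)) - 1
    return max(losses[idx], 0.0)

random.seed(42)
# Illustrative 5-day portfolio P&L: zero mean, $600k standard deviation
pnl = [random.gauss(0.0, 600_000.0) for _ in range(10_000)]
var95 = value_at_risk(pnl, 0.95)
print(f"5-day 95% VaR: ${var95:,.0f}")  # roughly $1M for these parameters
```

A CFaR analysis would instead bucket each simulated path's cash flows by month or year before taking the percentile, which is why it suits assets and load that cannot be unwound quickly.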

Choose the right risk factors.

What is driving the risk to your particular portfolio?  Price volatility, weather uncertainty, basis risk, customer migration, congestion, government policy or regulation?  Understanding the underlying “risk factors” in your portfolio is crucial to modeling your risk appropriately.  As your portfolio responds to changes in energy markets and supply/demand dynamics, even the sharpest intuition can mis-identify the riskiest portfolio elements.  A good risk management process can help keep the focus on what poses the greatest threat to your company financials.

Choose a Great Simulation Model.

It seems that everyone has a simulation model these days.  At the simplest level, these are Excel plug-ins, closed form models or a bunch of legacy code from some analyst who used to work at your company.  Here is the danger – a poor simulation engine can be much worse than having nothing at all.  At least when you have nothing, you know it.  A poor model may add uncertainty in a way that is not market-consistent, may misrepresent important relationships between related commodities (e.g., correlation between power and gas prices), or may simply drift out of calibration as market conditions evolve.  Inaccurate simulations can result in actions that are completely inappropriate for your real risks.  It’s important to validate any simulation model against actual market data regularly to ensure it’s doing its job.
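To make the validation point concrete, here is a stripped-down sketch of a mean-reverting price simulation, a common starting point for commodity spot prices, together with the kind of sanity check the paragraph above recommends. All parameters are illustrative, and a real engine would calibrate them to market data rather than hard-code them:

```python
import random

def simulate_mean_reverting(p0, mean, speed, vol, n_steps, n_paths, seed=0):
    """Discrete mean-reverting (Ornstein-Uhlenbeck-style) price paths.
    Each step pulls the price toward `mean` at rate `speed` and adds
    normally distributed noise with standard deviation `vol`."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        p, path = p0, [p0]
        for _ in range(n_steps):
            p += speed * (mean - p) + vol * rng.gauss(0.0, 1.0)
            path.append(p)
        paths.append(path)
    return paths

# Basic validation: the long-run simulated average should sit near the
# mean-reversion level, or the model has drifted out of calibration.
paths = simulate_mean_reverting(p0=30.0, mean=35.0, speed=0.05, vol=1.5,
                                n_steps=500, n_paths=200)
terminal = [path[-1] for path in paths]
avg = sum(terminal) / len(terminal)
print(round(avg, 1))  # should land near 35
```

Comparing simulated moments (mean, volatility, correlations) against observed market statistics on a regular schedule is the cheapest insurance against the silent miscalibration described above.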

Include everything in your model. 

Too many companies get lazy with this.  “My model includes about 90% of our portfolio, but the more complex transactions we value in a separate process.”  It may be the case that those exotic transactions are driving an outsized portion of your risk!  Recognize that big risks can come in small packages and allocate staff (or vendor) time to include all transactions into your model to get a comprehensive and accurate risk analysis.

Turn results into actions. 

Surprisingly, this is the most common gap.  Perhaps you have done all the hard work to set up a comprehensive risk management system, but the results still aren’t affecting real portfolio decisions.  Try holding monthly risk committee meetings that include your financial, trade and asset managers along with at least one C-suite sponsor.  The volatility in energy markets is the risk manager’s best friend.  The next Polar Vortex or Bomb Cyclone is lurking just around the corner.  Eventually, you’ll be able to say, “I told you so”, and win the day.



Start Your Risk Management Analysis Today

This article was written by David Leevan. David is the Managing Director of, a SaaS platform for energy analytics. 

January 22, 2018

cQuant Insight: GE Rings Funeral Toll for Baseload Generation

***This article is the first in a two-part series on the effects of high renewable penetration on thermal generation operational paradigms. The next article will discuss how utility supply planners, generation asset managers, and renewable PPA purchasers can leverage today’s best practices in energy analytics to position their portfolios for a future without baseload power.*** 

The fact that our generation mix is transforming is no surprise. What may surprise you, however, is just how transformed it’s already become. General Electric, a global leader that produces some of the world’s most efficient natural gas combined cycle generating technology, is already finding its plants uneconomical in electricity markets with high renewable penetration. A recent post on GE’s blog cites intermittent patterns in renewable generation as the main cause for this operational transformation.

It’s no secret that renewable energy has been coming onto the grid at an ever-accelerating pace. Over the past decade, technology costs for solar and wind have experienced dramatic declines and consumers have advocated with increasing fervor for “green” sustainable energy. The result has been a focus on renewable generation for utility-scale capacity expansion like never before. Since 2014, renewables have accounted for more installed capacity in the U.S. than all other forms of generation combined, while at the same time over 40 gigawatts of outdated coal and natural gas fossil generation have retired. The way electricity is generated in both the U.S. and around the world is rapidly transforming.

One of the most insightful windows into this transformation is the way thermal generation technology is being forced to respond. Combined cycle natural gas plants used to run for long periods of time to satisfy baseload electricity demand. Baseload demand is essentially the lowest point on the daily and weekly cyclic patterns of electricity demand in a given region; that is, it’s the amount of demand that can always be counted on no matter the time of day or day of week. When renewable generation peaks, it reduces the amount of “net demand” that fossil-fueled generators are needed to satisfy. When that net demand begins dropping below the capacity of some of the baseload power plants themselves, these plants have to ramp down to follow suit, or, in some cases, shut down completely. This becomes extremely expensive, since fossil plants often incur high fixed costs each time they start back up and usually run at significantly lower efficiency when ramped down below their maximum generation capacity. The increased operational costs of cycling can quickly eat away at profits for once-baseload generating facilities, forcing them into early retirement.
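The net-demand squeeze described above is easy to see numerically. The demand and solar profiles below are hypothetical, chosen only to show how a midday renewable peak drives net demand below a baseload plant's operating level:

```python
def net_demand(demand_mw, renewable_mw):
    """Net demand the thermal fleet must satisfy after renewable output."""
    return [d - r for d, r in zip(demand_mw, renewable_mw)]

# Illustrative day: flat 1,000 MW demand, solar peaking at 1,200 MW midday
demand = [1000.0] * 24
solar = [0] * 8 + [400, 800, 1100, 1200, 1200, 1100, 800, 400] + [0] * 8
nd = net_demand(demand, solar)

# Hours where net demand falls below a 300 MW baseload plant's output level,
# forcing it to ramp down or shut off entirely
print([h for h, x in enumerate(nd) if x < 300])  # → [9, 10, 11, 12, 13, 14]
```

Each of those midday hours represents a ramp-down (and later a costly restart) that the baseload operating paradigm was never designed to absorb.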

Here’s the kicker: this dynamic is already happening. GE’s article notes that the Irsching combined cycle plant in Germany was forced to close despite being the most efficient power plant in the world at the time. Despite its high efficiency under baseload operation, it was simply not flexible enough to operate within Germany’s electricity supply stack, which had been transformed by large quantities of wind and solar generation. Other European markets are seeing similar baseload-destroying trends, such as the UK, where the country’s largest energy services provider has noted it’s giving up on combined cycle generation altogether. In these markets, it’s not an exaggeration to say that the very concept of baseload power has become obsolete.

So how much renewable generation does it take to kill baseload power? If you want to see the energy issues the U.S. will likely face ten years from now, just look at Western Europe today. Take Germany, for example: today, roughly 27% of Germany’s electricity comes from renewable energy, whereas this number was just 9% a decade ago. According to the U.S. Energy Information Administration, renewable energy accounted for about 15% of U.S. electric generation in 2016. With recent trends showing an acceleration in renewable generating capacity, it’s likely we’re already well within a decade of where the German and British electricity supply stacks are today.

The U.S. is firmly on a path toward a future where baseload power is a thing of the past. Now, with GE second-guessing its own efficient baseload technology, the funeral dirge for baseload power in the U.S. may have already begun:

“And therefore never send to know for whom the bell tolls; it tolls for thee, Baseload. It tolls for thee.”

 Read Full Original Article Here: Evolution of Combined Cycle Performance: From Baseload to Backup


January 18, 2018

Energy Analytics Revolution is Here!

I have been writing recently about the risks associated with buying enterprise software, advances in cloud computing, and the benefits of software-as-a-service. I have tried to keep those articles fairly vendor agnostic. This article is going to be more self-serving. I am basically going to describe why we at decided to build a new type of energy analytics company.

What is wrong with the current choices? How can the energy industry benefit from something new?

As many of you know, energy companies are driving their business with analytic decision making. The best energy companies spend millions of dollars on analytic solutions and are constantly looking for ways to improve. Smaller organizations often outsource analytic decisions to outside firms because they cannot afford to spend much on analytic solutions and analytic staff.

 Why is this happening? Is there another way?

Let us first examine the current options for adding or improving your analytics. There are basically three choices that you have as a customer:

  1. Build it yourself – this requires the customer to go into the software business. The company must hire staff to build the quantitative models, the UI, the database, the system integrations, etc. Then maintain and improve it over time.
  2. Hire a consultancy – this is outsourcing the software development. Now you must pay consulting fees for the life of the project, which is often quoted in months but requires years to complete (funny how that works).
  3. Buy a vendor solution – analytic software vendors normally have solutions that are both enterprise ready and highly configurable. Unfortunately, they also require large upfront fees, deployment projects, and annual maintenance contracts.

These options require the buyer to shoulder large upfront risk. Push your money in the middle of the pot and hope that it all works out. To address this risk, many energy companies employ bureaucratic procurement processes.

It can take years to get the solution in place.

We decided that the energy industry needed another choice. Our goal was to eliminate 100% of the risk, 100% of the waiting and over 80% of the cost associated with the existing choices.  So, we began working with top quantitative experts to build sophisticated energy models, and we deployed those models on a modern cloud computing platform.

Now energy analysts have a place to go to find and use analytic models – on demand! Our models can be used for a day, a week, a month, or a year. We are adding new models every few months.

In truth, we are just at the beginning of our journey. Our team has tremendous plans to revolutionize the energy analytic landscape. You deserve more and better choices. We hope to be a part of this revolution.

This is just the beginning and I hope that you will join us! 

This article was written by David Leevan. David is the Managing Director of, a SaaS platform for energy analytics. 


January 7, 2018

“Bomb Cyclone” Highlights Need for Energy Risk Management

For many on the east coast, the end of 2017 brought more than Yuletide cheer and presents. A weather event called a “bomb cyclone” brought over a foot of snow followed by weeks of cold weather that set new record-low temperatures across much of the northeastern United States. Many states experienced temperatures 20-30 degrees below normal, and strong winds plunged apparent temperatures even lower.

As icy conditions lingered, power outages struck tens of thousands of people and heating demand for natural gas strained the distribution system to its limit. The result was record high natural gas demand, record high natural gas prices at several eastern trading hubs, and correspondingly high electricity prices in areas reliant on natural gas as a primary fuel for generation.

The severe weather event harkens back to early 2014, when a downward shift in the “polar vortex” brought similar weather conditions to the U.S. and similar turmoil to eastern energy markets. Then too, the natural gas system was stressed due to prolonged above-normal heating demand, gas prices spiked, and energy prices followed suit. The 2014 event left many unsuspecting energy companies financially wounded or bankrupt.  This year will be no different.

Companies with unprotected short positions in natural gas or power can be bankrupted in a matter of days when these severe weather events occur. To put the extreme prices in perspective, late December 2017 forward contracts showed the expected price of January gas at the Transco Z6 hub in NY to be around $6. However, on January 4, 2018 prices traded as high as $175, an almost 3000% increase above the prior expectation! Imagine if, just for one month, your mortgage payment suddenly increased by 3000%, turning a $2000 monthly payment into $60,000.
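The percentage quoted above follows directly from the two prices in the paragraph; a quick check of the arithmetic:

```python
forward_expectation = 6.0   # late-Dec 2017 forward for Jan gas at Transco Z6, $/MMBtu
spot_peak = 175.0           # price traded on January 4, 2018, $/MMBtu

pct_increase = (spot_peak - forward_expectation) / forward_expectation * 100
print(f"{pct_increase:.0f}% above expectation")  # → 2817%, i.e., almost 3000%
```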

For energy companies exposed to volatile natural gas and electricity spot markets, the way to protect against “bomb cyclones” and “polar vortexes” is by savvy financial hedging. This means taking financial positions in the market to offset some or all of their expected market exposure, locking in rates or providing optionality to protect them when prices take a turn for the worse. In turn, understanding “expected future market exposure” can be complex and requires rigorous analysis accounting for a company’s unique portfolio of contractual commitments, physical assets, and in-place financial positions.
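The offsetting effect of a hedge can be sketched with a simple fixed-for-floating swap. The volumes and prices below are hypothetical (the $6 fixed price mirrors the forward expectation above), and real hedge programs layer in options and partial hedge ratios:

```python
def hedged_cost(volume_mmbtu, spot_price, swap_fixed_price, hedge_ratio=1.0):
    """Net cost of buying gas at spot, offset by a fixed-for-floating swap
    that pays (spot - fixed) on the hedged share of the volume."""
    physical_cost = volume_mmbtu * spot_price
    swap_payoff = hedge_ratio * volume_mmbtu * (spot_price - swap_fixed_price)
    return physical_cost - swap_payoff

# Unhedged vs. fully hedged during a $175 spike, with a $6 swap in place
print(hedged_cost(10_000, 175.0, 6.0, hedge_ratio=0.0))  # → 1750000.0
print(hedged_cost(10_000, 175.0, 6.0, hedge_ratio=1.0))  # → 60000.0
```

The fully hedged buyer pays the locked-in $6 rate no matter where spot trades; the unhedged buyer absorbs the entire spike. Choosing the hedge ratio is where the portfolio-specific exposure analysis comes in.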

At, we’ve built an energy analytics platform with easy-to-use models that can help companies protect themselves from adverse market events. Our web-based interface is always available and provides access to powerful analytic models that let you understand your portfolio’s market exposure and take steps to mitigate your risk. With both a “polar vortex” and a “bomb cyclone” in just three years, don’t let the next buzzword-worthy weather event destroy your company’s future. Contact us today to learn more, or visit us on the web at

