Nine Questions with Ivo Steijn, Head of Model Risk Management at Silicon Valley Bank

March 6, 2018

This past week, I had the opportunity to sit down with Dr. Ivo Steijn. I have known Dr. Steijn for several years and always found his perspectives to be invaluable. In the following interview, he provides sage advice on energy analytics, model validation and how to see major market changes coming.

Dr. Steijn, you have had an impressive career in quantitative analysis & risk management for several firms in both Energy and Finance.  You have worked in senior roles at TXU, Southern California Edison, State Street and now most recently at Silicon Valley Bank (SVB).  Can you tell us what you are doing at SVB?

Sure, I head the Model Risk Management department.  We are responsible for the validation of all quantitative models in the company, together with all of the administrative superstructure that goes with that: our model inventory, change logs, etc.

What is your view of the relative sophistication in portfolio analytics and risk management between the energy & finance sectors?

I think the Finance world is a little ahead.  They had the whole portfolio approach to risk management served up to them by Markowitz in the 1950s, and it has been a standard paradigm there ever since.

For the Energy industry it's a lot newer.  I also think the financial world throws more money at the problem.  They develop more systems; they hire more developers.  In the energy world it's still fairly new.  There is not a lot of really good software, although that has been changing over the last couple of years.  So, energy is a little behind finance.

Do you think there are untapped competitive advantages in the energy sector for companies that want to do analytics better?

Oh absolutely!  In the first place, there are not a lot of good choices in the energy world.  That is strike number one.  In the second place, the choices that are available are fairly inflexible.  You buy a giant software package, and then you spend weeks installing it, more weeks to customize it and then it might not give you what you want.  There is certainly a need for a solution that is more tailored to a wide variety of customers, that people can use in different ways, that provides people with more options. 

What is a common oversight that you encounter in businesses which use quantitative models? 

The oversight that I most commonly encounter is thinking that once your model is okay, there is no model risk.  A lot of our work consists of hunting down bad models and fixing flaws in them.  But a major oversight is that even after the flaws are fixed, the model still has significant uncertainty around it.  It may contain estimated parameters, each of which is a best guess at an uncertain number.

Your model is not something that is handed down to you on stone tablets on a mountain top.  It is a result of a messy statistical process which gives a fuzzy result at the end of it.  This has consequences for how we should be looking at models and using the output.      

Can you give us an example of model uncertainty?

My favorite example.  Let's say you have a standard two-commodity portfolio of power and gas.  Your Monte Carlo price simulation process uses a correlation structure between power prices and gas prices.  The correlation structure is calibrated from historical data or from options.  That correlation parameter, used by the model, is often viewed from that point forward as a constant of nature.  It is not!  There is a lot of uncertainty around it.  Once you start taking the uncertainty of critical model parameters into account, the results of your models can change.
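To make that concrete, here is a minimal sketch (all data hypothetical, using NumPy) that bootstraps the power–gas correlation from a sample of daily returns. The width of the resulting interval is the uncertainty that gets lost when the point estimate is treated as a constant of nature:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily returns for power and gas, generated with a
# "true" correlation of 0.6 -- in practice these come from market data.
n_days = 250
true_corr = 0.6
cov = np.array([[1.0, true_corr], [true_corr, 1.0]])
returns = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_days)

# Point estimate of the power-gas correlation.
corr_hat = np.corrcoef(returns[:, 0], returns[:, 1])[0, 1]

# Bootstrap: resample days with replacement and re-estimate the correlation.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n_days, size=n_days)
    sample = returns[idx]
    boot.append(np.corrcoef(sample[:, 0], sample[:, 1])[0, 1])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimated correlation: {corr_hat:.2f}")
print(f"95% bootstrap interval: [{lo:.2f}, {hi:.2f}]")
```

Even with a full year of daily data, the interval is noticeably wide, which is exactly the point: the calibrated number is a fuzzy estimate, not a constant.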

How should analysts address model uncertainty? 

The easiest and fastest way is to play with sensitivity analysis by wiggling the parameters around to see if the model results change significantly.  This is easy to do, and we think that sensitivity analysis is something that everyone should do for every model.   The gold star solution is to start your Monte Carlo approach by doing Monte Carlo analysis on the model parameters themselves.  Generate a Monte Carlo parameter set and do your price simulations based on those simulated parameters.  Repeat this for each simulation. This is a much more comprehensive approach.
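The two-level approach described above can be sketched as follows. Everything here is hypothetical and simplified (the correlation's point estimate and standard error, the lognormal price shocks, and the toy portfolio are all stand-ins for a real calibration): the outer Monte Carlo loop draws the uncertain correlation parameter, and the inner loop simulates prices under each draw, so parameter uncertainty flows through to the distribution of portfolio results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration output: point estimate and standard error
# of the power-gas correlation (in practice, from historical data).
corr_hat, corr_se = 0.6, 0.08
n_param_draws = 500   # outer Monte Carlo: parameter uncertainty
n_price_sims = 200    # inner Monte Carlo: price paths per parameter draw

portfolio_values = []
for _ in range(n_param_draws):
    # Outer loop: draw an uncertain correlation, clipped to a valid range.
    rho = np.clip(rng.normal(corr_hat, corr_se), -0.99, 0.99)
    cov = np.array([[1.0, rho], [rho, 1.0]])

    # Inner loop: correlated power/gas price shocks under this draw of rho.
    shocks = rng.multivariate_normal([0.0, 0.0], cov, size=n_price_sims)
    power = 40.0 * np.exp(0.2 * shocks[:, 0])   # toy power price, $/MWh
    gas = 3.0 * np.exp(0.3 * shocks[:, 1])      # toy gas price, $/MMBtu

    # Toy portfolio: long 1 MWh of power, short 10 MMBtu of gas.
    portfolio_values.append(np.mean(power - 10.0 * gas))

values = np.array(portfolio_values)
print(f"mean portfolio value: {values.mean():.2f}")
print(f"spread across parameter draws: {values.std():.2f}")
```

The spread across parameter draws is the piece a single-parameter simulation never shows; with a fixed correlation, that outer distribution collapses to a point.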

Should companies be worried about structural changes in energy markets? 

If they are not worried about it, they are not reading the newspapers.  Anyone working in energy, particularly in CA where I worked for a long time, knows that gas fired power plants used to be the marginal asset.  You worried about gas and power prices, and that was it.  These days, for many hours each day, renewables are the marginal asset.  That completely changes your price process.

This structural change snuck into our portfolios gradually.  Other structural changes can come up faster, but this one was more gradual and it is spreading to the rest of the country.  This is a major regime change that has consequences. You basically have to redo your whole portfolio analysis.   

How can we see these regime changes coming?

The standard truth about regime changes is that nobody sees them coming.  I think that, for some regime changes, you actually can see them coming; you just don't know when they will arrive.  My advice is to do your modeling work in advance.  Model the new world where renewable power is the marginal asset, and then play with different scenarios around when the regime change occurs.  That will give you a more realistic view of what the long-term outlook for your portfolio may be.  You don't know if it's coming this year or the next, but you know it's going to happen.  Well, start running your scenario analysis now.

Finally, what is your view of how technology changes such as cloud computing are affecting the analytics landscape?

The old paradigm of a giant piece of software sitting somewhere in your building is changing.  Increasingly, we are working with very thin client solutions and accessing all of our computational machinery via the cloud.  Why would you have all that computational machinery at your company?  It's just not efficient.  I don't see a lot of this kind of development happening yet, but it's happening more and more.  The old paradigm of locally installed software, I think that is going away.

Ivo Steijn is Senior Director, Model Risk Management for Silicon Valley Bank, where he is responsible for all model validations and chairs the Model Risk Management Committee. Prior to joining Silicon Valley Bank he was a VP in Model Risk Management at State Street in Boston, and he headed the Model Validation department at Southern California Edison for 12 years. He holds an MA and a PhD in Econometrics from the Free University in Amsterdam, the Netherlands.

This interview was conducted by David Leevan. David is the Managing Director of, a SaaS platform for energy analytics.