About:
Computer modelling is widely used for decision support in energy policy and in capital planning. A key challenge in such modelling is managing uncertainty in the planning background - for instance in demand growth, in where generation will locate, and in how market players will respond to different policies.
The Maxwell Institute for Mathematical Sciences, UKERC, the Centre for Energy Systems Integration, Supergen Hubnet, Durham Energy Institute and the International Centre for Mathematical Sciences organised a tutorial day on this subject, to broaden knowledge in the energy systems community of the range of technical methods available.
The day divided into three sections, the first two covering different aspects of large-scale computer modelling. Michael Goldstein (Durham) and Amy Wilson (Edinburgh) spoke on statistical methods for the quantification of uncertainty. The key methodological point is that managing uncertainty generally involves performing multiple model runs for different scenarios. However, it is typically not possible in finite time to perform every run one might wish to, and thus to cover the range of possible scenarios densely. A statistical emulator is therefore introduced, which quantifies uncertainty in the behaviour of the model at scenarios for which it has not been run (more generally, the emulator may also quantify uncertainty in the relationship between model outputs and the real world). Optimization or planning can then be performed with respect to the fast-to-evaluate emulator, taking into account the consequences of the model being an imperfect representation of the real world.
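To make the emulation idea concrete, the following is a minimal sketch (not from the talks) of a Gaussian-process emulator: an "expensive" simulator is run at only a few design points, and the emulator then predicts its output elsewhere, together with a variance expressing how uncertain that prediction is. The simulator, kernel settings, and design points are all illustrative.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.5, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def expensive_model(x):
    """Stand-in for a slow simulator (a hypothetical response curve)."""
    return np.sin(3 * x) + 0.5 * x

# Run the "simulator" at a handful of design points only.
X_train = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_train = expensive_model(X_train)

def emulate(X_new, noise=1e-8):
    """Gaussian-process posterior mean and variance at unseen scenarios."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_new, X_train)
    K_ss = rbf_kernel(X_new, X_new)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.diag(cov)

# Near the design points the emulator is confident; far away it is not.
mean, var = emulate(np.array([0.3, 2.0]))
```

The variance output is what allows a subsequent planning step to account for the emulator's own ignorance about unexplored scenarios, rather than treating its predictions as exact.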
The second section presented efficient solution techniques for computational optimization approaches to modelling and planning. In such an approach a single optimization model is specified, incorporating (for instance) both the planning problem and the operational subproblem. Where planning is performed against multiple scenarios, it is desirable to exploit this problem structure to solve the optimization problem more efficiently. Andreas Grothey and Ken McKinnon (Edinburgh) presented a range of methods for decomposing such problems by scenario, while still providing a solution to the original overall planning problem.
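The structure being exploited can be illustrated with a toy two-stage problem (the numbers and cost model are invented, and the coarse search stands in for a proper master problem): a capacity decision is made first, and then each demand scenario defines an operational subproblem that can be evaluated independently of the others.

```python
# Two-stage planning: choose capacity x now; each demand scenario then
# has an independent operational subproblem (here solved in closed form).
# All names and numbers are illustrative, not from the talks.

SCENARIOS = [  # (probability, demand)
    (0.5, 80.0),
    (0.3, 120.0),
    (0.2, 150.0),
]
BUILD_COST = 10.0      # cost per unit of capacity built
SHORTFALL_COST = 25.0  # penalty per unit of unmet demand

def scenario_cost(x, demand):
    """Operational subproblem for one scenario: pay for any shortfall."""
    return SHORTFALL_COST * max(demand - x, 0.0)

def expected_total_cost(x):
    """First-stage cost plus probability-weighted subproblem costs.
    Each scenario term is independent of the others, which is the
    structure a decomposition method solves in parallel."""
    return BUILD_COST * x + sum(p * scenario_cost(x, d) for p, d in SCENARIOS)

# Coarse grid search over first-stage decisions, standing in for the
# coordination (master) step of a real decomposition scheme.
best_x = min((x / 2 for x in range(0, 401)), key=expected_total_cost)
```

In a genuine decomposition method the subproblems would be full optimization models solved separately per scenario, with a coordination mechanism replacing the grid search; the point here is only that the objective separates across scenarios once the first-stage decision is fixed.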
The third part of the day explored broader issues of decision analysis. Stan Zachary (Heriot-Watt/Edinburgh) discussed the choice of decision criteria, and in particular how one should take decisions under uncertainty when the relative likelihoods of different scenarios for the planning background are hard to quantify precisely. Should one then seek decision criteria which do not involve explicit assignment of probabilities to scenarios, or should one take the alternative view that even if assigning probabilities is challenging and explicitly subjective, doing so is necessary in order to reflect the nature of the decision problem?
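The two positions can be contrasted on a toy cost table (the plans, scenarios, and costs below are made up): minimax regret needs no scenario probabilities, while the expected-cost criterion requires an explicit, possibly subjective, probability assignment - and the two can recommend different plans.

```python
import numpy as np

# Candidate plans (rows) against background scenarios (columns:
# low / central / high demand). Costs are purely illustrative.
plans = ["small", "medium", "large"]
cost = np.array([
    [10.0, 40.0, 90.0],
    [25.0, 30.0, 50.0],
    [45.0, 45.0, 45.0],
])

# Probability-free criterion: regret is the extra cost versus the best
# plan for each scenario; pick the plan whose worst regret is smallest.
regret = cost - cost.min(axis=0)
minimax_choice = plans[int(regret.max(axis=1).argmin())]

# Expected-cost criterion: requires explicit scenario probabilities
# (here a subjective assignment weighted towards low demand).
probs = np.array([0.7, 0.2, 0.1])
expected_choice = plans[int((cost @ probs).argmin())]
```

With this table, minimax regret selects the hedging "medium" plan, while the expected-cost criterion under these particular probabilities prefers "small" - illustrating why the choice of criterion, not just the model, shapes the decision.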
Finally, Kevin Wilson (Newcastle) presented methods for the systematic elicitation and quantification of expert knowledge, as inputs to such models. This covered both qualitative aspects (e.g. what matters, and what influences what) and quantitative aspects (how big these effects are). Further topics included realistic quantification of uncertainty in estimates (human nature being to be overconfident in the precision of one's estimates) and the combination of judgements from multiple experts.
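One standard way to combine multiple experts' quantitative judgements is a linear opinion pool: a weighted average of their probability assessments. The sketch below uses invented experts, weights, and probabilities; the talks may have covered other aggregation schemes as well.

```python
# Linear opinion pool: combine several experts' probability judgements
# for the same set of scenarios, with weights reflecting (say) each
# expert's track record. All numbers below are illustrative.

def pool(expert_probs, weights):
    """Weighted average of expert probability vectors."""
    total = sum(weights)
    n = len(expert_probs[0])
    return [
        sum(w * probs[i] for w, probs in zip(weights, expert_probs)) / total
        for i in range(n)
    ]

# Three experts assess P(low / central / high demand growth).
experts = [
    [0.2, 0.5, 0.3],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]
combined = pool(experts, weights=[1.0, 2.0, 1.0])
# The pooled vector is again a probability distribution, because each
# expert's assessment sums to one and the weights are normalised.
```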
Videos of the presentations: Kevin Wilson, Michael Goldstein and Ken McKinnon