The number we put on recoverable reserves is one of the critical factors taken into account when evaluating potential field developments, yet pinning down exactly what has gone into defining that figure can be more of a magical mystery tour.
So observed consultant Tony Corrigan when he addressed the SPE Aberdeen Section earlier this summer, although in the talk that followed he made some constructive suggestions about stripping away some of the magic and basing reserves estimates on a firmer foundation.
"The importance of a reserves number - always recoverable for the purposes of this presentation - cannot be overestimated," he said. "It provides one of the few ways that those of us working in the subsurface area get to communicate with senior management. In turn, they pass on the number to industry analysts who will use it to formulate a view on how well the company is doing. Even the briefest media report on a given development will include a reserves estimate."
How well have we done in the past as an industry at estimating reserves? Tony first suggested that our views have changed as to what we actually regard as reserves. Experience in the North Sea has shown the important impact of time on the volumes of hydrocarbon that we can produce, and the impact of our ability to produce unwanted fluids such as water and/or gas. The development of unconventional wells is now further extending the limits of what we regard as producible hydrocarbon.
Each company operating in the UK has to report year on year the current reserves estimates for its fields to the Department of Trade & Industry, and these numbers are collated for publication in the annual "Brown Book" - The Energy Report. Tony had undertaken an exercise to see how the reserves of each field had changed with time, although he stressed that he was more interested in noting the trends and big changes rather than the absolute values. "The value of collating these numbers is that they are provided by the operators themselves, presumably calculated more or less in the same fashion each year," he said.
Forties is a prime example of a field which has shown a continual increase in reserves over time. Magnus' reserves have also increased year on year, and the number is still rising. It is the same story with Maureen. Tony pointed out that one of the common factors for the three fields is that they are all submarine fan deposits with limited tectonic complexity - that is, with few faults. "Fields with little disruption of the sand bodies have tended to do rather better than originally expected," he said. "One reason for this is that the length scale of the flow units is much larger than anticipated, and the general absence of faults helps a great deal. A broad general rule is that these fields are high quality sands with quite high permeabilities - but, more importantly, without great permeability contrasts."
Tony went on to consider Piper and Tartan which lie within ten kilometres of each other in almost analogous positions on opposite sides of the Witch Ground Graben. He said that Piper in particular has done extremely well, with reserves tending to increase as time goes on. Although it is a shallow marine delta top sequence rather than a submarine fan field, it shares with the latter a lack of tectonic complexity and a tendency for very correlatable, continuous sand units.
The problems encountered in many of the early Brent sand fields are well known. Dunlin, for example, is more tectonically complex than realised at first and it also suffered from the "Etive/Rannoch problem" where zones with high levels of permeability contrast were not separated in the original simulation modelling. Without the presence of barriers between the rock units, it is often difficult to assess the potential for depletion of the low quality rock which often contains a significant volume of hydrocarbons even after the high permeability rock has been completely swept.
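The Etive/Rannoch problem described above comes down to flow capacity: where layers of contrasting permeability are in communication without barriers, injected water splits roughly in proportion to each layer's kh product, so the tight rock sees very little sweep. A minimal sketch, with purely illustrative numbers not drawn from any particular field:

```python
# Sketch: injected water splits between two communicating layers roughly
# in proportion to their flow capacity (k * h). Numbers are illustrative,
# not taken from any real field.
def flow_split(layers):
    """Return each layer's share of total flow, weighted by k*h."""
    total = sum(k * h for k, h in layers)
    return [k * h / total for k, h in layers]

# Hypothetical "Etive-like" high-permeability unit vs "Rannoch-like" tight unit
layers = [(2000.0, 50.0), (100.0, 100.0)]  # (permeability mD, thickness ft)
shares = flow_split(layers)
print(shares)  # the high-permeability layer takes roughly 91% of the flow
```

With no barrier between the units, the high-permeability layer waters out long before the low-quality rock - which may still hold significant hydrocarbon - has been drained.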
Tony explained that some of the downside can be clawed back if there is time and money to drill additional wells. Thistle is an example. Here the original reserves estimate was driven by simple simulation models and an expectation of piston-like displacement through the Etive and Rannoch horizons.
"This is by no means a criticism of the people who developed these fields the way they did, as nobody could have predicted the potential problems. Luckily, most of the fields had large numbers of well slots and big platforms with room for additional water handling facilities, so they could rectify some of the problems.
"Nowadays, we are moving into a situation where we are much 'cleverer' about designing platforms in terms of expected reserves and we have to accept that if we run out of slots it costs a lot more to rectify the situation," he said.
Moving on to identify some of the shortcomings of the process currently used for estimating reserves, Tony emphasised the requirement to identify complexity as early as possible. The capability to acquire and interpret 3D seismic and to run long-term tests has been a huge bonus. Both these steps are critical in getting the dynamic data and the necessary view of structural complexity. The key parameters controlling economic well rates and efficient matrix drainage must also be identified. "Unless this information can be incorporated into simulation models, there is no hope of getting sensible reserve estimations out of them," Tony remarked. "Simulation models must duplicate key parameters and retain critical complexity."
He warned about today's tendency to trust completely in simulation models to estimate reserves. "Full field models in particular are gross things, where we simplify data and then treat them statistically - hence the title of my talk, Lies, damned lies and reserves. A major problem arises when dealing with extreme values - extreme permeability contrasts, for example - or discontinuities like faults. It is difficult to get this information into simulation models, and becomes harder as the size of the model and the grid blocks increases."
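The difficulty with extreme values can be seen in the averaging step of upscaling itself. Whichever of the classical averages is used to assign one permeability to a coarse grid block, the extremes vanish. A small sketch with illustrative values:

```python
import math

# Sketch: upscaling fine-scale permeabilities into a single grid-block value.
# The arithmetic and harmonic means bound the effective permeability
# (layer-parallel vs layer-normal flow); the geometric mean is often used
# for disordered media. All input values here are illustrative only.
def averages(perms):
    n = len(perms)
    arith = sum(perms) / n                                # flow along layers
    harm = n / sum(1.0 / k for k in perms)                # flow across layers
    geo = math.exp(sum(math.log(k) for k in perms) / n)   # disordered media
    return arith, geo, harm

fine_scale = [2000.0, 1500.0, 5.0, 0.5, 800.0]  # mD, with extreme low values
a, g, h = averages(fine_scale)
print(f"arithmetic={a:.1f}  geometric={g:.1f}  harmonic={h:.2f} mD")
```

The three answers differ by orders of magnitude, yet each replaces a four-decade permeability contrast with a single smooth number - exactly the information loss Tony warns grows with grid-block size.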
Although Tony stressed that he is not "anti simulation", he aired a concern about "pseudos". "Pseudos are a necessary evil but can be very dangerous as the level of heterogeneity increases," he said and recommended that they should only be put into the simulator when an understanding is reached of how the critical controls - extreme values and discontinuities - affect fluid movement and connectivity. "It is not a good idea to jump from nothing to a full field model which pseudos everything and expect to have an effective method to estimate reserves," he added.
This becomes more of an issue as the field goes into decline, and there is more water spread around the system. Kx/Ky trends are a further important consideration but are also difficult to assess - particularly in submarine fan fields where the channelised systems will tend to give strongly directional flow paths and result in asymmetric drainage volumes. A proper understanding of the drainage geometry being tapped is key if horizontal wells are planned.
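The asymmetric drainage volumes mentioned above follow directly from a standard anisotropy transform: for a horizontal permeability ratio Kx/Ky, a nominally circular drainage area stretches into an ellipse whose axis ratio is sqrt(Kx/Ky) along the high-permeability (e.g. channel) direction. A minimal sketch, with an assumed ratio chosen only for illustration:

```python
import math

# Sketch: with horizontal anisotropy kx/ky, a nominally circular drainage
# area of fixed size becomes an ellipse stretched along the channel trend
# by sqrt(kx/ky). The area and ratio below are illustrative assumptions.
def drainage_axes(area, kx_over_ky):
    """Half-axes (a, b) of an elliptical drainage area of the given size."""
    ratio = math.sqrt(kx_over_ky)        # a/b = sqrt(kx/ky)
    b = math.sqrt(area / (math.pi * ratio))
    return ratio * b, b

a, b = drainage_axes(area=1.0e6, kx_over_ky=10.0)  # m^2; channelised trend
print(f"a = {a:.0f} m, b = {b:.0f} m, a/b = {a/b:.2f}")
```

For a 10:1 anisotropy the drainage ellipse is more than three times longer along the channel direction than across it - a geometry a horizontal well must be oriented against, not assumed away.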
Look too for tortuous linkages between channels. Do channels intersect and so form a continuous flow path, or do we need to link them with smart wells?
Tony believes that high-quality integrated teams provide the best means to manage reservoir and reserves uncertainty - where geologists and engineers understand fully how each other's data are going to be used. However, with the current emphasis on asset teams where there may only be one geologist and engineer per team, he said it is crucial to ensure these people have the right background and experience for the job.
Can advances in hardware and software help us? And does it make a difference that we can now build geostatistical models with two million grid blocks? To a degree, thinks Tony, although he pointed out that the industry has great difficulty handling simulation models with even 80,000 grid blocks. Again, access to very sophisticated geostatistical software can only be of limited value unless the quality of the data is assured. Tony suggested these packages are wonderful for visualisation but that there is still a long way to go in simulating the movement of fluids.
Recent advances made in drilling have, ironically, made life harder in some ways for those dealing with the subsurface. Now that wells can access places previously undreamed of, the result is fewer wells and thus fewer data points. Smart wells might depart up to four kilometres from the nearest data point, which makes it a real challenge to understand the reservoir environment the wells are going through. "This limits our ability to control the fluids we want to produce as opposed to those we don't," Tony remarked.
There will be an increasing value in understanding smart well performance and high-quality integrated teams will be the answer to unlocking this potential. Tony's parting advice to an audience largely made up of engineers was quite clear: "If you don't have geoscientists working with you on the interpretation of well tests and production performance in a horizontal well, you're in trouble."