  1. Geomodelers and Geomodeling

Transcript

- [Voiceover] Ladies and gentlemen, I'm happy to have you all here today for this presentation, and I'm also happy that AFPG has put this all together. We're going to have an exciting talk on this topic, geomodeling. There are a couple of things on the agenda today. First of all we're going to talk about geomodelers and geomodeling, and that's meant as an introduction for people who don't have a lot of experience building models or have not worked in geomodeling for very long. For those who have a lot of experience and are already familiar with the topic, the good stuff comes when we begin to talk about how to make your model fit for purpose. After that we're going to look at the different kinds of input data, how you condition that information, and how you put it into your model so it delivers the most value. There's always the issue of scale. We have information coming from so many different sources: information from core, from core plugs, from SEM analysis, and information coming from seismic. Those are many scales apart, so how do you work all of those together to build that integrated tool, which is a model you can use to understand and develop your reservoir? A big topic is how to QC the model you've built. It's always important, because with a model that hasn't been QC'd you don't know whether it's fit for purpose or sufficiently reliable to answer the questions you want to put to it. There are a couple of case studies as well; all through this talk I'll be mentioning examples, which are like mini case studies, and there are further case studies in the next presentation.

So what is a geomodel? To a lot of people, a geomodel just looks like a 3D image, and they probably don't appreciate that it is the solution that allows a geologist to translate geological concepts and to represent rock properties in a numerical manner, so that they can be transferred across disciplines, used for simulation, and used for making decisions with people who are not geologists. The geomodeler, in many cases, is also a geologist, and this is the domain where he sits. He does a lot of the data crunching and translates all of that data into parameters in the model. He also translates the sequence stratigraphy and all the concepts that are very geological into hard facts that go into building the model, and when the results are ready, he is able to present them to people outside the discipline in a manner that lets them appreciate how geology influences the wells.

This slide just shows you how useful geomodels are. Some people are sometimes not so sure how wide the application is, but it really is very wide. From the most fundamental use, which is allowing people to robustly estimate volumes, to the very specific, when you're doing production surveillance or managing a flood in a reservoir, a geomodel is a very useful tool. Every time a well is going to be planned, there is no better platform for integration and for collaboration than a geomodel. The big picture here is that it will save you cost all through the life of the field, and it can help you know when to exit pretty quickly, which also saves you cost.
So let's talk about the conventional workflow. I've broken it into five stages here. Stage one is where you frame the model. You have to look at the data to build your concept of the area, the geological concept, and when you have that done it allows you, through a checklist of your uncertainties, to frame the model properly. Once you've got that nailed down, you can go right ahead and build the structure, populate that structure with a 3D facies model, and move on to stage four, where you do the petrophysical modeling. That's a very interesting stage because it is done differently for each of the different environments we get to build models for. Stage five is the collaboration stage, and it is important because that's where a lot of people who are not geologists first get to see the model. They have to see the best of the model, and they have to see that the model adds value to what they're doing.

This slide shows how the conventional workflow is implemented in typical industry software. The screen grab is from JewelSuite, but it looks almost like what you would get in Schlumberger's Petrel, Paradigm's tools, Landmark's, or even Roxar's RMS. Almost all of them have these stages; the difference lies in how the algorithms are implemented in each of these packages. The industry does seem, to be fair, to have come to some agreement on the key stages in this process.

So let me summarize the conventional modeling workflow. The first thing you do is look at your data, and when you look at your data you want to review all the information you know about the field. You look at the logs; in very mature fields you may have logs with vintages spread across 10 or 15 years, so you have to make sure a petrophysicist works with you to provide normalized logs. Your seismic interpretation is going to produce certain results, and those results include horizons and faults; it can also provide you with stratigraphic analysis that you use in your conceptual model. We'll see how to apply all of these in a bit more detail as we move on. After you've done your well correlation, you also get to interpret facies on the wells, that is, if you have a lot of wells. If you have very few wells, you use seismic, and we're going to see how that's done as well; I've got a few slides that cover it further on. Then you build your structure, and once you've built your structure, you fill it in with facies and you describe a few other petrophysical properties. The darkest green box you see down here is what allows you to go right ahead and produce those outcomes. If you are really new to modeling, this one slide captures everything for you in a simple, straightforward way. That's what you pass on to the next person, the reservoir engineer who takes it on, and it's also the stage where you get to do 3D volumetrics.

I've mentioned petrophysical modeling and said it was really interesting because you can do it in so many different ways, depending on how much information you have. If you have very little information and your purpose is something like reserves, then you go deterministic.
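To make that deterministic, volumetric end of the workflow concrete, here is a minimal sketch of a cell-by-cell in-place volume (STOIIP) calculation on a 3D property grid. The array shapes, property values and the formation volume factor are made-up assumptions for illustration, not numbers from the talk.

```python
# Minimal sketch of a deterministic 3D volumetric (STOIIP) estimate,
# reduced to per-cell numpy arrays. The property values and the Bo figure
# are illustrative assumptions.
import numpy as np

def stoiip(bulk_volume, ntg, porosity, sw, bo=1.2):
    """Stock-tank oil initially in place, summed over all cells (same volume units as input)."""
    pore_volume = bulk_volume * ntg * porosity   # net pore volume per cell
    hc_pore_volume = pore_volume * (1.0 - sw)    # hydrocarbon-filled portion
    return hc_pore_volume.sum() / bo             # shrink reservoir volume to stock-tank conditions

# Toy property model: 20 x 20 x 10 cells of 100 m x 100 m x 2 m
rng = np.random.default_rng(42)
shape = (20, 20, 10)
bulk = np.full(shape, 100.0 * 100.0 * 2.0)       # bulk rock volume per cell, m^3
ntg = rng.uniform(0.5, 0.9, shape)               # net-to-gross
phi = rng.uniform(0.12, 0.25, shape)             # porosity
sw = rng.uniform(0.2, 0.5, shape)                # water saturation

print(f"STOIIP ~ {stoiip(bulk, ntg, phi, sw):,.0f} m^3 at stock-tank conditions")
```

The arithmetic is the same as the classic map-based calculation; the value of doing it on the 3D grid is that the volume honours the structure and the facies and property distributions cell by cell.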
And if you have a lot of information, or different kinds of information, and you're doing field development planning, you can go stochastic. Or you can use trend functions to model properties such as water saturation. That's how you get to do petrophysical modeling.

Now on to a phase that is interesting and very unique: upscaling. When you have a fine-scale model and you want to send that model out as input for simulation, sometimes there may be a need to reduce the size of the model. It may be because of computing limitations. It could also be because the reservoir engineer is not going to do a lot of runs and wants a simpler model, so that he can manage his runs better. And so what you tend to do is upscale. With upscaling, what we do in this simple example is just look at the permeability. We calculate or estimate a flux across a unit area here in the fine-scale model; it's just an example. So what you have is this unit area. You've got some cells in the z-dimension and you also have some cells in the x-y dimension. All of these cells together transmit some flux, driven by the pressure drop across this area, and you use that to estimate an average permeability. That average is what you translate into the upscaled model. There are several other upscaling methods, but the one method that has been found to give very robust, very consistent results is the flow-based method. It is implemented a little differently in Petrel, in Roxar's RMS, and in GOCAD, but flow-based upscaling follows the same principle we just talked about: you estimate an average over a specific shape by looking at the flux and pressure drop across that shape, and that average is represented in the coarse-scale model.

For those with little to intermediate experience, you might need to brush up on geostatistics, because a lot of the concepts that are common in 3D modeling, or in geomodeling, are strongly tied to geostatistics. These are the five main geostatistical processes that your data passes through, so I'm going to talk about them. Getting to know these processes now, and building some familiarity with them, provides the basis for building the model. The first stage, as always, is deciding what to do with the information you have at the beginning of the model-building process. You need to be familiar with that information. You want to check that it doesn't have obvious errors, and look out for behaviours or patterns in it; whatever pattern you see, you want to replicate that pattern throughout your 3D model. A few things you can do: you can make a depth-related plot of a property at the wells; you can make cross-plots of two different properties, which is multivariate analysis; and you have to block your wells, which is discretization, averaging your properties from a very fine log scale up to a coarser cell scale. Then there's data transformation. Most of the algorithms implemented in geomodeling software assume that your data has a normal distribution, so you have to transform your data from whatever distribution it has into a normal distribution. The algorithms have the back-transformation built in at the end of the process, so they always back-transform your data.
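Here is a minimal sketch of what single-phase, flow-based permeability upscaling does under the hood, on a small 2D block of fine cells: fix the pressure on two opposite faces, seal the others, solve for the pressure field, and back-calculate an effective permeability from the total flux. The unit cell size, unit viscosity, function name and layered test case are illustrative assumptions; the commercial implementations in Petrel, RMS or GOCAD are far more general.

```python
# Toy single-phase, flow-based upscaling of x-direction permeability on a 2D
# block of fine cells (unit cell size, unit viscosity). Pressure is fixed on
# the left and right faces, the top and bottom are sealed, and the effective
# permeability is back-calculated from the total flux, Darcy-style.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve

def upscale_kx(k, p_in=1.0, p_out=0.0):
    """Effective x-direction permeability of a 2D array of fine-cell perms."""
    ny, nx = k.shape
    n = nx * ny
    idx = lambda i, j: j * nx + i                 # cell (i = x, j = y) -> unknown number
    A = lil_matrix((n, n))
    b = np.zeros(n)

    def t_face(ka, kb):                           # interface transmissibility (harmonic)
        return 2.0 * ka * kb / (ka + kb)

    for j in range(ny):
        for i in range(nx):
            r = idx(i, j)
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # four neighbours
                ii, jj = i + di, j + dj
                if 0 <= ii < nx and 0 <= jj < ny:
                    t = t_face(k[j, i], k[jj, ii])
                    A[r, r] += t
                    A[r, idx(ii, jj)] -= t
            # fixed-pressure faces: half-cell transmissibility = 2 * k
            if i == 0:
                A[r, r] += 2.0 * k[j, i]
                b[r] += 2.0 * k[j, i] * p_in
            if i == nx - 1:
                A[r, r] += 2.0 * k[j, i]
                b[r] += 2.0 * k[j, i] * p_out

    p = spsolve(A.tocsr(), b).reshape(ny, nx)
    q = np.sum(2.0 * k[:, 0] * (p_in - p[:, 0]))  # total flux through the inlet face
    return q * nx / (ny * (p_in - p_out))         # k_eff = q * L / (A * dP)

# Layered toy case, flow along the layers: expect the arithmetic average, 252.5
k_fine = np.vstack([np.full((2, 10), 500.0), np.full((2, 10), 5.0)])
print(upscale_kx(k_fine))
```

For flow along uniform layers, as in the toy case above, the flow-based answer sits at the arithmetic average of the layer permeabilities; for flow across the layers it falls to the harmonic average, which is why the flow-based estimate is regarded as the consistent choice regardless of flow direction.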
Because you're dealing with information that varies with depth and with distance as you move in any direction, there needs to be a model that relates the different samples to one another, and that model is called a variogram. That's what you have there. Then kriging. Kriging allows you to interpolate based on a certain set of rules, guided by a variogram. The variogram holds the information about spatial correlation and provides that information to the kriging algorithm, and the kriging algorithm interpolates while factoring in the distance between sample points; it interpolates the scattered samples here onto a very fine grid. This is an important step, because the tool can make this grid fine or it can make it coarse.

So that is the 2D grid. Now, to go ahead and build a 3D grid, there are different types of 3D grid: structured and unstructured. Structured grids, as you can see over here, are very regular, with almost square or rectangular cells. Unstructured grids are those whose cells have varied, irregular shapes, and if you pay close attention to them you'll notice that you can make them come in different sizes, or sometimes they're homogeneous as you see here. If you look at the last box down at the bottom, the dark spots, the places that get really dark, are places where you have more cells, wrapped right around the wells. That allows you to focus resolution on the near-wellbore area.

Then conditional simulation. Now we have a reservoir full of cells that could be sand or shale, we want to populate it, and we only have a few wells. Sequential indicator simulation allows you to sequentially estimate the probability of a cell being sand or shale, based on its distance to the nearby wells that have sand or shale in them. That's what it does for you. There are variants of that algorithm, but it's based on pixels, so it's a pixel-based algorithm; several of those pixel-based algorithms are listed in the box below. Then there are the object-based algorithms. What they do is allow you to build objects and associate each cell with an object, so you can go back and do more work on that. After that, you estimate your volumes, and once you get that result, you move on.
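Since the variogram and kriging carry most of the weight in those steps, here is a minimal sketch of ordinary kriging of a handful of scattered "well" porosity values onto a fine 2D grid, guided by a spherical variogram model. The coordinates, porosity values, variogram parameters and function names are made up for illustration; a real study would fit the variogram to the data and typically use an established geostatistics library rather than this toy solver.

```python
# Toy ordinary kriging of a few scattered well values onto a fine 2D grid,
# guided by a spherical variogram model. All numbers are illustrative.
import numpy as np

def spherical_vgm(h, sill=1.0, rng=500.0, nugget=0.0):
    """Spherical variogram: climbs from the nugget to the sill at the range."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, sill, g)

def ordinary_kriging(xy_data, z_data, xy_grid, vgm=spherical_vgm):
    """Ordinary kriging: weights solved from the variogram, constrained to sum to one."""
    n = len(z_data)
    d = np.linalg.norm(xy_data[:, None, :] - xy_data[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))                  # bordered system for the Lagrange multiplier
    A[:n, :n] = vgm(d)
    A[n, n] = 0.0
    est = np.empty(len(xy_grid))
    for m, p in enumerate(xy_grid):
        b = np.ones(n + 1)
        b[:n] = vgm(np.linalg.norm(xy_data - p, axis=1))
        w = np.linalg.solve(A, b)[:n]            # kriging weights for this grid node
        est[m] = w @ z_data
    return est

# Five "wells" with porosity values, estimated onto a 50 x 50 node grid
wells = np.array([[100.0, 120.0], [400.0, 380.0], [250.0, 260.0], [80.0, 420.0], [430.0, 90.0]])
phi = np.array([0.18, 0.24, 0.21, 0.15, 0.27])
gx, gy = np.meshgrid(np.linspace(0.0, 500.0, 50), np.linspace(0.0, 500.0, 50))
phi_map = ordinary_kriging(wells, phi, np.column_stack([gx.ravel(), gy.ravel()])).reshape(50, 50)
print(phi_map.min(), phi_map.max())
```

Sequential indicator simulation builds on the same machinery: instead of kriging the property itself, it kriges indicator probabilities for each facies and draws a facies value cell by cell along a random path, conditioned to the wells.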