Kansas Geological Survey, Subsurface Geology 12, p. 5-6
W. C. Ross1, J. A. May1, J. A. French2, and D. E. Waus1
1Marathon Oil Company
2Kansas Geological Survey
Quantitative stratigraphic modeling is emerging as a new tool in stratigraphic analysis. By systematically varying the major controls on depositional-sequence development (e.g., sea level, subsidence, and sediment supply), quantitative stratigraphic modeling provides new insights into the relative contribution of each variable. When tied into a basinal data base, this technology offers a means for predicting lithologic associations away from existing control. Geologic-data sets play a critical role in constraining both the model design and the range of possible solutions in a modeling exercise.
Stratigraphic-modeling packages consist of an assemblage of mathematical algorithms that are simplified representations of complex geologic processes. The choices of which processes to include in the model and descriptions of their interdependence constitute hypotheses for how natural systems operate. These hypotheses are evaluated by running the model repeatedly, with different input variables, and comparing the results with a chosen geologic-data set. Alterations to the algorithms or input parameters are made until a satisfactory match is achieved. No illusion is held that the model is the unique "correct" solution for reproducing a geologic section. Rather, a "successful" model solution is defined as one whose output adequately matches a geologic-data set. The more geologic data the model can reproduce, the more rigorous or useful the model, and the greater the likelihood that the model will produce reliable predictions in areas of no control.
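The run-compare-adjust loop described above can be sketched, purely illustratively, as a parameter search. The function names, the toy response, the observed value, and the parameter ranges below are all invented for illustration; they are not the authors' software or data.

```python
# Hypothetical sketch of the iterative calibration loop: run the model
# over combinations of input parameters and keep the run whose output
# best matches the geologic-data set.

def run_model(sea_level_rate, subsidence_rate, sediment_supply):
    # Stand-in for a stratigraphic forward model: returns a synthetic
    # "section" summarized as a predicted sand fraction (toy response).
    return sediment_supply / (sediment_supply + subsidence_rate + sea_level_rate)

def misfit(predicted, observed):
    # Simplest possible model-to-data comparison.
    return abs(predicted - observed)

observed_sand_fraction = 0.42   # e.g., measured from a real section

best = None
for supply in (50.0, 100.0, 150.0):          # illustrative units
    for subsidence in (20.0, 60.0, 100.0):   # m/Myr
        for sl_rate in (0.0, 10.0, 20.0):    # m/Myr
            pred = run_model(sl_rate, subsidence, supply)
            err = misfit(pred, observed_sand_fraction)
            if best is None or err < best[0]:
                best = (err, supply, subsidence, sl_rate)

print("best misfit %.3f with supply=%s subsidence=%s sea-level rate=%s" % best)
```

In practice the search is guided by geologic judgment rather than an exhaustive grid, and "misfit" is evaluated against the full correlated section rather than a single number.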
Therefore, for quantitative models to gain our confidence as tools for stratigraphic analysis and prediction, they must be tested, i.e. compared with and calibrated against large, high-quality geologic-data sets. Unfortunately, the level of stratigraphic detail available in most geologic-data sets is relatively poor, and this limits the level of detail at which model-to-data comparisons can be made. As a result, model output that is more detailed than the data base available for comparison cannot be verified. Along with the geologic-data-base limitations, there is a desire to keep the number of model-input parameters to a minimum in order to limit the number of possible model solutions. Consequently, a model-design strategy has been formulated that matches the complexity of the model output to the level of data quality available in the chosen data sets. To test the model, we devised strategies for 1) describing the geologic record for the purpose of model-data comparisons, 2) statistically analyzing the data to determine the sources of variance, and 3) assigning a measure of uncertainty to model predictions based on repeated model runs (using random sampling of probabilistic variable distributions).
We have utilized two data sets thus far to constrain both model design and model solutions: a series of measured sections along a dip-oriented outcrop belt of the Book Cliffs, eastern Utah, and a subsurface-data set from the Washakie/Red Desert basin, south-central Wyoming. The relative strengths and weaknesses of the outcrop and subsurface data sets provide "ground truth" for different portions of a clastic depositional system. The closely spaced (<2 miles [3.2 km]) sections of the Book Cliffs provide detailed correlation and facies information in the transition between coastal plain and shallow-marine shoreface environments. The description of and relationships between the shoreline sub-environments (foreshore, upper and lower shoreface, and transition zone) also provide constraints on wave-process and storm-current modeling. The subsurface-data set provides a basin-scale view of depositional systems where chronostratigraphic units can be correlated from topset to bottomset environments and where regional relationships between turbidite and shoreline sand systems can be defined. Although lacking in detailed facies, the basinal correlation framework provides information on three-dimensional sequence geometries, gross lithologies, and total sediment supply through time.
Strategies for describing the data sets involve the correlation and digitization of time lines, facies boundaries, and individual lithologies from each section. For the purpose of model-to-data comparison, the stratigraphic data are decompacted to reconstruct depositional geometries. Decompacted, restored sections prove invaluable in resolving correlation problems, such as those resulting from differential compaction between coaly coastal-plain environments and sandy shoreface systems.
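A common way to carry out such a decompaction, shown here only as an illustrative sketch, is to conserve solid (grain) thickness under an exponential porosity-depth law. The porosity constants below are generic shale-type values, not parameters from this study, and the single-layer restoration omits the layer-by-layer backstripping a real section requires.

```python
import math

def decompact(z_top, z_base, phi0=0.63, c=1960.0, n_iter=50):
    """Restore a layer now buried between z_top and z_base (m) to the surface.

    Assumes an Athy-type porosity-depth law phi(z) = phi0 * exp(-z / c)
    and conservation of solid grain thickness. phi0 and c are assumed,
    lithology-dependent constants.
    """
    # Present-day pore-space thickness within the layer (integral of phi).
    pore = phi0 * c * (math.exp(-z_top / c) - math.exp(-z_base / c))
    solid = (z_base - z_top) - pore
    # Fixed-point iteration for the decompacted thickness T at the surface:
    # T = solid + phi0 * c * (1 - exp(-T / c))
    t = z_base - z_top  # starting guess: present thickness
    for _ in range(n_iter):
        t = solid + phi0 * c * (1.0 - math.exp(-t / c))
    return t

# A 100-m layer now buried at 2000 m roughly doubles when restored:
restored = decompact(2000.0, 2100.0)
print("restored thickness: %.1f m" % restored)
```

The same conservation-of-grain argument explains why coaly coastal-plain intervals, with their extreme compactibility, distort present-day correlations so strongly relative to sandy shoreface units.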
The process of comparing model results to the reconstructed data set involves numerous iterations of the modeling program utilizing different combinations of model-input parameters. To limit the number of variables included in the deterministic-model design, an analysis of variance is performed on the reconstructed data set(s). This procedure evaluates the number and relative importance of the variables (or their proxies) that control the distribution of lithologies. Armed with this information, the relative complexity of the deterministic-model design (i.e. the number of variables) can be adjusted.
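The variance analysis can be illustrated with a toy variance-decomposition (eta-squared) calculation. All data values and grouping variables below are invented; the groupings merely stand in for proxies such as position along depositional dip or subsidence rate.

```python
# Rank candidate controls by the share of variance in a lithologic
# measure (here, sand fraction) that each explains across sections.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def explained_fraction(values, groups):
    """Between-group variance / total variance (eta-squared)."""
    total = variance(values)
    by_group = {}
    for v, g in zip(values, groups):
        by_group.setdefault(g, []).append(v)
    grand_mean = sum(values) / len(values)
    between = sum(len(vs) * (sum(vs) / len(vs) - grand_mean) ** 2
                  for vs in by_group.values()) / len(values)
    return between / total

# Invented sand fractions from eight hypothetical sections:
sand_fraction = [0.70, 0.65, 0.40, 0.35, 0.60, 0.55, 0.30, 0.25]
dip_position  = ["proximal", "proximal", "distal", "distal",
                 "proximal", "proximal", "distal", "distal"]
subsidence    = ["low", "low", "low", "low", "high", "high", "high", "high"]

for name, groups in [("dip position", dip_position),
                     ("subsidence", subsidence)]:
    print(name, round(explained_fraction(sand_fraction, groups), 2))
```

A control that explains little of the observed variance is a candidate for removal from the deterministic-model design, keeping the number of variables matched to what the data can actually discriminate.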
Finally, we attempt to define the uncertainties in the deterministic-model predictions that are associated with the uncertainties contained in the model-input variables. To the extent that the basinal-data set constrains the ranges of these variables (e.g., base-level change = topset thicknesses), the level of uncertainty can be reduced. Ranges of variables that cannot be constrained by the data set (e.g., rates of sediment supply, rates of sea-level change) must be estimated or guessed based on information or experience from other basin studies. Given a frequency distribution of possible variable values, the stratigraphic model can be run repeatedly utilizing a random sampling scheme of the unconstrained variable ranges. This procedure produces a solution set that can be used to estimate the uncertainty in predicted lithologies at each location in a basin or along a cross section.
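The repeated-sampling procedure is, in modern terms, a Monte Carlo analysis. A minimal sketch follows; the forward model, the uniform sampling ranges, and the percentile summary are all assumptions for illustration, not the authors' actual code or distributions.

```python
import random

# Monte Carlo sketch: sample the unconstrained inputs from assumed
# ranges, run the forward model each time, and summarize the spread
# of predictions at one basin location.

random.seed(0)  # reproducible for this illustration

def forward_model(sediment_supply, sea_level_rate):
    # Toy proxy for predicted sand fraction at a single location.
    return sediment_supply / (sediment_supply + 100.0 + sea_level_rate)

predictions = []
for _ in range(1000):
    supply = random.uniform(50.0, 150.0)   # unconstrained: assumed range
    sl_rate = random.uniform(0.0, 20.0)    # unconstrained: assumed range
    predictions.append(forward_model(supply, sl_rate))

predictions.sort()
p10 = predictions[int(0.10 * len(predictions))]
p50 = predictions[int(0.50 * len(predictions))]
p90 = predictions[int(0.90 * len(predictions))]
print("predicted sand fraction: P10=%.2f P50=%.2f P90=%.2f" % (p10, p50, p90))
```

The resulting percentile envelope is one way to attach the "measure of uncertainty" described above to the predicted lithology at each location along a cross section.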
Thus, to best evaluate the results of a stratigraphic-modeling package, we must have detailed knowledge of the geology we are trying to match. In most cases the complexity of the model result exceeds our knowledge of the geologic system. Advances in stratigraphic-modeling technology must also be paralleled by the collection and description of high-quality data sets (i.e. with detailed correlation and facies interpretations) which can constrain both the model design and the range of possible model solutions.
Figure 1--Present-day and decompacted stratigraphic cross sections along the Book Cliffs illustrating the effect of differential compaction on marine to nonmarine correlations. The decompaction exercise demonstrates the time equivalence of vertically stacked shoreface sandstones, T9 through T13, to coaly coastal-plain facies.
Web version May 4, 2010. Original publication date 1989.