
Computational Framework for
Subsurface Energy and
Environmental Modeling and
Simulation
Mary Fanett Wheeler, Sunil Thomas
Center for Subsurface Modeling
Institute for Computational Engineering and Sciences
The University of Texas at Austin
Acknowledgments

Collaborators:
• Algorithms: UT-Austin (T. Arbogast, M. Balhoff, M. Delshad, E. Gildin, G. Pencheva, S. Thomas, T. Wildey); Pitt (I. Yotov); ConocoPhillips (H. Klie)
• Parallel Computation: IBM (K. Jordan, A. Zekulin, J. Sexton); Rutgers (M. Parashar)
• Closed Loop Optimization: NI (Igor Alvarado, Darren Schmidt)

Support of Projects:
NSF, DOE, and Industrial
Affiliates (Aramco, BP, Chevron, ConocoPhillips,
ExxonMobil, IBM, KAUST)
Outline
 Introduction
 General Parallel Framework for Modeling Flow, Chemistry, and Mechanics (IPARS)
• Solvers
• Discretizations
• Multiscale and Uncertainty Quantification
• Closed Loop Optimization
 Formulations (IPARS-CO2)
• Compositional and Thermal
 Computational Results
• Validation and Benchmark Problems
 Current and Future Work
Societal Needs in Relation to Geological Systems
Resources Recovery
• Petroleum and natural gas recovery from
conventional/unconventional reservoirs
• In situ mining
• Hot dry rock/enhanced geothermal systems
• Potable water supply
• Mining hydrology
Waste Containment/Disposal
• Deep waste injection
• Nuclear waste disposal
• CO2 sequestration
• Cryogenic storage of petroleum/gas
Underground Construction
• Civil infrastructure
• Underground space
• Secure structures
Site Restoration
• Aquifer remediation
• Acid-rock drainage
Highly Integrated: Multidisciplinary, Multiscale, Multiprocess
[Diagram: disciplines contributing to geosystems engineering and their roles. Geology, Geophysics, Physics, Mathematics, and Geological Engineering: exploration, characterization, diagnostics, earth stresses. Geomechanics: mechanical rock/soil behavior. Geohydrology: fluid flow, waste isolation. Petroleum Engineering: hydrocarbon recovery, simulation. Civil Engineering (soil mechanics, rock mechanics, structural analysis): construction. Geochemistry. Mining Engineering: mining design, stability, waste, land reclamation. Mechanical Engineering: drilling & excavation, support, instruments. Computer Sciences: code development, software engineering.]
Long Range Vision: Characterization and Management of Diverse Geosystems
[Diagram: a powerful problem solving environment for complex geosystem management, linking uncertainty assessment, 3D visualization & interpretation, data management, sensor data management, characterization & imaging, sensor placement, optimization and control, multiscale simulation, multiphysics simulation, and geophysical and petrophysical interpretation]
Framework Components
 High fidelity algorithms for treating the relevant physics:
• Locally conservative discretizations, e.g., mixed finite element and DG (a minimal sketch follows this slide)
• Multiscale (multiple spatial & temporal scales)
• Multiphysics (Darcy flow, biogeochemistry, geomechanics)
• Complex nonlinear systems (coupled nearly hyperbolic and parabolic/elliptic systems with possible discrete models)
• Robust, efficient, physics-based solvers (ESSENTIAL)
• A posteriori error estimators
 Closed loop optimization and parameter estimation
• Parameter estimation (history matching) and uncertainty quantification
 Computationally intense:
• Distributed computing
• Dynamic steering
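To make the "locally conservative" point concrete, here is a minimal stand-in sketch (not IPARS code): a 1D cell-centered finite-volume discretization of Darcy flow with two-point fluxes, which on rectangular grids coincides with the lowest-order mixed finite element method under a quadrature rule. The grid size, permeability field, and source term are assumptions; the check at the end verifies discrete mass conservation cell by cell.

```python
# Minimal 1D stand-in (not IPARS code): cell-centered finite volumes with
# two-point fluxes for -d/dx(k dp/dx) = q, illustrating local conservation.
# Grid size, permeability field, and source term below are made up.
import numpy as np

n, L = 20, 1.0                          # number of cells, domain length
dx = L / n
k = np.exp(np.random.randn(n))          # heterogeneous permeability, one value per cell
q = np.ones(n) * dx                     # source integrated over each cell

# Harmonic-average transmissibilities on the interior faces
T = 2.0 * k[:-1] * k[1:] / (k[:-1] + k[1:]) / dx

# Assemble A p = q with p = 0 Dirichlet conditions at both ends
A = np.zeros((n, n))
for f in range(n - 1):
    A[f, f] += T[f];     A[f + 1, f + 1] += T[f]
    A[f, f + 1] -= T[f]; A[f + 1, f] -= T[f]
A[0, 0] += 2.0 * k[0] / dx              # half-cell transmissibilities at the boundaries
A[-1, -1] += 2.0 * k[-1] / dx
p = np.linalg.solve(A, q)

# Face fluxes and the cell-by-cell conservation check: (flux out - flux in) = source
flux = np.empty(n + 1)
flux[0] = -(2.0 * k[0] / dx) * (p[0] - 0.0)       # left boundary face
flux[1:n] = -T * (p[1:] - p[:-1])                 # interior faces
flux[n] = -(2.0 * k[-1] / dx) * (0.0 - p[-1])     # right boundary face
residual = (flux[1:] - flux[:-1]) - q
print("max local mass-balance error:", np.abs(residual).max())
```

The printed error is at machine precision by construction, which is exactly the local conservation property that mixed and DG methods provide and that Galerkin finite elements generally do not.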
The Instrumented Oil Field
Detect and track changes in data during production.
Invert data for reservoir properties.
Detect and track reservoir changes.
Assimilate data & reservoir properties into
the evolving reservoir model.
Use simulation and optimization to guide future production.
IPARS: Integrated Parallel and Accurate Reservoir Simulator
[Diagram: physics-based solvers. The physics (fractures, permeability tensor, multiple physics, flow regimes, heterogeneity, well operations) drives the numerical representation and discretization (FDM, MFE, CVM, DG, MPFA, mortar, domain decomposition) and the solvers (Krylov methods, AMG, AML, LU/ILU, HPC), with insights drawn from reinforcement learning, random graph theory, multiresolution analysis, and randomized algorithms.]
Why Multiscale?
 Subsurface properties vary on the scale of millimeters
 Computational grids can only be refined to the scale of meters or kilometers
 Multiscale methods are designed to allow fine scale features to impact a coarse scale solution
• Variational multiscale finite elements
 Hughes et al. 1998
 Hou, Wu 1997
 Efendiev, Hou, Ginting et al. 2004
• Mixed multiscale finite elements
 Arbogast 2002
 Aarnes 2004
• Mortar multiscale finite elements
 Arbogast, Pencheva, Wheeler, Yotov 2004
 Yotov, Ganis 2008
[Figure: upscaling fine-scale properties to a coarse simulation grid]
Basic Idea of the Multiscale Mixed Mortar Method
Multiscale Mortar Mixed Finite Element Method
Domain Decomposition and Multiscale

Domain Decomposition
For each stochastic realization, time step, and linearization:
• Compute data for the interface problem (subdomain solves)
• Precondition data and solve the interface problem (multiple preconditioner applications; each application requires multiple subdomain solves)
• Solve local problems given interface values (subdomain solves)

Multiscale Approach
Compute the multiscale basis for the coarse scale (multiple subdomain solves). Then, for each stochastic realization, time step, and linearization:
• Compute data for the interface problem (subdomain solves)
• Solve the interface problem (multiple linear combinations of the basis)
• Solve local problems given interface values (subdomain solves)

Domain Decomposition and Multiscale
Compute the multiscale basis for a training operator once (multiple subdomain solves). Then, for each stochastic realization, time step, and linearization:
• Compute data for the interface problem (subdomain solves)
• Precondition data and solve the interface problem with the multiscale preconditioner (a fixed number of preconditioner applications, i.e., a fixed number of subdomain solves)
• Solve local problems given interface values (subdomain solves)
(A linear-algebra sketch of this combined workflow follows below.)
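The following is a minimal linear-algebra analog of the workflow above, not the IPARS/mortar implementation: two subdomains coupled through an interface unknown, where the interface operator of a training problem is assembled once and reused as a preconditioner across realizations. All matrices, sizes, and the perturbation between realizations are made-up stand-ins.

```python
# Minimal linear-algebra analog of the workflow above (not the IPARS/mortar code):
# two subdomains coupled through an interface unknown; the interface operator of a
# training problem is assembled once and reused as a preconditioner for every
# stochastic realization.  All matrices and sizes are random stand-ins.
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)
ns, ni = 40, 10                                   # subdomain dofs, interface dofs

def spd(n, scale=1.0):                            # random symmetric positive definite block
    X = rng.standard_normal((n, n))
    return scale * (X @ X.T + n * np.eye(n))

def interface_operator(A1, A2, B1, B2, C):
    # Schur complement on the interface; in the real method every application
    # (or every column, when building a basis) costs subdomain solves.
    return C - B1.T @ np.linalg.solve(A1, B1) - B2.T @ np.linalg.solve(A2, B2)

B1, B2, C = rng.standard_normal((ns, ni)), rng.standard_normal((ns, ni)), spd(ni)

# Offline: interface operator for a training problem (e.g. mean permeability),
# one batch of subdomain solves, assembled into an explicit matrix.
A1_tr, A2_tr = spd(ns), spd(ns)
S_train = interface_operator(A1_tr, A2_tr, B1, B2, C)
M = LinearOperator((ni, ni), matvec=lambda r: np.linalg.solve(S_train, r))

# Online: each stochastic realization reuses the training operator as a
# preconditioner, so the number of expensive interface iterations stays small.
for real in range(3):
    A1 = A1_tr + spd(ns, 0.1)                     # stand-in for a nearby realization
    A2 = A2_tr + spd(ns, 0.1)
    S = interface_operator(A1, A2, B1, B2, C)
    g = rng.standard_normal(ni)
    iters = []
    lam, info = cg(S, g, M=M, callback=lambda xk: iters.append(1))
    print(f"realization {real}: {len(iters)} preconditioned CG iterations, "
          f"residual {np.linalg.norm(g - S @ lam):.2e}")
# The full method would now recover the subdomain solutions from lam
# ("solve local problems given interface values").
```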
Example: Uncertainty Quantification
 360x360 grid
 25 subdomains of equal size
 129,600 degrees of freedom
 Continuous quadratic mortars
 Karhunen-Loève expansion of the permeability truncated at 9 terms
 Second order stochastic collocation
 512 realizations
 Training operator based on the mean permeability
(A sketch of the stochastic sampling ingredients follows below.)
[Figures: mean permeability field, mean pressure, number of interface iterations, and interface solver time]
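As a hedged illustration of the stochastic ingredients listed above (not IPARS code), the sketch below builds a truncated Karhunen-Loève expansion of a log-permeability field and averages over a tensor-product collocation rule with 2 Gauss points per KL mode, giving 2^9 = 512 nodes. The grid size, covariance model, and correlation length are assumptions, and the averaged quantity is just the permeability itself; in the real workflow each collocation node would drive a flow solve.

```python
# Hedged sketch of the stochastic ingredients above (not IPARS code): a truncated
# Karhunen-Loeve expansion of log-permeability plus a tensor-product collocation
# rule with 2 Gauss points per KL mode (2**9 = 512 realizations).  The grid size,
# covariance model, and correlation length are assumptions.
import itertools
import numpy as np

n, corr_len, n_kl = 30, 0.2, 9                   # coarse stand-in grid, 9 KL terms
x = (np.arange(n) + 0.5) / n
X, Y = np.meshgrid(x, x, indexing="ij")
pts = np.column_stack([X.ravel(), Y.ravel()])

# Exponential covariance of log-permeability and its truncated eigen-expansion
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
vals, vecs = np.linalg.eigh(np.exp(-dist / corr_len))
vals, vecs = vals[::-1][:n_kl], vecs[:, ::-1][:, :n_kl]   # keep the 9 largest modes

def log_perm(xi):
    # log K(x) = mean + sum_k sqrt(lambda_k) * xi_k * phi_k(x), with mean taken as 0
    return vecs @ (np.sqrt(vals) * xi)

# 2-point Gauss-Hermite rule per mode (probabilists'), tensor product over 9 modes
nodes, weights = np.polynomial.hermite_e.hermegauss(2)
weights = weights / weights.sum()

mean_perm = np.zeros(n * n)
for combo in itertools.product(range(2), repeat=n_kl):    # 512 collocation nodes
    xi = nodes[list(combo)]
    w = np.prod(weights[list(combo)])
    mean_perm += w * np.exp(log_perm(xi))                  # weighted collocation average
print("collocation mean permeability in the first cell:", mean_perm[0])
```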
Example: IMPES for Two Phase Flow
 360x360 grid
 25 subdomains of equal size
 129,600 degrees of freedom
 Continuous quadratic mortars
 50 implicit pressure solves
 100 explicit saturation time steps per pressure solve
 Training operator based on the initial saturation
(A minimal IMPES sketch follows below.)
[Figures: absolute permeability field, initial saturation, number of interface iterations, and interface solver time]
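For reference, here is a minimal serial IMPES sketch, not the IPARS implementation: each outer step solves an implicit pressure equation and then takes 100 explicit upwind saturation steps with the velocity frozen, mirroring the 50 pressure solves and 100 saturation steps per solve quoted above. The 1D grid, quadratic relative permeabilities, and boundary pressures are made-up stand-ins.

```python
# Minimal serial IMPES sketch (illustrative only, not IPARS): an implicit pressure
# solve followed by 100 explicit upwind saturation steps with the velocity frozen,
# repeated for 50 pressure solves.  Grid, relative permeabilities, unit viscosities,
# and boundary pressures are made-up stand-ins.
import numpy as np

n, dx, dt = 100, 1.0 / 100, 2e-4
k = np.ones(n)                           # absolute permeability
s = np.zeros(n); s[0] = 1.0              # water saturation; water enters from the left
p_left, p_right = 1.0, 0.0               # fixed boundary pressures

def mobilities(s):
    return s**2, (1.0 - s)**2            # quadratic relative permeabilities, unit viscosities

for outer in range(50):                  # implicit pressure solves
    lam_w, lam_o = mobilities(s)
    lam_t = k * (lam_w + lam_o)          # total mobility
    # Implicit pressure solve: harmonic-average transmissibilities, Dirichlet BCs
    T = 2.0 * lam_t[:-1] * lam_t[1:] / (lam_t[:-1] + lam_t[1:]) / dx
    A = np.zeros((n, n)); b = np.zeros(n)
    for f in range(n - 1):
        A[f, f] += T[f];     A[f + 1, f + 1] += T[f]
        A[f, f + 1] -= T[f]; A[f + 1, f] -= T[f]
    Tl, Tr = 2.0 * lam_t[0] / dx, 2.0 * lam_t[-1] / dx
    A[0, 0] += Tl;   b[0] += Tl * p_left
    A[-1, -1] += Tr; b[-1] += Tr * p_right
    p = np.linalg.solve(A, b)

    # Total Darcy fluxes (positive to the right), frozen during the saturation steps
    u_in = -Tl * (p[0] - p_left)
    u = -T * (p[1:] - p[:-1])
    u_out = -Tr * (p_right - p[-1])

    for inner in range(100):             # explicit saturation steps per pressure solve
        lam_w, lam_o = mobilities(s)
        fw = lam_w / (lam_w + lam_o)     # fractional flow of water
        F = np.empty(n + 1)
        F[0] = u_in * (1.0 if u_in >= 0 else fw[0])        # injected fluid is pure water
        F[1:n] = np.where(u >= 0, fw[:-1], fw[1:]) * u     # upwinded interior fluxes
        F[n] = u_out * (fw[-1] if u_out >= 0 else 0.0)
        s = np.clip(s - dt / dx * (F[1:] - F[:-1]), 0.0, 1.0)

print("average water saturation after 50 pressure solves:", round(float(s.mean()), 3))
```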
Finite Element Oxbow Problem
FD & FEM Couplings: 3 Blocks with Fault
Solution
Continuous Measurement and Data Analysis for Reservoir Model Estimation
Source: E. Gildin, CSM, UT-Austin

Continuous Measurement and Data Analysis for Reservoir Model Estimation
[Diagram: closed-loop workflow. Field sensors and DAQ hardware acquire data; online analysis (data fusion, denoising, resampling, …) feeds data assimilation (EnKF); the assimilated state enters IPARS through a dynamic reservoir interface; optimization and supervisory control then drive the field controller(s).]
(An EnKF analysis-step sketch follows below.)
Source: I. Alvarado and D. Schmidt, NI
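The data-assimilation block in the loop above uses the ensemble Kalman filter. Below is a generic, hedged sketch of one EnKF analysis step with perturbed observations, not the NI/IPARS implementation; the state size, ensemble size, observation operator, and noise levels are made-up stand-ins for reservoir states and production data.

```python
# Generic, hedged EnKF analysis step with perturbed observations (not the NI/IPARS
# implementation).  State size, ensemble size, observation operator, and noise
# levels are made-up stand-ins for reservoir states and production data.
import numpy as np

rng = np.random.default_rng(1)
n_state, n_obs, n_ens = 50, 5, 40

X = rng.standard_normal((n_state, n_ens))        # forecast ensemble of model states
H = 0.1 * rng.standard_normal((n_obs, n_state))  # stand-in linear observation operator
R = 0.05 * np.eye(n_obs)                         # observation-error covariance
y = rng.standard_normal(n_obs)                   # stand-in measured data

# Ensemble anomalies and the sample-covariance Kalman gain K = P H^T (H P H^T + R)^-1
Xm = X.mean(axis=1, keepdims=True)
A = (X - Xm) / np.sqrt(n_ens - 1)
HA = H @ A
K = A @ HA.T @ np.linalg.inv(HA @ HA.T + R)

# Perturbed-observation update of every ensemble member
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
X_analysis = X + K @ (Y - H @ X)

print("forecast spread :", float((X - Xm).std()))
Xa_m = X_analysis.mean(axis=1, keepdims=True)
print("analysis spread :", float((X_analysis - Xa_m).std()))
```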
Parameter Estimation Using SPSA
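As a hedged illustration of this slide's topic, the sketch below implements the generic SPSA recursion (Spall's simultaneous perturbation gradient estimate with standard gain sequences), not the CSM history-matching code. The quadratic misfit and the parameter vector are made-up stand-ins for a data mismatch and reservoir parameters.

```python
# Hedged sketch of the generic SPSA recursion (Spall's simultaneous perturbation
# gradient estimate with standard gain sequences), not the CSM history-matching
# code.  The quadratic misfit and the parameter vector are made-up stand-ins.
import numpy as np

rng = np.random.default_rng(2)

def misfit(theta):
    # Stand-in for a history-matching objective (simulated vs. observed mismatch)
    return float(np.sum((theta - 1.5) ** 2))

theta = np.zeros(8)                               # e.g. permeability multipliers
a0, c0, alpha, gamma = 0.2, 0.1, 0.602, 0.101     # standard SPSA gain sequences

for k in range(200):
    a_k = a0 / (k + 1) ** alpha
    c_k = c0 / (k + 1) ** gamma
    delta = rng.choice([-1.0, 1.0], size=theta.size)       # Bernoulli +/-1 perturbation
    # Two objective evaluations give the full gradient estimate
    g_hat = (misfit(theta + c_k * delta) - misfit(theta - c_k * delta)) / (2.0 * c_k * delta)
    theta = theta - a_k * g_hat

print("estimated parameters:", np.round(theta, 3))
```

The appeal for history matching is that the cost per iteration is two simulator runs regardless of how many parameters are being estimated.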
Key Issues in CO2 Storage
 What is the likelihood and magnitude of CO2 leakage, and what are the environmental impacts?
 How effective are different CO2 trapping mechanisms?
 What physical, geochemical, and geomechanical processes are important over the next few centuries, and how do these processes impact storage efficacy and security?
 What are the necessary models and modeling capabilities to assess the fate of injected CO2?
 What are the computational needs and capabilities to address these issues?
 How can these tools be made useful and accessible to regulators and industry?
[Figure: schematic of CO2 leakage from a deep brine aquifer, with groundwater flow toward a drinking-water aquifer]
Global Experience in CO2 Injection
From Peter Cook, CO2CRC
CO2 Sequestration Modeling Approach
 Numerical simulation
 Characterization (faults, fractures)
 Appropriate gridding
 Compositional EOS
 Parallel computing capability
 Key processes
 CO2/brine mass transfer
 Multiphase flow
• During injection (pressure driven)
• After injection (gravity driven)
 Geochemical reactions
 Geomechanical modeling
IPARS-COMP
[Diagram: IPARS-COMP framework modules, including gridding, parallel solvers, EOS compositional module, geomechanics, geochemical reactions, thermal, two-phase flash, physical properties, numerics, and graphics]
IPARS-COMP Flow Equations
Mass Balance Equation (for each component i):
$\frac{\partial N_i}{\partial t} = -\nabla \cdot \left( \xi_i \mathbf{u} - \phi S \mathbf{D}_i \nabla \xi_i \right) + q_i$
where $N_i$ is the concentration of component i, $\mathbf{u}$ the Darcy velocity, $\mathbf{D}_i$ the diffusion-dispersion tensor, and $q_i$ the source/sink term.
Pressure Equation
Solution Method
 The pressure and mass balance equations are iteratively coupled until a volume balance convergence criterion is met or a maximum number of iterations is exceeded. (A schematic sketch of this loop follows below.)
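A schematic, toy-scale sketch of that iterative coupling loop (not IPARS code): a stand-in pressure update and a stand-in fluid-volume (EOS) evaluation are alternated until the volume-balance residual falls below a tolerance or the iteration cap is hit. The compressibility values and initial state are assumptions chosen only so the loop converges in a handful of iterations.

```python
# Schematic, toy-scale sketch of the iterative coupling loop (not IPARS code):
# alternate a stand-in pressure update and a stand-in fluid-volume (EOS)
# evaluation until the volume-balance residual is small or the iteration cap
# is reached.  Compressibilities and the initial state are assumptions.
pore_volume = 1.0
tol, max_iters = 1e-8, 20

def fluid_volume(N, p):
    # Stand-in equation of state: volume occupied by total fluid mass N at pressure p
    return N / (1.0 + 0.10 * (p - 1.0))

def pressure_update(N, p):
    # Stand-in implicit pressure step: raise pressure in proportion to the excess
    # fluid volume, using an approximate compressibility (deliberately inexact,
    # which is why the coupling must be iterated)
    c_approx = 0.12
    return p + (fluid_volume(N, p) - pore_volume) / (c_approx * pore_volume)

p, N = 1.0, 1.05                  # pressure and total mass after a transport/flash update
for it in range(max_iters):
    p = pressure_update(N, p)
    residual = abs(fluid_volume(N, p) - pore_volume) / pore_volume
    print(f"iteration {it}: volume-balance residual = {residual:.2e}")
    if residual < tol:            # volume balance convergence criterion
        break
```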
Thermal & Chemistry Equations
Energy Balance
$\frac{\partial (M_T T)}{\partial t} + \nabla \cdot \Big( \sum_\alpha \xi_\alpha C_{p\alpha} \mathbf{u}_\alpha T \Big) - \nabla \cdot \left( \lambda \nabla T \right) = q_H$
Internal energy coefficient: $M_T = (1-\phi)\,\rho_s C_{vs} + \phi \sum_\alpha \xi_\alpha C_{v\alpha} S_\alpha$
 Solved using a time-split scheme (operator splitting)
 Higher-order Godunov for advection
 Fully implicit/explicit in time and mixed FEM in space for thermal conduction
Chemistry
 System of (nonlinear) ODEs
 Solved using higher-order integration schemes such as Runge-Kutta methods
(A minimal operator-splitting sketch follows below.)
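A minimal sketch in the spirit of the split scheme above (not the IPARS-COMP code): each time step first advects a scalar with an explicit first-order upwind (Godunov) step, standing in for the higher-order Godunov advection, and then integrates a stand-in nonlinear reaction ODE in every cell with classical RK4. The velocity, rate law, grid, and time step are assumptions.

```python
# Minimal operator-splitting sketch (not IPARS-COMP): an explicit upwind
# (first-order Godunov) advection sub-step, standing in for the higher-order
# Godunov scheme, followed by RK4 integration of a stand-in chemistry ODE
# in every cell.  Velocity, rate law, grid, and time step are made up.
import numpy as np

n, dx, dt, u = 100, 1.0 / 100, 5e-3, 1.0       # grid, split time step, constant velocity
c = np.zeros(n); c[:10] = 1.0                  # initial concentration slug

def reaction_rate(c):
    return -2.0 * c**2                         # stand-in nonlinear (second-order) decay

def rk4_step(c, dt):
    # Classical 4th-order Runge-Kutta for dc/dt = reaction_rate(c)
    k1 = reaction_rate(c)
    k2 = reaction_rate(c + 0.5 * dt * k1)
    k3 = reaction_rate(c + 0.5 * dt * k2)
    k4 = reaction_rate(c + dt * k3)
    return c + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

for step in range(100):
    # 1) Advection sub-step: upwind fluxes for u > 0, CFL = u*dt/dx = 0.5
    F = np.empty(n + 1)
    F[0] = 0.0                                 # no inflow at the left boundary
    F[1:] = u * c                              # upwind: use the cell to the left of each face
    c = c - dt / dx * (F[1:] - F[:-1])
    # 2) Chemistry sub-step: integrate the reaction ODE cell by cell with RK4
    c = rk4_step(c, dt)

print("total mass remaining:", round(float(c.sum() * dx), 4))
```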
Coupled Flow-Thermal-Chemistry Algorithm
CO2 EOR Simulations
Validation
SPE5: quarter of a five-spot benchmark WAG problem
3-phase, 6 components: C1, C3, C6, C10, C15, C20
IPARS-CO2 vs. CMG-GEM
[Figures: cumulative oil produced and cumulative gas produced/injected]
Validation
CO2 pattern flood injection
3-phase, 10 components: CO2, N2, C1, C3, C4, C5, C6, C15, C20
IPARS-CO2 vs. CMG-GEM
[Figures: CO2 concentration and cumulative gas injected/produced]
Parallel Simulations
Modified SPE5 WAG injection
 Permeability from SPE10
 160x160x40 (1,024,000 cells)
 32, 64, 128, 256, 512 processors
[Figures: oil pressure and water saturation at 3 years; gas saturation and propane concentration at 3 years]
Parallel Scalability
Hardware:
• Lonestar (Texas Advanced Computing Center, The University of Texas at Austin): Linux cluster; 1,300 nodes / 5,200 cores; 2.66 GHz dual-core Intel Xeon 5100 processors; peak 55 TFlops/s; 8 GB/node; InfiniBand network, 1 GB/s
• Blue Gene/P: CNK compute kernel with Linux I/O nodes; 262,144 nodes / 1,048,576 cores; 850 MHz IBM CU-08 processors; peak ~1 PFlop/s; 2 GB/node; 10 Gb Ethernet network, 1.7 GB/s
Software:
• Solvers: GMRES, BCGS, LSOR, multigrid
• MPI: MVAPICH2 library for parallel communication
(A serial solver stand-in is sketched below.)
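As a serial, toy-scale stand-in for the solver configuration above (not the IPARS solver stack), the sketch below runs restarted GMRES on a sparse pressure-like system with an incomplete-LU preconditioner substituting for the multigrid preconditioner; the parallel MPI (MVAPICH2) layer is omitted and the matrix is a standard 5-point Laplacian.

```python
# Serial, toy-scale stand-in for the solver stack above (not the IPARS solvers):
# restarted GMRES on a sparse pressure-like system, with an incomplete-LU
# preconditioner substituting for the multigrid preconditioner.  The parallel
# MPI (MVAPICH2) layer is omitted and the matrix is a standard 5-point Laplacian.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 50                                            # toy 50x50 grid
I = sp.identity(n)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()       # 2500x2500 pressure-like matrix
b = np.ones(A.shape[0])

ilu = spla.spilu(A, drop_tol=1e-4)                # ILU factorization as the preconditioner
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

iters = []
x, info = spla.gmres(A, b, M=M, restart=30, callback=lambda rk: iters.append(1))
print("converged" if info == 0 else "not converged",
      "after", len(iters), "preconditioned GMRES iterations;",
      "residual =", np.linalg.norm(b - A @ x))
```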
Scalability on Ranger (TACC) & Blue Gene/P
GMRES solver with multigrid preconditioner
3,500 ft x 3,500 ft x 100 ft reservoir
40x160x160 = 1,024,000 elements
CPUs: 32, 64, 128, 256, 512, 1024
[Figures: scalability curves on Ranger (TACC) and Blue Gene/P]
CO2 Storage Benchmark Problems
A Benchmark Study on Problems Related to CO2 Storage in Geological Formations: Summary and Discussion of the Results
H. Class, A. Ebigbo, R. Helmig, et al., 2008
Benchmark Problem 1.1
CO2 Plume Evolution and Leakage via an Abandoned Well
Objective: quantification of the leakage rate; deep aquifer at 2840-3000 m, K = 20 md, φ = 0.15, P = 3.08x10^4 kPa
Output:
1. Leakage rate = CO2 mass flux through the leaky well as a percentage of the injection rate
2. Maximum leakage value
3. Leakage value at 1000 days
Benchmark Problem 1.1
Leakage Rate of CO2
 CO2 breakthrough: 10 days
 Peak leakage value: 0.23%
 Final leakage value: 0.11%
 Agrees with the semi-analytic solution (Nordbotten et al.)
Comparison with Published Results at 80 days
[Figures: IPARS-COMP gas saturation and pressure compared with Ebigbo et al., 2007]
Frio Brine CO2 Injection Pilot
Bureau of Economic Geology, Jackson School of Geosciences, The University of Texas at Austin
Funded by DOE NETL
Frio Brine Pilot Site
 Injection interval: 24-m-thick, mineralogically complex fluvial sandstone; porosity 24%; permeability 2.5 D
 Unusually homogeneous
 Steeply dipping (16 degrees)
 7 m perforated zone
 Seals: numerous thick shales, small fault block
 Depth 1,500 m
 Brine-rock system, no hydrocarbons
 150 bar, 53 °C, supercritical CO2
[Figure: site cross-section showing the injection interval and nearby oil production]
From Ian Duncan
Frio Modeling Effort
 A stair-stepped approximation on a 50x100x100 grid (~70,000 active elements) was generated from the given data. The figure shows porosity in the given and approximated data.
Solution Profiles
 Pressure and a close-up top view of gas (CO2) saturation at t = 33 days. Simulations were run on 24 processors of the bevo2 cluster at CSM, ICES.
CO2 Plume Transport
 CO2 saturation as seen below the shale barrier at t = 2 and 33 days. The breakthrough time is observed to be close to 2 days.
Current Research Activities at CSM
Model CO2 injection in deep saline aquifers or depleted oil and gas reservoirs using the parallel compositional reservoir simulator IPARS-CO2:
 Large scale parallel computing
 Efficiency with different solvers
 Couple IPARS-CO2 with geochemistry
 Couple IPARS with geomechanics
 Enhance the EOS model and physical property models (effects of salt, hysteresis, etc.)
 Data sources, field sites, and practical applications (in collaboration with Duncan from BEG at UT)
 Gridding and a posteriori error estimators
 Optimization
 Risk and uncertainty analysis