The birth of Paige’s sonic log simulator on Russian data around 2000 AD
The profile on the far right simulates a sonic log at that position in the section.
If you are returning, click on oval to go to routing section for another topic choice.
Of course the fault interpretation is new as of this posting, and the exercise is making me reconsider my normal mode
of display (since I find the strike slip faults easier to see here). I had switched away from “wiggle trace” because it
does not highlight amplitude, which is all-important in spotting reservoirs. I did not realize what I was losing. The fact
that different display modes bring out different attributes just emphasizes the vital need for flexible human
visualization, especially where stratigraphy is complex. Current over-automation in seismic interpretation is a losing
way to economize.
However, since the source material is long gone, I continue with my old display mode for the material that follows on the validity of my inversion logic.
The convolution of the evolving down wave with the reflecting coefficients can
be thought of as the “trace equation”. Reversing this process by removing the
effect of wavelet shape is seismic inversion. Determining this shape is tough
enough with pure data, but when there are foreign events masquerading as
primary reflections, each type with its own shape variation, we really have a
problem.
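For reference, and in my own notation rather than the author's, this "trace equation" is the standard convolutional model: each recorded trace sample s is the down wave w convolved with the reflection coefficients r, plus noise n,

s_k = \sum_{j} w_j \, r_{k-j} + n_k

and inversion, as described here, is the attempt to recover r (and the wavelet w) from s.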
We know from experience that earth filtering is a function of distance traveled, and we know there is not only a
large variation in travel path between noise types and signal, but even larger ones between event sets arriving
at different receiver offsets.
Rock elasticity is counterintuitive, yet we know it exists. When we hit a rock layer with a sharp spike of energy, it
is going to vibrate, eventually leading to multiple lobes in the down wave. Initial resistance to this movement
leads to a broadening of the lobes and a reduction of the sharp edges (higher frequencies). Mathematically
describing the shape at any point in time keeps experts busy, but it seems a little silly when statistical averaging of
the actual events in a time/space zone provides the best guess possible under the circumstances.
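As a rough sketch of that statistical-averaging idea (a generic illustration in Python; the function name, window length, and normalization here are my assumptions, not the author's code):

```python
import numpy as np

def average_wavelet(trace, event_times, half_len=25):
    """Estimate a wavelet by averaging trace segments centered on picked
    events within a time/space zone (a generic sketch, not the author's logic)."""
    segments = []
    for t in event_times:
        if half_len <= t < len(trace) - half_len:
            seg = trace[t - half_len : t + half_len + 1]
            peak = np.max(np.abs(seg))
            if peak > 0:
                # normalize so strong events do not dominate the average
                segments.append(seg / peak)
    if not segments:
        raise ValueError("no usable events in this zone")
    return np.mean(segments, axis=0)
```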
For a deeper understanding it is helpful to get inside the convolution process. To compute one trace value we
line up the set of down wave amplitudes with the corresponding set of reflection coefficient amplitudes, cross-multiply and sum over the length of the down wave. To get the next trace value, we shift the process over one
reflection coefficient value and repeat the sequence. Thinking it through at this level emphasizes the difficulty
that foreign events and travel path differences introduce.
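A bare-bones rendering of that cross-multiply-and-sum step (Python; a generic illustration of discrete convolution, not the author's code):

```python
def trace_value(down_wave, refl_coeffs, k):
    """One synthetic trace sample: line up the down wave against the
    reflection coefficients, cross-multiply and sum over its length."""
    total = 0.0
    for j, w in enumerate(down_wave):
        i = k - j                       # the next sample shifts this by one
        if 0 <= i < len(refl_coeffs):
            total += w * refl_coeffs[i]
    return total

def synthetic_trace(down_wave, refl_coeffs):
    """Repeat the step for every output sample of the trace."""
    n_out = len(refl_coeffs) + len(down_wave) - 1
    return [trace_value(down_wave, refl_coeffs, k) for k in range(n_out)]
```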
Integration seems to improve down wave resolution. The ADAPS system works in cycles, using
its current down wave and reflection coefficient spikes to create new event guesses. It saves the data values
that produced these new events, and at the end of the cycle, it uses the saved values to compute a new down
wave. It then calls its integration logic to try to make sense out of this new set of reflection coefficients, computing
the amount of trace energy removed. When it can no longer improve, it exits the cycle logic. For some reason,
including the integration logic in this working cycle seems to force an earlier convergence.
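A schematic outline of that working cycle, as I read the description above (Python; the helper functions are placeholders I have assumed, and this is in no way the ADAPS code itself):

```python
import numpy as np

def inversion_cycle(trace, wavelet, pick_events, estimate_wavelet, integrate,
                    max_cycles=20):
    """Cycle: guess events with the current down wave, re-estimate the down
    wave from the data behind those events, integrate, and stop when the
    residual trace energy no longer improves."""
    spikes = np.zeros_like(trace)
    profile = integrate(spikes)
    best_energy = float(np.sum(trace ** 2))
    for _ in range(max_cycles):
        new_spikes = pick_events(trace, wavelet)          # new event guesses
        modeled = np.convolve(new_spikes, wavelet)[: len(trace)]
        residual = float(np.sum((trace - modeled) ** 2))  # energy-removed check
        if residual >= best_energy:
            break                                         # no improvement: exit
        best_energy = residual
        spikes = new_spikes
        wavelet = estimate_wavelet(trace, spikes)         # new down wave
        profile = integrate(spikes)                       # integration in-cycle
    return spikes, wavelet, profile
```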
The proof of our wavelet optimization lies in the consistent excellence of our well log matches, as seen next.
Raw seismic sections are “coincidental” mixtures of primary reflections.
The amplitudes and polarities of stacked events depend on how the primary reflections were aligned when
they reached the recorder.
These alignments depend on effective velocities (a function of distance from source to receiver). Earth filtering
continually produces trailing side lobes with distance traveled. Because of large differences in recording travel
time, these character changes can be significant. Of course they occur before any processing is possible.
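To make the alignment point concrete, the textbook normal-moveout relation (standard physics, not specific to this work) puts an event with zero-offset time t_0 at offset x at approximately

t(x) = \sqrt{ t_0^2 + x^2 / v_\mathrm{eff}^2 }

so an error in the effective velocity v_eff shifts far offsets much more than near ones, and the stack sums mis-aligned, differently filtered copies of the same reflection.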
In this mix, the strong will prevail. The amplitude of each primary reflection is a function of both the velocity of
the previous layer, and that of the current one. Thus the lower interface between a sand and a limestone will
generate a strong event, making the sand look like a lime, whereas that between a sand and a shale will be
weaker. Of course the same is true on the upper interface. The point here is that trusting the amplitude of any
stacked event before inverted data is integrated can be specious, at best. In my own development work I was
continually surprised (and pleased) at how the integrated results matched with available well logs. Not perfect,
but certainly a big improvement.
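The standard normal-incidence reflection coefficient behind the sand/limestone point above (textbook physics, with density rho and velocity v for the upper (1) and lower (2) layers) is

R = \frac{\rho_2 v_2 - \rho_1 v_1}{\rho_2 v_2 + \rho_1 v_1}

so a sand against a fast, dense limestone gives a much larger |R| than the same sand against a shale, regardless of what the sand itself is like.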
The simulated sonic log section at
the left is a good example of where
integration after inversion made
good sense out of amplitudes. The
well on this strong red event
matched beautifully, to the point that we
could have predicted success if the
run had been made before drilling.
A few words on technique
It may be that every last thing is explainable mathematically. At least there is no proof that it is not so. Certainly
this seems to be the hypothesis for those who believe solving seismic problems via a set of provable equations is
the only valid way to go. Way back when I introduced my inversion logic this certainly was the prevailing view. My
method of using a series of optimized guesses met with some derision. However, as I show in slides that follow,
my methods yield remarkable matches with independent well based sonic logs, and that should be a deciding
test. In any case, that is the approach I discuss in this presentation.
The well match to a sonic log is our logical proof,
and polarity is the key to check.
The non-linear inversion logic
starts with the first depth point on the line. It
detects the position of the strongest primary
event, stores it and subtracts the energy from the
trace. It repeats until it fails to find an event that
is strong enough, moving on to the next depth
point when it fails. It continues this cycle to the
end of the assigned segment. At the end of each
of these passes it tests to see if it reduced the
trace energy, exiting upon no improvement.
Using this set of coefficient guesses it computes
a new working wave, refreshes the original data
and goes back for another complete pass.
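The sketch below renders one pass of that logic for a single depth point (Python; my own simplification of the description, with an assumed correlation-based event detector, not the production code):

```python
import numpy as np

def detect_and_subtract(trace, wavelet, threshold):
    """One pass at one depth point: repeatedly find the strongest primary
    event, store it, and subtract its energy from the trace."""
    residual = np.array(trace, dtype=float)
    events = []                                  # (position, amplitude) guesses
    n_w = len(wavelet)
    norm = float(np.dot(wavelet, wavelet))
    while True:
        # correlate the residual with the down wave to locate the strongest event
        corr = np.correlate(residual, wavelet, mode="valid") / norm
        k = int(np.argmax(np.abs(corr)))
        amp = float(corr[k])
        if abs(amp) < threshold:                 # nothing strong enough: stop
            break
        events.append((k, amp))
        residual[k : k + n_w] -= amp * wavelet   # remove that event's energy
    return events, residual
```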
There is a time for ultra seismic accuracy
But that comes after we’ve located something worth looking at.
The big cost of looking comes with 3D coverage. The question is whether
we can get enough resolution from 2D for our initial search. The purpose of
this presentation is to argue that the answer is yes, given spread re-designs
that favor shallow to moderate-depth targets. Resolution may well be
even better, even at relatively great depths.
My thesis is that heavy noise is intertwined
with the signal in most land prospects
(as is illustrated at the right, where the
bulk of it has been removed by pattern
recognition methods). The power of the
removal logic depends on how many
traces it has to work with. Tightening the
spread makes a major difference.
The depressed exploration environment makes innovative approaches like
this necessary. New (short spread) acquisition will be needed to prove the
2D approach viable. I stand ready to provide free processing for an
extended period.
From reflection basics on
Arguments against conventional practices exist and they should be
heard. This show touches on a few I feel strongly about.
It starts with a warning that normal stacks are a complex mixture of
primary events plus noise, not representing lithology. It goes on to
show how wave shapes differ significantly between offsets, raising
questions on AVO validity.
Non-linear inversion and integration come next, with emphasis on
how integration makes stratigraphic sense out of the complex mixture
of primary events and noise. A shale play example is used.
A fairly spectacular coherent noise removal finishes the primer set,
and the series closes with 11 “before and after” examples with sonic
log matches. Study of these matches not only proves the non-linear
logic used, but emphasizes why thickness predictions facilitate long
range stratigraphic correlations, especially across complex faulting.
An advanced explanation of a few often overlooked seismic principles.
The difficulty of communicating overlapping subjects is that focusing on one requires explaining
the others. My work over the last 45 years has covered non-linear inversion, sonic log simulation
(through subsequent integration), coherent noise removal & strike slip fault tracking. Since my
productive time has about run out, a current goal is to convince the industry that my ideas have
merit. This personal need is fostered by my opinion that my logic beats all others. Most of my
postings over the past few years have concentrated on one or another of these four topics. This
presentation tries to bridge all of them. Because of industry structuring, interpreters often are not
exposed to the technical details I cover. Hopefully my graphical explanations will help.
BEFORE
AFTER
How Paige’s inversion and integration can make all the difference.
Proving once more that straight stacks represent
jumbled sets of primary reflections and noise that
need to be integrated into a simulation of lithology
before they are of much use.
Even though the improvement above is significant, working with the data tells me
there still is a lot of improvement within sight. An example is doing the inversion
before stack. Luckily, because of the modular architecture of the inversion logic it
was possible to work this new concept into the overall structure.
Optimized stack
inverted and integrated
You may have seen this “before and after”
but some things badly need repeating.
Let us first look at the central well based sonic log
that overlays both sides of the comparison. The goal of
my logic is to duplicate this log, lobe by lobe. In my effort
to eliminate noise, the study of this comparison is my
main test of success. While this result is not bad, it is far
from perfect (circled areas are good; an arrow points to a
bad one). My assumption is that serious noise was
missed at that point, since most of the match is quite
good.
Starting with error – On the last slide I pointed out that
substantial differences in travel times to and from offsets
create significant wavelet shape changes. Thus traces
going into each gather all have different shapes, and the
trace character of that gather is dependent on how they
mix. Therefore, even disregarding probable noise, we go
into the final stack with imperfect statistics.
To go from complex sets of primaries to lithology is
an ambitious jump, considering the problems faced. The
trick is getting the best answer possible under the
circumstances. Linear, equation driven mathematics
does not give us the flexibility we need to adjust to the
reality of nature, so we turn to statistical optimization and
advanced pattern recognition. We use this power to spot
the onsets of the actual primary events with an educated
guess at the down wave shape, then use this answer to
improve on the primary event guesses, then use this set
to improve on wavelet shape. Amazingly enough, this
really works as proven by countless well matches like
this one.
Predicting bed thickness is the target discussed on the
next slide. This data is from a shale play, and increasing
resolution to the point of recognizing thick beds is vital.
And here is an expanded view featuring the target shale. Some shale detail is already visible.
My non-linear inversion and integration has done a remarkable job (as shown by the sonic log match). Much of
the credit goes to the pre-stack noise removal logic. However, it was clear during this phase that we did not get
all of the noise (we did not even concentrate in this zone). The next slide shows how complex the mix can be.
To the left we have a raw gather and to the right the same data with noise lifted. I say the jumbled mess we
have on this input is typical of land data. At first glance one would suspect a severe statics problem. On
the right, however, we see that we can bring out the desired signal by predicting and lifting off the noise.
The fact is such noise is present on every prospect, to some degree. The problem
is seeing it! To do so requires intense pattern searching on the gathers, and this is not normally done. When we
add the factors I outline next, such things as AVO claims and frequency domain waveform generation become
questionable, and the need for non-linear approaches becomes apparent.
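As one very simple illustration of predicting and lifting a coherent event off a gather (a generic slant-stack/median estimate of a linearly moving event; this is my own sketch, not the author's pattern-recognition logic, and every name and parameter here is an assumption):

```python
import numpy as np

def remove_linear_noise(gather, offsets, slowness, dt):
    """Estimate and subtract one linearly moving coherent event from a gather.

    gather   : 2-D array, shape (n_traces, n_samples)
    offsets  : source-receiver offset of each trace
    slowness : moveout of the noise, in seconds per offset unit
    dt       : sample interval in seconds
    """
    shifts = np.round(np.asarray(offsets) * slowness / dt).astype(int)

    # flatten the gather along the noise moveout and stack to predict the noise
    flattened = np.array([np.roll(tr, -s) for tr, s in zip(gather, shifts)])
    noise_pilot = np.median(flattened, axis=0)   # median resists signal leakage

    # shift the pilot back to each trace's moveout and lift it off
    cleaned = np.array([tr - np.roll(noise_pilot, s)
                        for tr, s in zip(gather, shifts)])
    return cleaned    # note: np.roll wraps at the edges; real code would pad
```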
Before
Now for several before and
after examples. This one is
a Buda/Del Rio comparison.
The intent of this presentation is to
promote new processing work that
is oriented to improving the ability
to “see” fractures (as well as other
stratigraphic features).
The ability to closely simulate a sonic
log (without the entry of any well data)
provides a logical proof that all of the
internal processes are valid.
After
Because it is so important that readers
trust the honesty of the logic, I have
included a bunch of before and after
examples, all with outstanding “after the
processing fact” well matches. Where
one good match might be an accident,
the dozens I have shown demonstrate a very
solid consistency. I have never had to
exclude one.
At the time these runs were made, I had
permission to show results. Of course
no location information is visible, partly
because I make it a practice to not know
myself.
Before
When this was first run,
my attention was on the great improvement
in seismic resolution coming from the
“simulation of lithology” logic. It was not
until preparing this slide that I realized what
a prime example this is of how common
the strike slip fault problem is, and how
the reservoirs are bounded by the fault breaks.
I ask you to consider both.
After
Please note the minute well match details,
including the polarity flip-flops.
If you have made the common mistake of
thinking the match on the straight stack is
good, look again at the polarities. Protocol
is red left and blue right.
There is a very valuable lesson to be learned
studying the results below this point. The
resolution provided by the integration has
flipped almost every event polarity. This is
why inversion and integration are required
before any stratigraphic assumptions are
made.
Rocky Mountain post-stack.
This is Rocky Mountain post stack input. It is
obvious no one was concerned with polarity on
their well matches.
On the next slide you’ll see an almost perfect
match on the inverted and integrated result.
So please toggle.
A “post stack” example where the perfection of these
next matches makes me a little sad as well, as I start
to shut my operation down, having failed to sell my
concepts after over 50 years of effort. On the other
hand I have enjoyed the battle.
I will admit that the parameters were tailored to
Vibroseis input.
Here I show the results first. Again note the
remarkable match.
Please toggle with straight
stack input.
(Please, no philosophy).
Toggle back.
Rocky Mountain (post stack) stacked input, line Y
And one more pair.
Please toggle w results
Rocky Mountain (post stack) inversion and integration, line Y
From here on, click on subject image to go to content.
Strike slip (parallel throw) faulting should be accepted as a geological fact.
These faults are caused by shallower beds being pulled apart by
the deep plate movements of continental drift.
Because their primary movement is lateral, they often do not
exhibit any vertical throw, and thus are hard to track. They went
unnoticed for years because seismic resolution was not good
enough to see the fault patterns. I say my “simulation of sonic
log” logic enabled me to do the tracking shown at the left. The
emphasis on thickness, and the simplification of the
stratigraphic picture were key. Click on this picture to go to the
strike slip content. It is a large show so give it time to load.
Several more groupings are being constructed and will be here soon.
The fact that intertwined noise
exists (at least) on all land data
is perhaps the most important
message in this series.
And attacking the central cone
noise mistakenly thought of as
ground roll helped define the
evolution of a salt dome.
The end.
Before
After
I switch to an effort to map strike
slip faults in the North Sea. The pro
bono project started as a trial of my non-linear
inversion, but deep results were too random
for any satisfactory answer. I then switched my
attention to noise removal, and finally began to
see a fault pattern emerge. As you will see,
what looked like a noise mishmash became a
reasonable geological picture after the faults
were picked.
To the far left you see the best the client could
do on a cross line that cut the discovery well.
To the right you see my final result, showing a
pretty fair sonic log match. This cross line tied
into a series of in-lines that tied to the well data
here. This series is shown next. A blow up of
the well match is at the lower left.
The same well image is shown on all. Note
the U1 regional shale identified as event B. It
plays a major role in establishing the fault
pattern on the series of slides that follows.
Clicking through the series is most helpful.
Hopefully you will agree the strike slip faults I
show on this cross line are supported by the
in-lines that follow.
Click on upper picture for North Sea series