In the third part of a series on the future of gas exploration in Australia, New Scientist examines the sophisticated new sensing and analysis techniques for sniffing out hydrocarbons in ever more complex geological formations
At the same time, other companies are eyeing the next generation of offshore gas reserves. These will come from ultra-deep wells stretching 1500 metres or more beneath the ocean floor.
These reservoirs are the last frontier for hydrocarbon discovery. They’re not particularly hard to find: the challenge is to work out what they contain, how much can be extracted and at what cost. The sheer dimensions and depth of these fields make this time-consuming and expensive. But it is the geological complexity of the rock that makes these reservoirs so hard to characterise and this kind of exploration so financially risky.
That’s why a new generation of engineers is developing techniques that can characterise these regions in greater detail than ever before. The ultimate goal is to develop three-dimensional computer models of vast regions of gas-bearing rock that reveal not only how much gas is trapped but also the geological and chemical environment in which it sits, and even how this will change over time as the reserve is tapped.
For a conventional gas field, it is relatively straightforward to work out how much gas can be extracted because the gas is free to flow but usually trapped beneath an impermeable layer, like a bubble. Most of this can be extracted by drilling through the impermeable rock, allowing the gas to flow up and out.
But so-called unconventional resources such as shale gas and coal seam gas are much harder to characterise because the gas cannot flow easily through these rocks. The amount that can be extracted depends on the network of natural cracks in the rock, how easily it can be fractured to produce more cracks and hence release more gas, whether the gas is chemically bound to the rock and so on. Then there is the type of rock, the shape of the formation, how easy it is to drill into – the list goes on.
Oil and gas companies typically characterise resources through a combination of exploratory drilling and sophisticated computer models. They create these models by first mapping promising sections of geological basins – on land or underwater – by transmitting seismic waves into the ground and measuring the way they are reflected and bent by deep geological features.
One long-standing problem for these kinds of surveys is the vast layers of salt deep underground that tend to blur seismic waves, making it hard to probe what lies beneath. Salt is impermeable to oil and gas, so important resources can be trapped beneath these layers.
To get around this problem, engineers have developed mapping techniques capable of looking beneath the salt. BP’s Wide-Azimuth Towed Streamer (WATS) system represents the state of the art in creating 3D seismic images. WATS uses several ships sending acoustic waves into the water from different positions. These are flanked by other boats towing long floating cables carrying hydrophones to record the echoes. These cables can be several kilometres apart (see diagram).
This distance ensures that the hydrophones receive the same rebounding seismic waves from a variety of angles. This provides a 3D perspective that can reveal what lies beneath salt layers.
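The geometry behind this is simple enough to sketch. For a flat reflector, the two-way travel time of an echo depends on both the source-receiver offset and the reflector's depth, so recording the same reflection at several offsets over-determines the problem and lets you solve for depth. A toy illustration in Python (the constant velocity, the flat-layer assumption and all the numbers are simplifications for the sketch, not a description of BP's processing):

```python
import math

# Two-way travel time (seconds) of an echo off a flat reflector,
# assuming a single constant acoustic velocity above it.
def travel_time(offset_m, depth_m, velocity_ms=1500.0):
    path_m = math.sqrt(offset_m ** 2 + (2 * depth_m) ** 2)
    return path_m / velocity_ms

# Simulated echoes from one reflector, recorded at several offsets.
true_depth_m = 3000.0
offsets_m = [0.0, 2000.0, 4000.0, 6000.0]
times_s = [travel_time(x, true_depth_m) for x in offsets_m]

# Recover the depth: find the candidate that best fits all offsets at once.
def misfit(depth_m):
    return sum((travel_time(x, depth_m) - t) ** 2
               for x, t in zip(offsets_m, times_s))

best_depth_m = min(range(1000, 5001, 10), key=misfit)
print(best_depth_m)  # → 3000
```

A single offset cannot separate depth from velocity, but many offsets, and in WATS many azimuths as well, constrain the picture enough to image past distorting layers such as salt.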
These surveys are also getting faster thanks to new techniques that reduce noise and filter out unwanted reflections. This allows engineers to use natural seismic waves generated by the Earth itself. And a new generation of sensing techniques uses a single optical fibre to sense the reflected seismic waves along its entire length. This allows large areas to be monitored continuously for long periods of time. “The cost of seismic acquisition drops to almost nothing,” says geophysicist Roman Pevzner at Curtin University in Perth, Australia. Such long-term monitoring can be useful for tracking which parts of a reservoir are being exhausted and which should be drilled next.
Seismic data is just the start, however. It can be combined with electromagnetic surveys – which can distinguish between brine, which acts as a conductor, and gas, which has a much higher resistance – and with gravity measurements from satellites, which reveal different densities in the layers of rock below the ground, to give a more complete picture without resorting to expensive exploratory drilling.
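The contrast those electromagnetic surveys exploit can be sketched in a few lines: brine-saturated rock conducts electricity well, while gas-bearing rock resists it strongly, so unusually resistive intervals are worth a closer look. A toy classifier along those lines (the depths, resistivity values and threshold are invented for illustration, not field data or an industry standard):

```python
# Resistivity (ohm-metres) at a series of depths (metres).
# Brine-saturated rock is typically a good conductor; gas-bearing
# rock is far more resistive. All values here are illustrative.
profile = {1200: 1.5, 1400: 2.0, 1600: 45.0, 1800: 1.8}

RESISTIVE_THRESHOLD = 20.0  # illustrative cut-off only

def flag_resistive_layers(profile, threshold=RESISTIVE_THRESHOLD):
    """Return depths whose high resistivity suggests gas rather than brine."""
    return [depth for depth, rho in sorted(profile.items()) if rho > threshold]

print(flag_resistive_layers(profile))  # → [1600]
```

In practice such flags are not diagnostic on their own; they are weighed alongside the seismic and gravity data to build the overall picture.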
“Once we have sufficient confidence from the seismic and geological interpretation that there is a reasonable prospect of finding gas, we may choose to drill an exploration well,” says Shaun Gregory, senior vice-president of sustainability and technology for Woodside, Australia’s largest oil and gas company, based in Perth.
Drilling is still the only way to be certain that gas is present. It also produces samples that reveal the pore structure of the rock in the formation. These can all be combined, along with mapping data, to create a computer model of the potential reservoir to gain insight into where the “sweet spots” to drill production wells might be.
The explosion of computing capacity is making these models cheaper and more accurate, says Bill Barkhouse at the Society of Exploration Geophysicists (SEG) in Houston, Texas, which is pioneering these methods.
The goal is to combine data about the rock at many different scales, from the molecular dynamics of how gas adheres to a substrate, to how gas flows through different types of rock, all the way to the behaviour of an entire gas field. “We are able to look at every level,” says George Moridis, head of the Hydrocarbon Resources Program at the Lawrence Berkeley National Laboratory in California.
One such advanced model is currently being developed by the SEG with companies such as Chevron and Royal Dutch Shell. Called SEAM (SEG Advanced Modeling Program), the project has four phases, the first of which modelled a chunk of the sea floor 40 kilometres by 35 kilometres in the Gulf of Mexico to a depth of 15 kilometres, with a resolution of just 10 metres.
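Those numbers imply an enormous model: a 40 by 35 kilometre area modelled to 15 kilometres depth at 10-metre resolution works out to some 21 billion grid cells. A quick check of the arithmetic:

```python
# SEAM Phase I dimensions from the text, converted to metres.
length_m, width_m, depth_m = 40_000, 35_000, 15_000
resolution_m = 10

# Number of 10-metre cells along each axis, multiplied together.
cells = (length_m // resolution_m) * (width_m // resolution_m) * (depth_m // resolution_m)
print(f"{cells:,}")  # → 21,000,000,000
```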
The SEAM team has now turned its attention to land-based models of gas reservoirs that have been hydraulically fractured to work out how hydrocarbons flow through these rocks. The team also wants to predict the pressure at different points in a formation, an important factor in drilling operations. The ultimate goal is to simulate the entire life of a gas field as the hydrocarbons are extracted.
But despite the huge advances made in gathering data and analysing it, Barkhouse says these models are too expensive and time-consuming to be commercially useful at this point. Phase I of SEAM cost over $5 million and took 24 experts six years to create, finishing in July 2013. By contrast, a land-based exploratory well costs around $1 million and produces results on a timescale of weeks rather than years. Exploratory marine wells are much more expensive and can cost upwards of $100 million to drill. But again, they produce quicker results, a key factor in a fast-moving industry.
Nevertheless, engineers are increasingly turning to supercomputers to analyse the flood of data their surveys are generating. BP’s Center for High-Performance Computing in Houston houses a machine capable of more than 2 quadrillion calculations per second (2.2 petaflops) for analysing the data from WATS and other sources. Computer scientists there are developing machine-learning algorithms that comb the data for “promising anomalies”.
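The article doesn't describe BP's algorithms, but a common baseline for flagging anomalies in any data stream is a simple statistical outlier test: mark readings that sit more than a few standard deviations from the mean. A generic sketch of that idea (not BP's method, and the data is invented):

```python
import statistics

def flag_anomalies(readings, k=3.0):
    """Return indices of readings more than k standard deviations from
    the mean -- a generic outlier test, used here only to illustrate
    automated flagging of unusual values in a survey data stream."""
    mean = statistics.fmean(readings)
    sd = statistics.pstdev(readings)
    return [i for i, r in enumerate(readings) if abs(r - mean) > k * sd]

data = [1.0, 1.1, 0.9, 1.0, 9.0, 1.1, 1.0]
print(flag_anomalies(data, k=2.0))  # → [4]
```

Real exploration pipelines replace the threshold test with trained models, but the job is the same: sift enormous volumes of data for the handful of readings worth a geophysicist's attention.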
Gregory says the future of this technology is clear. Sensors will continue to get smaller, cheaper and less power-hungry, and they will feed ever growing amounts of data into increasingly powerful computers. He envisions networks of nanoscale sensors that can be injected into a well, giving a detailed picture of its structure and behaviour.
But sensing and computing will never entirely replace good old-fashioned exploratory drilling, like that at Tanumbirini. As Moridis says: “You cannot produce gas out of a computer.”
The topics in this series were developed by New Scientist in conjunction with APPEA.