Amazing clarity.

Sverre Brandsberg-Dahl and the team at PGS are generating high-resolution 3-D seismic images that more accurately locate oil and gas resources.


The oil and gas industry is in a fierce efficiency race. With energy consumption on pace to double by 2040, demand for oil and gas is climbing. At the same time, resources are getting harder to find — and the industry as a whole is experiencing cutbacks.

This economic reality means exploration and production (E&P) companies can’t afford to guess at where oil and gas resources reside. They have to know. To do that, they need the highest-quality images of the subsurface in the shortest time possible.

For PGS, a marine seismic company that creates the high-resolution 3-D images E&P companies use to decide whether or not to invest in an area, that means their images must be increasingly accurate and clear. And they must remain competitively priced.

Solution

Cray® XC™ series supercomputer
Cray® Sonexion® storage system

System Details

24 cabinets, all CPU
600 TB of memory
Aries™ interconnect

Working faster, better, and smarter is the name of the game. That’s because data volumes, fidelity and algorithmic complexity are all on the rise, and processing that data into an image requires immense computing power.

“There’s more pressure on us to produce [images] in a cost-effective way,” says PGS’s chief geophysicist Sverre Brandsberg-Dahl. “We needed to do something radically different.”

That radically different “something” was a complete change of their computing infrastructure. In 2014, PGS faced their most complex imaging challenge ever: a survey of a critically important, resource-rich, but extraordinarily complex area of the Gulf of Mexico. The “Triton” survey covered 10,000 square miles, took two years to plan and almost a year to conduct. At the end, PGS had not only their most complex seismic survey ever to process, but also their largest.

While the survey produced massive amounts of data and the potential for the clearest subsurface images of the region yet, processing that data required more compute power than the clusters that PGS and companies like them typically use could provide.

Without a change, the wealth of information in the Triton dataset would have gone to waste; their cluster system was simply not capable of processing it. “It’s a big data mining effort to go in and find the energy that is good and discard what is not,” says Brandsberg-Dahl.
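The article doesn’t describe PGS’s actual processing algorithms, but the signal-versus-noise separation Brandsberg-Dahl alludes to can be illustrated at toy scale. The Python sketch below is purely hypothetical: it builds one synthetic seismic trace (a Ricker reflection wavelet plus broadband noise) and applies a band-pass filter to keep the “good” energy. Every parameter here is invented for illustration; production seismic imaging runs far more sophisticated algorithms over petabyte-scale datasets.

```python
# Minimal, illustrative sketch of signal/noise separation on one
# synthetic seismic trace. Not PGS's workflow; the sample rate,
# wavelet frequency and filter band are all hypothetical.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 500.0                       # sampling rate, Hz (hypothetical)
t = np.arange(0, 4.0, 1.0 / fs)  # 4-second trace

# Synthetic "reflection": a 30 Hz Ricker wavelet arriving at t = 1.5 s
f0, t0 = 30.0, 1.5
tau = t - t0
signal = (1 - 2 * (np.pi * f0 * tau) ** 2) * np.exp(-(np.pi * f0 * tau) ** 2)

# Add broadband noise to stand in for the unwanted energy
rng = np.random.default_rng(0)
trace = signal + 0.5 * rng.standard_normal(t.size)

# Band-pass 10-60 Hz: keep the band where the reflection energy lives
b, a = butter(4, [10.0, 60.0], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, trace)  # zero-phase filtering

# Crude quality check: signal-to-noise ratio before and after filtering
def snr_db(est, ref):
    return 10 * np.log10(np.sum(ref ** 2) / np.sum((est - ref) ** 2))

print(f"SNR before: {snr_db(trace, signal):5.1f} dB")
print(f"SNR after:  {snr_db(filtered, signal):5.1f} dB")
```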

This reality prompted the company to make the switch from clusters to a Cray supercomputer. Supercomputing technology gave PGS both the immediate ability to process the Triton survey in the shortest time possible and the capacity to keep up as survey demands continue to intensify.

“This is why we need supercomputers,” says Brandsberg-Dahl. “[These] are massive imaging challenges by all standards and we have the ability to do this now having moved to this distributed computing system. We delivered this survey on time, on budget and with much improved geophysical quality.”