
Energy Companies Lead in Commercial Supercomputer Adoption

Former Editor, EnergyBiz

Salvatore Salamone is a physicist by training who has been writing about science and information technology for 25 years. Sal is the author of three business technology books and a frequent...

Mar 13, 2014

Energy exploration and production depends heavily on the rapid analysis of increasingly large and complex seismic imaging datasets. At the same time, the modeling, reserve-management, and analytics algorithms being run on that data are becoming more granular and sophisticated, requiring enormous processing power.

The field has always been a big user of high-performance computing (HPC). But the growing need for faster and more precise results is driving some companies to push the envelope in raw computing power.

To that point, within the last year three energy companies have started using supercomputing systems of a class normally found only in government labs or major academic research facilities.

Total E&P (headquartered in France) installed a system that ranked fourteenth on the Top500 list of the world's most powerful supercomputers. The Top500 list is published twice a year and ranks computers by their results on a standard performance test, the LINPACK benchmark.*
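
For readers curious what LINPACK actually measures: it times the solution of a large dense linear system Ax = b and converts the standard flop count for an LU-based solve, (2/3)n³ + 2n², into a floating-point rate. The sketch below illustrates the idea in Python/NumPy on a single machine; it is a toy, not the actual HPL code used for Top500 submissions, and the problem size n is an arbitrary assumption chosen to run quickly on ordinary hardware.

```python
# Toy illustration of the LINPACK idea: time a dense solve of Ax = b
# and report a floating-point rate. NOT the real HPL benchmark; the
# flop-count formula below is the standard HPL convention for an
# LU-based solve, but n = 4000 is an arbitrary assumption.
import time
import numpy as np

n = 4000
rng = np.random.default_rng(seed=0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)        # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
print(f"{flops / elapsed / 1e9:.1f} GFLOPS for an n = {n} dense solve")
```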

The Total E&P system is the only commercial system in the list's top 25 and one of only four commercial systems in the top 50. It was benchmarked at 2.3 petaflops (a petaflop is a measure of a computer's processing speed: one quadrillion floating-point operations per second).
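
To put 2.3 petaflops in perspective, here is a back-of-the-envelope comparison. The desktop rate (~100 gigaflops) and the size of the hypothetical seismic job are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope scale comparison. The desktop rate and the
# workload size are assumptions for illustration only.
PETAFLOPS_SYSTEM = 2.3e15   # 2.3 petaflops, in floating-point operations/second
DESKTOP = 100e9             # ~100 gigaflops, an assumed desktop-class rate

WORKLOAD = 1e18             # a hypothetical job needing 10^18 operations

print(f"2.3 PF system: {WORKLOAD / PETAFLOPS_SYSTEM / 60:.1f} minutes")
print(f"desktop:       {WORKLOAD / DESKTOP / 86400:.0f} days")
```

On those assumptions, a job that would occupy a desktop for roughly four months finishes in minutes, which is the practical meaning of the petaflop figures quoted here.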

This computing power will be used to help reduce the uncertainty of oil and gas exploration and production. It will also allow Total E&P to take more physical phenomena into account in its simulations.

The second system of note is one installed by BP at its Center for High-Performance Computing in Houston. The 2.2-petaflop system will help BP render precise images of the subsurface, which in turn will boost the company's ability to find and develop new energy resources. Its high-speed processing will reduce the time it takes to analyze massive quantities of seismic data and will enable more detailed in-house modeling of rock formations before drilling begins.

In late November, after the most recent Top500 list was published, Italian energy company Eni announced it had installed a new supercomputer that delivers in excess of 3 petaflops of processing power. The new supercomputer will support the firm's seismic imaging and hydrocarbon exploration activities. 

Certainly, other industries such as automotive and aeronautics rely on HPC. But in the commercial world, the use of supercomputers as powerful as the ones installed at Total, BP, and Eni is unprecedented.

Simply put, Total, BP, and Eni are running three of the largest private supercomputers in the world. A few other energy companies have announced plans to deploy similarly powerful systems in the coming years.

The need for such processing power highlights how energy exploration and production is becoming an ever more data-driven operation, one that relies on rapid analysis of large datasets.

* One note about the Top500 list: Entry on the list is voluntary. And some companies intentionally do not participate to keep information about their computing capacity hidden from competitors. As a result, there might be other commercial systems that are more powerful than those included on the list.

Discussions
Salvatore Salamone on Mar 24, 2014
Hi Bill,

Thank you for your comments. I think we are in a very interesting time where high performance computing capabilities are available to many more people and organizations than ever before.

--sal salamone


Mark Lopez on Apr 16, 2020

I'm not surprised. A lot of data goes into managing electricity and its flow. As the world becomes more and more connected, these demands will only increase. We're working with a solar installation company to help figure out the various APIs needed to connect different data points.

Ironically enough, these supercomputers will also need a lot of cooling power. These energy companies may need their own help with their energy management needs.

