Descartes Labs Reaches 41st Place in TOP500 with Cloud-Based Supercomputing Demonstration Powered by AWS, Signaling New Era for Large-Scale Geospatial Data Analysis


SANTA FE, New Mexico, June 28, 2021 /PRNewswire/ -- Descartes Labs, Inc. today announced a new cloud supercomputing achievement: a TOP500 run on AWS of 9.95 petaflops using virtualized Amazon Elastic Compute Cloud (Amazon EC2) instances. The TOP500 organization uses the LINPACK benchmark, a test that consists of solving a dense system of linear equations, to rank the 500 most powerful commercially available computer systems. Descartes Labs improved on its previous cloud record of 1.926 petaflops, set in its 2019 TOP500 submission (ranked #136) — an HPL performance improvement of 417% over two years, and roughly a factor of 10 over the performance gains achieved by traditional, on-premises supercomputers over the same two-year period. For 2021, the overall calculation took about nine hours and performed a total of 3 × 10²⁰ floating-point operations on 172,692 Intel Xeon Scalable Processor cores.
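As a quick sanity check (ours, not part of the announcement), the reported figures are mutually consistent: sustaining 9.95 petaflops for about nine hours yields roughly 3 × 10²⁰ floating-point operations.

```python
# Sanity check: sustained rate x runtime should match the reported FLOP count.
# All figures are taken from the announcement; the runtime is approximate.
rmax_flops_per_s = 9.95e15       # 9.95 petaflops (FLOP/s)
runtime_s = 9 * 3600             # "about nine hours", in seconds
total_flops = rmax_flops_per_s * runtime_s
print(f"{total_flops:.2e}")      # ~3.2e20, consistent with the quoted 3 x 10^20
```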

The global volume of sensor data processed by Descartes Labs requires state-of-the-art cloud technology capable of running high-performance computing (HPC) and machine learning (ML) workflows at scale. This enables Descartes Labs’ engineering teams to compute, analyze, and store data in near real time to deliver critical business analysis on a global scale to clients in the agriculture, mining, defense and intelligence, and consumer packaged goods (CPG) industries. This capability is becoming increasingly important for applications that rely on a combination of global geospatial analyses, such as environmental monitoring, severe weather tracking and forecasting, multimodal data fusion, and climate change analysis. One example of this mission is the six petabytes of Sentinel-1 radar data recently processed by Descartes Labs to help track deforestation at a continental scale, understand where the risk of forest fires is highest, and monitor regenerative agriculture practices.

“Having the vast compute capacity that AWS offers, connected to huge amounts of storage accessible at very high speeds, allows you to do things you simply couldn’t do before in terms of data analysis and understanding of the reality of our planet, or the universe as a whole,” said Mike Warren, co-founder and CTO of Descartes Labs. “Our use of the cloud today spans the range of analyses of petabyte-scale Earth observation datasets, in particular going beyond simple optical imagery to more esoteric Earth observation data types such as radar, InSAR, and AIS data.”

Working with the AWS HPC team, Descartes Labs was able to provision the resources needed for its TOP500 run using general-purpose, publicly accessible Amazon EC2 instances in the C5/M5/R5 families. Thanks to the scalability and elasticity of the AWS cloud, Descartes Labs was able to stand up the infrastructure for this test without affecting its production workflows. Additionally, because of the elasticity of Amazon EC2’s compute resources, Descartes Labs was also able to test different cluster sizes with multiple HPL runs ranging from 1 petaflop to 8.5 petaflops, leading to the final result of 9.95 petaflops.

“The fact that Descartes Labs was able to use Amazon EC2 to build a 9.95-petaflop supercomputer ranked #41 on the TOP500 list is a truly impressive achievement,” said Dave Brown, Vice President, Amazon EC2, AWS. “AWS is proud to support Descartes Labs in its mission to develop geospatial information on a planetary scale for its customers. We expect the demand for instant data and information to increase in the future, and this is a powerful demonstration of the value AWS can bring to customers and partners at any scale.”

With petabytes of Earth observation and ground sensor data moving at the speed of collection, it is now the data that drives computational needs, not the other way around. The architecture of a traditional supercomputer is often limited in terms of cluster size and input/output (I/O). Having flexible compute capacity, connected to huge amounts of cloud storage, all accessible at very high speeds, allows Descartes Labs to accomplish science that was not possible before.

The LINPACK benchmark used in this compute-intensive demonstration is designed to estimate how fast a computer will run when solving real “tightly coupled” problems, such as weather forecasting. By building systems capable of running the High-Performance LINPACK (HPL) benchmark, Descartes Labs can use the same systems to model the weather quickly, accurately, and at multiple locations simultaneously to improve crop yield predictions.
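For readers unfamiliar with HPL, its core is timing the solution of a large dense linear system and dividing the standard operation count (2/3·n³ + 2·n²) by the elapsed time. Here is a minimal single-machine sketch of that idea using NumPy; the function name, problem size, and residual threshold are illustrative choices of ours, not part of the official benchmark:

```python
import time
import numpy as np

def hpl_style_rate(n: int, seed: int = 0) -> float:
    """Solve a random dense n x n system and return achieved GFLOP/s,
    using the standard HPL operation count of 2/3*n^3 + 2*n^2."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    t0 = time.perf_counter()
    x = np.linalg.solve(A, b)        # LU factorization + triangular solves
    elapsed = time.perf_counter() - t0
    # HPL only scores a run if the solution passes an accuracy check;
    # this scaled residual is a simplified stand-in for that test.
    residual = np.linalg.norm(A @ x - b) / (np.linalg.norm(A) * np.linalg.norm(x))
    assert residual < 1e-10, "solution failed the accuracy check"
    flops = (2 / 3) * n**3 + 2 * n**2
    return flops / elapsed / 1e9

print(f"{hpl_style_rate(2000):.1f} GFLOP/s")
```

Real HPL distributes the factorization across many nodes over MPI; this sketch only illustrates the operation count and the accuracy check on one machine.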

This cutting-edge science has been instrumental in Descartes Labs’ collaboration with clients like CJ OliveNetworks, an organization that recently migrated all computer systems from CJ Group companies to AWS. The company used Descartes Labs technology to complete a purchasing-innovation project to predict sugar input prices for CJ CheilJedang, which imports 600,000 tonnes of raw sugar per year. CJ was able to transform its business through this project using satellite imagery technology and analysis of climate and vegetation data.

According to CJ OliveNetworks CEO and Group Chief Digital Officer In-Hyok Cha, “I would like to congratulate both Descartes Labs and AWS for performing such a large-scale calculation to support the infrastructure work behind planetary sustainability initiatives, and I hope to continue collaborating with both companies to develop more innovative use cases and commercial applications for Earth observation.”

The transition from on-premises computing to cloud computing reflects the shift from simply analyzing an organization’s internal data to incorporating large volumes of geospatial and sensor data from the world beyond. Planetary-scale data requires planetary-scale computation. As a result, the size of this data “outside the firewall” requires the type of cloud infrastructure that Descartes Labs carefully developed and used for this latest TOP500 run. The company looks forward to helping others do the same.

“The cloud is where all of the latest technology first appears. You see all of the latest processors from Intel and AMD in the cloud before you get access to them in your on-premises supercomputers,” said Warren. “Investing in networking technology to support these hyperscale applications in the cloud is also proving very effective for traditional tightly coupled supercomputing applications. Now the big push in HPC is exascale. But at the same time, machine learning and deep neural networks are making huge strides on other kinds of problems. I think these worlds may converge, or they may drift further apart. And that’s one of the big questions for the next decade.”

Learn more about our TOP500 2019 achievement.

About Descartes Labs

Descartes Labs is a geospatial intelligence company that performs scientific analyses of geospatial, remote sensing, and various complementary datasets to enable sustainable-sourcing best practices, commodity price prediction, and efficient mineral exploration for leading CPG, agriculture, and mining companies. Our SaaS platform automates geospatial image analysis for our users, enabling planetary-scale analysis through artificial intelligence and machine learning. The company also supports a diverse set of federal government efforts to organize, analyze, and deliver unique, actionable information from geospatial data. For more information, follow us on Twitter or LinkedIn.

Press Contact
Alex Diamond
Descartes Labs
[email protected]

Related images

Descartes Labs

TOP500 Descartes Labs 2021 certificate

SOURCE Descartes Labs
