
HPE unveils 160TB memory-driven computing ‘Machine’ built for big data

17 May 2017

Hewlett Packard Enterprise has unveiled the world’s largest single-memory computer – all 160 terabytes of it.

The prototype, the product of what HPE says is the largest R&D programme in the vendor’s history, is designed to deliver an architecture custom-built for the big data era, based on what HPE calls memory-driven computing.

Memory-driven computing puts memory, rather than the processor, at the centre of the computing architecture. HPE claims this can dramatically reduce the time needed to process complex problems, delivering real-time intelligence.

Meg Whitman, Hewlett Packard Enterprise chief executive, says, “The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day.

“To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era.”

Packing 160TB of memory spread across 40 physical nodes interconnected by a high-performance fabric protocol, The Machine – as HPE has dubbed the prototype – is capable of simultaneously working with the data of approximately 160 million books – or, in HPE’s (and United States) terms, the data held in every book in the Library of Congress five times over.
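As a rough sanity check on those figures (the per-book size below is an assumption, not a number HPE has given), the arithmetic works out to about 4TB of memory per node and roughly 1MB per book – a plausible size for a plain-text volume:

# Back-of-envelope check of the quoted figures; per-book size is inferred, not HPE's.
total_memory_tb = 160            # total shared memory in the prototype
nodes = 40                       # physical nodes on the fabric
books = 160_000_000              # "approximately 160 million books"

per_node_tb = total_memory_tb / nodes              # 4.0 TB per node
bytes_per_book = total_memory_tb * 1e12 / books    # ~1,000,000 bytes, i.e. ~1 MB per book

print(f"{per_node_tb} TB per node, ~{bytes_per_book / 1e6:.0f} MB per book")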

The company says The Machine offers a glimpse of the ‘immense potential’ of memory-driven computing, with HPE saying the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes, or 250,000 times the entire digital universe today.
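A similarly hedged check of the scale claim – using the commonly cited estimate of roughly 16 zettabytes for the 2017 ‘digital universe’, which is an assumption on our part rather than a figure from HPE – puts 4,096 yottabytes at about 256,000 times that size, in line with the ‘250,000 times’ quoted:

# Hedged check of the scale claim; the 16 ZB "digital universe" estimate is an assumption.
yottabyte = 1e24                        # bytes in one yottabyte
digital_universe_2017 = 16e21           # ~16 zettabytes, a widely cited 2017 estimate

pool = 4096 * yottabyte                 # HPE's "nearly limitless" memory pool
ratio = pool / digital_universe_2017    # ~256,000, i.e. roughly 250,000x

print(f"{ratio:,.0f} times the estimated digital universe")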

Mark Potter, HPE CTO and director of Hewlett Packard Labs, says the architecture can be applied ‘to every computing category, from intelligent edge devices to supercomputers’.

“We believe memory-driven computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” Potter says.
