Memory Driven Computing - the next big thing?

Wonder when we’ll see such a big jump in cores…

A prototype computer with 160TB of memory has been unveiled by Hewlett Packard Enterprise (HPE).

Designed to work on big data, it could analyse the equivalent of 160 million books at the same time, HPE said.
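
(A rough sanity check, not a figure from the article: 160 TB divided by 160 million books comes to about 1 MB per book, which is roughly the size of a long novel in plain text.)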

The device, called The Machine, had a Linux-based operating system and prioritised memory rather than processing power, the company said.

HPE said its Memory Driven Computing research project could eventually lead to a “near-limitless” memory pool.

“The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day,” said HPE boss Meg Whitman.

“To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era.”

Well, the need for dedicated “Big Data” tooling just dropped even more, that is all.

But it does not solve the memory-bandwidth or CPU-bound problems at all, nor the cache behaviour of CPU computations (see the sketch below). So it is quite good for us Elixir devs, and it is a big blow to Hadoop vendors, but outside of that, nothing to see here.
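
To illustrate the cache point, here is a minimal C sketch of my own (assumptions: a 256 MB buffer and 64-byte cache lines; nothing here comes from HPE’s design). It performs the same number of additions twice, once sequentially and once with a cache-line-sized stride, and the strided pass is typically several times slower because every access pulls in a whole cache line it barely uses. A bigger memory pool does not change that ratio.

```c
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N ((size_t)1 << 26)  /* 64M ints, ~256 MB: far larger than any cache */
#define STRIDE 16            /* 16 * 4 bytes = one 64-byte cache line per access */

static double elapsed(struct timespec a, struct timespec b) {
    return (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
}

int main(void) {
    int *data = malloc(N * sizeof *data);
    if (!data) return 1;
    for (size_t i = 0; i < N; i++) data[i] = 1;

    struct timespec t0, t1;
    long long sum = 0;

    /* Sequential pass: the hardware prefetcher keeps the CPU fed. */
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < N; i++) sum += data[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("sequential: sum=%lld  %.3fs\n", sum, elapsed(t0, t1));

    /* Strided passes: identical total work, but each access lands on a
       new cache line, so the loop is bound by memory latency/bandwidth. */
    sum = 0;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t s = 0; s < STRIDE; s++)
        for (size_t i = s; i < N; i += STRIDE) sum += data[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("strided:    sum=%lld  %.3fs\n", sum, elapsed(t0, t1));

    free(data);
    return 0;
}
```

Compile with `gcc -O2` and run it: the arithmetic is identical in both passes, yet the strided one is slower, and stacking on more RAM would not close the gap.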
