The Ripple Effect

Why Two Tech Legends Changed Their Minds About The Future of Computers

Large memory, in-memory, software-defined server, in-memory performance, in-memory computing, HANA, Gordon Bell, IEEE Computer

In 1984, Ike Nassi, now an accomplished technologist and entrepreneur, was vice president of research at Encore Computer. He and his colleagues, along with Encore co-founder Gordon Bell, the legendary engineering vice president at Digital Equipment Corp. and originator of Bell’s Law of computer classes, submitted a proposal to DARPA. They hoped the defense-focused research agency would fund the development of a distributed approach to strongly coherent shared memory. The work was founded on the notion that applications are more easily written, and deliver results sooner, when the data is entirely resident in memory.
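That premise is easy to see in code. The snippet below is only an illustration (the prices dict, the file name prices.csv, and both lookup functions are hypothetical, not from the proposal): when the table is entirely resident in memory, the query is a one-line lookup, while the on-disk version drags file handling and parsing into the application.

```python
# Hypothetical illustration: when the whole table is resident in memory,
# the query is a one-line lookup with no I/O, caching, or partitioning logic.
prices = {"AAPL": 189.95, "MSFT": 415.10, "GOOG": 172.63}  # made-up data

def lookup_in_memory(symbol):
    return prices[symbol]

# The same query against data on disk pulls file handling and parsing into
# the application, complexity unrelated to the question being asked.
import csv

def lookup_on_disk(symbol, path="prices.csv"):  # "prices.csv" is hypothetical
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if row[0] == symbol:
                return float(row[1])
    raise KeyError(symbol)
```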

Read More

How WaveRunner Puts You in Control of a Better SDDC

Large memory, virtualization, in-memory computing, software-defined data center, Amazon EC2, Mellanox, Cumulus, Ubiquity, AWS, Juniper

TidalScale’s WaveRunner – the point-and-click control panel that makes creating a right-sized Software-Defined Server fast, flexible and easy – isn’t just about creating one or more virtual servers from multiple commodity systems. It also puts you in control of all the software-defined pieces in the data center. So in addition to cores and memory, WaveRunner allows you to monitor and manage storage and networks. You simply pick the software-defined resources you need and plug them together.

Read More

Predicting Yesterday’s Weather

Large memory, software-defined server, in-memory performance

While it’s true that we can never predict tomorrow’s weather with 100 percent reliability (at least not yet), we can predict yesterday’s weather with 100 percent certainty.

What does this have to do with anything?

Well, it turns out that meteorologists aren’t the only people who use historical data in an attempt to predict reasonable futures. 
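As a rough sketch of that idea (the numbers below are invented), a forecaster can use a historical sample to bound what a plausible future value looks like:

```python
# Minimal sketch with invented numbers: use a historical sample to bound
# a "reasonable" future value, the way a forecaster might.
import statistics

history = [38.2, 41.5, 39.9, 44.1, 40.7, 42.3]   # past observations (hypothetical)

mean = statistics.mean(history)
stdev = statistics.stdev(history)

# Tomorrow is unknown, but yesterday's data suggests a plausible range.
low, high = mean - 2 * stdev, mean + 2 * stdev
print(f"expect roughly {low:.1f} to {high:.1f}")
```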

Read More

System Abstractions - Déjà Vu all over again...

abstraction, Large memory, TidalScale, virtualization, virtual memory

I’ve known Ike Nassi since we both worked at Digital Equipment back in the good old days, and I’ve always enjoyed talking to Ike about computer architecture. In some ways, what Ike’s doing at TidalScale feels like déjà vu with what DEC did in that timeframe when it introduced its first minicomputer with real virtual memory – the VAX-11/780. Virtual memory is an abstraction – let’s pretend we have a lot of physical memory even though we don’t; TidalScale is an abstraction – let’s pretend we have a big, powerful computer, even though we don’t. In both cases the idea might seem highly questionable to someone whose job is to squeeze the last iota of performance out of a computer.
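For readers who have not bumped into it, the virtual-memory half of that analogy can be demonstrated in a few lines. The sketch below (sizes are illustrative, and an allocation this large may be refused depending on the operating system’s overcommit policy) reserves far more address space than the machine has RAM; the OS backs pages with physical memory only when they are touched.

```python
# Sketch of the virtual-memory abstraction: reserve more address space than
# physical RAM; pages become "real" only when touched. The size is illustrative,
# and a mapping this large may be refused under strict overcommit settings.
import mmap

SIZE = 64 * 1024 ** 3          # 64 GiB of virtual address space

region = mmap.mmap(-1, SIZE)   # anonymous mapping; almost no physical memory yet
region[0] = 1                  # touching a page forces the OS to back it
region[SIZE - 1] = 1           # same for a page at the far end
region.close()
```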

Read More

Application Programming When Memory Is No Longer A Constraint

Large memory, big data, application programming

We recently came across a problem that illustrates how software might be reconsidered in this new, software-defined-server environment.

Customer Problem Statement:

  • Consider two tables of historical data for each of, for example, 3,000 securities. One table is called “Left” and the other “Right”.
  • Each table for each security has a column of timestamps, a column containing the name of the security it represents (e.g. “AAPL”), and additional data columns. The Left table might have, for example, 150 additional columns of data, and the Right table might have, for example, 100 columns of additional data. (A sketch of this layout in code appears below.)
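One way to picture the problem is to build the Left and Right tables for a single security entirely in memory. The sketch below uses pandas, but the post does not prescribe a particular library, and the column names, row counts, and values are purely illustrative. Multiplied across roughly 3,000 securities, the full working set quickly argues for a single system with a very large memory footprint.

```python
# Illustrative layout of one security's "Left" and "Right" tables in pandas.
# Column names, row counts, and values are hypothetical.
import pandas as pd

def make_table(symbol, n_rows, n_extra_cols, prefix):
    data = {
        "timestamp": pd.date_range("2017-01-03 09:30", periods=n_rows, freq="1min"),
        "security": symbol,
    }
    for i in range(n_extra_cols):          # 150 extra columns for Left, 100 for Right
        data[f"{prefix}_{i}"] = 0.0
    return pd.DataFrame(data)

left = make_table("AAPL", n_rows=10_000, n_extra_cols=150, prefix="l")
right = make_table("AAPL", n_rows=10_000, n_extra_cols=100, prefix="r")
```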
Read More

Focus on Possibilities, Not Limits

Large memory, Amdahl, Gustafson, Multiprocessor, TidalScale

Computer Science is obsessed with “negative results” and “limits.” We seem to delight in pinpointing a terminus for virtually any technology or architecture – to map the place where the party ends.

Take, for example, Amdahl’s Law, which seems to suggest that once you reach a certain point, parallelism doesn’t help performance. Amdahl’s Law has kept many from believing that a market exists for bigger single systems, since the law leads us to conclude that larger multicore systems won’t solve today’s problems any faster.
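For reference, Amdahl’s Law says that if a fraction p of a job can be parallelized across n processors, the overall speedup is 1 / ((1 - p) + p / n), which can never exceed 1 / (1 - p) no matter how large n grows. The short sketch below simply evaluates that formula:

```python
# Amdahl's Law: speedup with n processors when a fraction p of the work
# is parallelizable. The ceiling is 1 / (1 - p) regardless of n.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for n in (2, 8, 64, 1024):
    print(f"{n:>5} cores -> {amdahl_speedup(0.95, n):5.2f}x")
# Even with 95% of the work parallel, the speedup flattens out near 20x,
# which is the kind of "limit" this post pushes back against.
```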

Beware the Intuitively Obvious

The reasoning behind Amdahl’s Law turns on an assumption that all the parts of the problem must interact in such a way that

Read More
TidalScale featured in eWEEK!
Gary Smerdon in theCUBE Interview