
The TidalScale Blog

    Why Wait for HPE’s The Machine?

    Today, Hewlett Packard Enterprise (HPE) unveiled a prototype of a massive server designed around memory – 160TB of it in fact. In announcing this concept system, which is the latest project in HPE’s research effort known as The Machine, HPE chief Meg Whitman reasoned, “We need a computer built for the Big Data era.”

    We couldn’t agree more. Big data, shifting workloads, unexpected or seasonal spikes in business activity – all of these can overwhelm even the largest servers in a data center and foil IT’s attempts to predict accurately how much server capacity to buy for a given project. The challenge of finding computing systems that can handle workloads of any size is real and present, and it will only grow more pressing.

    The good news for users excited about HPE’s futuristic prototype – the production version is still several years off – is that they can create the server of the future today. They can create it on the fly from commodity servers they may already have in their data center. And they can, should they want or need to, provision a system in minutes whose memory eclipses even HPE’s 160TB target.

    We know this because at TidalScale we’ve spent the past four years building the solution to the problem HPE hopes someday to address with The Machine. Our solution, already in use by financial institutions, automakers and large manufacturers, is called a Software-Defined Server. Available on-premises or in the cloud, TidalScale’s Software-Defined Server lets organizations right-size one or more servers by combining the resources of one, two, or even dozens of industry-standard systems. With a Software-Defined Server, all the combined memory, cores and I/O are available to your problem or application, which sees these aggregated resources as a single system. Or, to borrow a phrase, one big Machine.
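    The key point above is transparency: the application queries the operating system as usual and simply sees one large pool of cores and memory. A minimal sketch of what that looks like from the software side, assuming a POSIX system such as Linux (the function name and the printed values are illustrative, not part of TidalScale’s product):

    ```python
    import os

    def total_resources():
        """Return (logical_cores, total_ram_bytes) exactly as the OS reports them.

        On a Software-Defined Server, these standard calls would reflect the
        aggregated resources of all underlying nodes, because the aggregation
        happens below the operating system; the code needs no changes.
        """
        cores = os.cpu_count()
        # SC_PHYS_PAGES * SC_PAGE_SIZE yields total physical memory on POSIX.
        ram = os.sysconf("SC_PHYS_PAGES") * os.sysconf("SC_PAGE_SIZE")
        return cores, ram

    cores, ram = total_resources()
    print(f"Logical cores: {cores}")
    print(f"Total RAM: {ram / 2**30:.1f} GiB")
    ```

    An unmodified script like this reports whatever single system it believes it is running on – which is precisely why aggregated memory and cores become usable without rewriting the application.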

    This may be why eWeek reported TidalScale “may well have come up with the biggest advance in servers since VMware's virtualization of the Intel IA-32 platform 18 years ago.” Or why we were recently named an IDC Innovator in a report on software-defined solutions in the data center. Or why we have attracted top investors including Bain Capital, HWVP, SAP Sapphire Ventures, Samsung and Infosys.

    As much as we agree that HPE has correctly identified a real market need, and as enthusiastic as we are about their efforts to help address it, we’re even more excited to tell organizations they don’t have to wait for a future where they can provision the right-sized system for even their most demanding workloads.

    It’s already here, waiting for them, in their own data centers or in the cloud.


    Topics: TidalScale, software-defined server, in-memory performance, infrastructure