The Ripple Effect

The Software-Defined Future is Here


In January, I argued that 2018 is shaping up to be the year of the Software-Defined Server. I pointed to a number of reasons why:

  • Data is growing rapidly, putting pressure on IT infrastructures that simply aren’t built to keep up.
  • To act on all that data quickly, businesses need to analyze it entirely in memory, which is 1,000 times faster than flash storage.
  • Today’s on-premise and cloud data centers typically aren’t equipped with servers that can provide a single instance of memory large enough to accommodate many data sets.

So enterprises need results sooner, but most are not set up to get them.

Then there’s the matter of cost.

  • Scaling up is expensive, particularly because hardware manufacturers tie memory to core counts, which means you’ll spend more than necessary to buy the memory you need.
  • Scaling out is expensive in its own way as well, because rewriting code and sharding data costs you time—and now more than ever, the longer you hold onto data, the less value it has.
  • Software vendors also tie licenses to core counts, which means that, once again, you end up overpaying just to access enough memory for your data set.

Enterprises, then, also need to achieve these insights quickly while somehow driving down costs.

The Software-Defined answer

The solution to both of those challenges is the Software-Defined Server. A Software-Defined Server combines the resources of multiple commodity servers into one or more virtual servers. Applications and operating systems run unmodified, because the aggregated hardware appears to them as a single physical system. All the resources of those aggregated servers—including cores, memory, storage and network—are available to the application. To applications and workloads, a Software-Defined Server is virtually indistinguishable from a large physical machine.
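
To make the aggregation idea concrete, here is a minimal conceptual sketch in Python. It models only the bookkeeping of pooling cores and memory from several commodity nodes into one logical server; the Node and VirtualServer types and the node specs are hypothetical illustrations, not TidalScale's actual implementation or API.

```python
# Conceptual sketch: pooling the resources of commodity nodes into one
# logical server. All types and numbers here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Node:
    cores: int       # physical CPU cores in one commodity server
    memory_gb: int   # installed DRAM in GB

@dataclass
class VirtualServer:
    cores: int
    memory_gb: int

def aggregate(nodes: list[Node]) -> VirtualServer:
    """Present the combined resources of many nodes as one logical server."""
    return VirtualServer(
        cores=sum(n.cores for n in nodes),
        memory_gb=sum(n.memory_gb for n in nodes),
    )

# Four "sweet spot" commodity servers combine into one large logical system.
cluster = [Node(cores=24, memory_gb=768) for _ in range(4)]
big = aggregate(cluster)
print(f"Virtual server: {big.cores} cores, {big.memory_gb} GB memory")
# Virtual server: 96 cores, 3072 GB memory
```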

Read the cover story in IEEE Computer on how two tech legends planted the seeds of the Software-Defined Server back in 1984.

The benefits of software-defined resources are well known. For years, IT administrators have achieved flexibility and cost savings with software-defined storage and network platforms. But servers were left behind, unable to achieve the same kind of fluid, composable state because they remained fixed assets.

As the missing piece of the software-defined data center (SDDC), Software-Defined Servers change all that. They give data center administrators, for the first time, a way to marshal existing “sweet spot” servers, with their cost-effective but limited configurations, and use them as building blocks for Software-Defined Servers sized to fit virtually any workload or data set, with up to dozens of terabytes of memory and hundreds of cores.

Enterprises need results sooner, and Software-Defined Servers deliver. They return results from R analytics up to 300X faster than traditional systems, and increase the throughput of containerized apps by more than 20X. And you can deploy a new Software-Defined Server in just five minutes, so results don't have to wait for long, arduous setup.

And cost reduction? You can cut capital costs by as much as 50 percent and reduce software licensing costs by 40 percent. That's the kind of economic transformation that happens when you enable memory scaling on low core counts.
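
As a back-of-envelope illustration of the licensing point, consider the sketch below. Every figure in it (per-core license price, memory-per-core ratios, data set size) is a hypothetical assumption chosen only to show why decoupling memory capacity from core count shrinks per-core license bills; actual savings depend on your vendor's pricing and your configuration.

```python
# Hypothetical comparison of per-core licensing costs. All numbers are
# illustrative assumptions, not vendor quotes or TidalScale benchmarks.
LICENSE_COST_PER_CORE = 7_000   # assumed per-core software license fee (USD)
MEMORY_NEEDED_GB = 6_144        # in-memory data set: 6 TB

# Traditional scale-up box: memory is tied to high core counts,
# e.g. a platform that only reaches 6 TB at 32 GB per core.
scale_up_cores = MEMORY_NEEDED_GB // 32          # 192 cores
scale_up_license = scale_up_cores * LICENSE_COST_PER_CORE

# Aggregated commodity nodes chosen for a higher memory-to-core ratio,
# e.g. 64 GB per core, so fewer cores are licensed for the same memory.
aggregated_cores = MEMORY_NEEDED_GB // 64        # 96 cores
aggregated_license = aggregated_cores * LICENSE_COST_PER_CORE

savings = 1 - aggregated_license / scale_up_license
print(f"Scale-up licensing:   ${scale_up_license:,}")
print(f"Aggregated licensing: ${aggregated_license:,}")
print(f"License savings:      {savings:.0%}")   # 50% in this illustration
```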

Read more about it

This is the Software-Defined Server that now, finally, is coming to pass. In a new TidalScale white paper featuring research from Gartner, "Software-Defined Servers: A guide to adding the missing piece of the software defined data center," you'll learn how organizations are embracing the flexibility and cost savings of SDDCs, and exactly where Software-Defined Servers fit into that picture. You'll also receive a Planning Guide to adding Software-Defined Servers to your own SDDC.

Review the paper, read the research, and let us know what you think. After all, it's your future at stake, and we want it to be a successful one.

Download the paper 

Topics: TidalScale, software-defined server, software-defined data center, composable infrastructure, right-sizing, Memory, Gordon Bell, IEEE Computer
