The Ripple Effect

Learn About Right-Sizing in 5 Easy Steps

software-defined server, software-defined data center, composable infrastructure, right-sizing

I’ll be the first to acknowledge that there’s a lot to the TidalScale story. Our Software-Defined Servers enable organizations to right-size servers on the fly to fit any data set. The process of creating one is fast, flexible and easy. With TidalScale, you can:

Read More

5 Ways Data Centers are Grossly Inefficient

IDC, data center, software-defined data center, composable infrastructure

Earlier this year, IDC surveyed 301 IT users from medium-sized and large enterprises, asking them questions that allowed the research firm to determine the relative efficiency of those data centers. (For reference, the average data center contained 386 blades and servers, while the largest third of those surveyed averaged 711 blades and servers.)

Read More

Part 2: Composable Infrastructure for the Modern Data Center

data center, software-defined data center, composable infrastructure

The Role of Software-Defined Resources

Last week, I explored some of the key issues and core benefits that are prompting enterprises to move to more flexible and cost-effective composable infrastructures.  As I pointed out in Part 1 of this blog, composable infrastructure technologies from vendors like TidalScale are designed to address many of the most pressing issues in today’s data centers, 

Read More

Part 1: Composable Infrastructure for the Modern Data Center

software-defined server, Cloud Computing, software-defined data center, composable infrastructure

Part 1: The Need for Composable Infrastructure

Businesses need new approaches to infrastructure design to keep up with the volume of data they generate, data whose timely analysis is essential to remaining competitive in the digital economy. Newer approaches to infrastructure must focus on efficiency to

Read More

Why More Data Centers Will Be Software Defined

software-defined server, data center, in-memory computing, software-defined data center, sddc

A new report projects that the global Software-Defined Data Center (SDDC) market will grow at a 22 percent compound annual growth rate through 2021. The authors, an India-based outfit called Wise Guy Consultants, estimate that the total market size for SDDC goods and services will reach $81.4 billion over that period.

Read More
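For readers curious how such projections compound, here is a minimal sketch of the CAGR arithmetic behind headline numbers like these. The starting market size used below is purely hypothetical (the report's base-year figure isn't quoted here); the point is just that a 22 percent CAGR more than doubles a market in four years.

```python
def cagr_project(present_value: float, rate: float, years: int) -> float:
    """Project a future value forward at a compound annual growth rate (CAGR)."""
    return present_value * (1.0 + rate) ** years

# Hypothetical illustration: a market growing at 22% CAGR for 4 years
# ends up roughly 2.2x its starting size (1.22 ** 4 ~= 2.215).
start = 100.0  # hypothetical base, in billions of dollars
print(round(cagr_project(start, 0.22, 4), 1))  # prints 221.5
```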

How to Get the Most from Your Cloud

infrastructure, cloud, Cloud Computing, cloud optimization


Read More

The Cloud as Rainmaker

TidalScale, software-defined server, Gartner Cool Vendor, Cloud Computing

In case you missed the news the other day, the analysts at Gartner have just named TidalScale a Cool Vendor in its 2017 Cloud Computing Report[i].  This latest report from one of the world’s most respected research outfits affirms once more that the time is right for TidalScale’s Software-Defined Servers, which bring flexibility to modern data centers by enabling organizations to right-size servers on the fly.  (I say “affirms once more” because TidalScale was named an IDC Innovator just a few weeks ago.) 

Read More

Why Wait for HPE’s The Machine?

TidalScale, software-defined server, in-memory performance, infrastructure

Today, Hewlett Packard Enterprise (HPE) unveiled a prototype of a massive server designed around memory – 160TB of it in fact. In announcing this concept system, which is the latest project in HPE’s research effort known as The Machine, HPE chief Meg Whitman reasoned, “We need a computer built for the Big Data era.”

Read More

3 Secrets to Right-Sizing a Server

software-defined server, in-memory performance

I’ve grown accustomed to the stares of disbelief. It usually starts like the conversation I had the other day with some folks from a leading North American insurance company. They were planning to roll out an advanced new analytic model. Trouble was, they had no way to predict how much compute or memory capacity they’d need.

Read More

Predicting Yesterday’s Weather

Large memory, software-defined server, in-memory performance

While it’s true that we can never predict tomorrow’s weather with 100 percent reliability (at least not yet), at the same time it’s true that we can predict yesterday’s weather with 100 percent certainty.

What does this have to do with anything?

Well, it turns out that meteorologists aren’t the only people who use historical data in an attempt to predict reasonable futures. 

Read More

Why Not Just Build a Bigger Box?

software-defined server

Dr. Ike Nassi founded TidalScale on the premise of aggregating the resources available in one to many commodity servers so they can handle huge database, graph, simulation and analytics computations entirely in memory. 

Read More

The Trouble with Hadoop

TidalScale, software-defined server, hadoop

Whenever IT folks talk about handling their big data problems by scaling out with Hadoop, I tend to think about the 1986 comedy, “Big Trouble in Little China.” It chronicles the mishaps that ensue when a trucker gets dragged into a mystical battle in Chinatown. It’s kind of awful, but with John Carpenter in the chair and Kurt Russell on the screen it still delivers some laughs.

Read More

Why You Need a BFC (Part 2)

big data, software-defined server, in-memory performance, infrastructure

Last week, I looked at some of the compelling reasons for transforming a set of commodity servers into a big flexible computer, or BFC. At TidalScale, we call this a Software-Defined Server -- a single virtual machine operating across multiple nodes that makes all of the aggregated resources available to the application. But for today's blog, it's BFC all the way.

Read More

Why You Need a BFC (Part 1)

TidalScale, virtualization, in-memory performance, data center

If you’re familiar at all with TidalScale, then you know we believe people should fit the computer to the problem, rather than the other way around.  We believe in new technologies that can be adopted easily, in leveraging advances in cost-effective hardware, and in automation. We believe you shouldn’t have to invest in new hardware to solve large or difficult computational problems. We believe commodity, industry-standard technologies hold remarkable power and possibilities that are just waiting to be tapped.

Read More

9 Ways to Press the Easy Button for Scalability

Multiprocessor, TidalScale, in-memory, big data, software-defined server

In some recent blogs, we covered eight reasons why Software-Defined Servers can help reduce OpEx and CapEx, while helping data center managers extract maximum use and value from existing IT resources. And last week, we illustrated how you can achieve some startling real-world performance gains by implementing Software-Defined Servers.

Today, let’s look at how simple, straightforward and transparent Software-Defined Servers are. 

Read More

A Market Awakens to the Value of Software-Defined Servers

TidalScale, software-defined server


You may have seen last week’s announcement that TidalScale was named an IDC Innovator in a recent report on software-defined solutions in the data center. IDC Innovators: Virtualizing Infrastructure with Software-Defined Compute, 2017 (March 2017) calls out TidalScale for allowing enterprises to “reuse commodity servers currently in service as workload demands arise.” That’s a gloriously concise way to bottom-line the

Read More

300x Performance Gains Without Changing a Line of Code

TidalScale, software-defined server, in-memory performance

In Gary Smerdon's last post, he listed eight ways Software-Defined Servers can help reduce OpEx and CapEx, while helping data center managers extract maximum use and value from existing IT resources.

As vital as these benefits are to IT, operations, finance and other areas, the ability to scale your system to the size of your problem is just as beneficial to scientists and analysts – the people on the front lines of big data analytics. If you fall into that camp, then you're probably familiar with the dreaded "memory cliff."

Read More

For Cloud Infrastructure Providers, a Way to Do More than Ever Before

TidalScale, software-defined server, infrastructure, cloud

Guest Blog from Sheng Yeo, CEO of OrionVM, a partner of TidalScale

Cloud infrastructure providers today don’t have much flexibility when it comes to the systems they use. Resources devoted to running specific applications and workloads are generally confined to the limits of a single system, typically a “sweet spot” server that perhaps offers 24 cores and a few hundred gigs of memory. It's a matter of economics, really: 

Read More

Open Compute Rack & the Software-Defined Server

TidalScale, software-defined server, OCP

Let’s take a trip back in time. It’s 2009, and Facebook has just become the No. 1 social network in the United States.  In January of that year, Facebook reports it has 150 million users worldwide. Only eight months later, membership doubles to 300 million.

Read More

How To Avoid Writing Terrible Code

innovation, start ups

Writing quality code can be a challenge for any organization. At TidalScale, we go to great effort not to write terrible code. And while that might seem absurdly obvious, in a fast-growth environment, it doesn't exactly come easy. Here's some of what we do to make sure we come up with the good stuff.

Read More