The Ripple Effect

How to Get the Most from Your Cloud

infrastructure, cloud, Cloud Computing, cloud optimization


Read More

The Cloud as Rainmaker

TidalScale, software-defined server, Gartner Cool Vendor, Cloud Computing

In case you missed the news the other day, the analysts at Gartner have just named TidalScale a Cool Vendor in its 2017 Cloud Computing report. This latest report from one of the world’s most respected research outfits affirms once more that the time is right for TidalScale’s Software-Defined Servers, which bring flexibility to modern data centers by enabling organizations to right-size servers on the fly. (I say “affirms once more” because TidalScale was named an IDC Innovator just a few weeks ago.)

Read More

Why Wait for HPE’s The Machine?

TidalScale, software-defined server, in-memory performance, infrastructure

Today, Hewlett Packard Enterprise (HPE) unveiled a prototype of a massive server designed around memory – 160TB of it in fact. In announcing this concept system, which is the latest project in HPE’s research effort known as The Machine, HPE chief Meg Whitman reasoned, “We need a computer built for the Big Data era.”

Read More

3 Secrets to Right-Sizing a Server

software-defined server, in-memory performance

I’ve grown accustomed to the stares of disbelief. It usually starts like the conversation I had the other day with some folks from a leading North American insurance company. They were planning to roll out an advanced new analytic model. Trouble was, they had no way to predict how much compute or memory capacity they’d need.

Read More

Predicting Yesterday’s Weather

Large memory, software-defined server, in-memory performance

We can never predict tomorrow’s weather with 100 percent reliability (at least not yet), but we can predict yesterday’s weather with 100 percent certainty.

What does this have to do with anything?

Well, it turns out that meteorologists aren’t the only people who use historical data in an attempt to predict reasonable futures. 
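
In data center capacity planning, the same idea shows up as trend extrapolation. Here is a purely illustrative Python sketch (the monthly peak-memory figures are invented, not customer data) that fits a line to historical usage and projects it forward; the projection is only as good as the assumption that yesterday’s pattern will hold.

    import numpy as np

    # Invented peak memory usage (GiB) observed over the last six months.
    months = np.arange(6)
    peak_gib = np.array([310, 342, 365, 401, 446, 480])

    # Fit a straight line to history and extrapolate three months ahead.
    slope, intercept = np.polyfit(months, peak_gib, deg=1)
    for m in range(6, 9):
        print(f"Month {m}: projected peak ≈ {slope * m + intercept:.0f} GiB")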

Read More

Why Not Just Build a Bigger Box?

software-defined server

Dr. Ike Nassi founded TidalScale on the premise of aggregating the resources available across one to many commodity servers so they can handle huge database, graph, simulation and analytics computations entirely in memory.

Read More

The Trouble with Hadoop

TidalScale, software-defined server, hadoop

Whenever IT folks talk about handling their big data problems by scaling out with Hadoop, I tend to think of the 1986 comedy “Big Trouble in Little China.” It chronicles the mishaps that ensue when a trucker gets dragged into a mystical battle in Chinatown. It’s kind of awful, but with John Carpenter in the director’s chair and Kurt Russell on the screen, it still delivers some laughs.

Read More

Why You Need a BFC (Part 2)

big data, software-defined server, in-memory performance, infrastructure

Last week, I looked at some of the compelling reasons for transforming a set of commodity servers into a big flexible computer, or BFC.  At TidalScale, we call this a Software-Defined Server -- a single virtual machine that spans multiple nodes and makes all of their aggregated resources available to the application. But for today’s blog, it’s BFC all the way.
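
If you’re curious what that aggregation looks like from the application’s point of view, here is a minimal Python sketch (assuming a Linux guest and standard OS queries; nothing here is a TidalScale-specific API). On an ordinary server these numbers are bounded by a single box; on a Software-Defined Server the same calls would report the combined cores and memory of every node in the aggregate.

    import os

    # Logical CPUs the OS exposes to applications.
    cores = os.cpu_count()

    # Total physical memory, from POSIX sysconf values (Linux).
    total_gib = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 2**30

    print(f"Cores visible to this application:  {cores}")
    print(f"Memory visible to this application: {total_gib:.1f} GiB")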

Read More

Why You Need a BFC (Part 1)

TidalScale, virtualization, in-memory performance, data center

If you’re familiar at all with TidalScale, then you know we believe people should fit the computer to the problem, rather than the other way around.  We believe in new technologies that can be adopted easily, in leveraging advances in cost-effective hardware, and in automation. We believe you shouldn’t have to invest in new hardware to solve large or difficult computational problems. We believe commodity, industry-standard technologies hold remarkable power and possibilities that are just waiting to be tapped.

Read More

9 Ways to Press the Easy Button for Scalability

Multiprocessor, TidalScale, in-memory, big data, software-defined server

In some recent blogs, we covered eight reasons why Software-Defined Servers can help reduce OpEx and CapEx, while helping data center managers extract maximum use and value from existing IT resources. And last week, we illustrated how you can achieve some startling real-world performance gains by implementing Software-Defined Servers.

Today, let’s look at how simple, straightforward and transparent Software-Defined Servers are. 

Read More

A Market Awakens to the Value of Software-Defined Servers

TidalScale, software-defined server


You may have seen last week’s announcement that TidalScale was named an IDC Innovator in a recent report on software-defined solutions in the data center. IDC Innovators: Virtualizing Infrastructure with Software-Defined Compute, 2017 (March 2017) calls out TidalScale for allowing enterprises to “reuse commodity servers currently in service as workload demands arise.” That’s a gloriously concise way to bottom-line the

Read More

300x Performance Gains Without Changing a Line of Code

TidalScale, software-defined server, in-memory performance

In Gary Smerdon’s last post, he listed eight ways Software-Defined Servers can help reduce OpEx and CapEx, while helping data center managers extract maximum use and value from existing IT resources.

As vital as these benefits are to IT, operations, finance and other areas, the ability to scale your system to the size of your problem is just as beneficial to scientists and analysts – the people on the front lines of big data analytics. If you fall into that camp, then you’re probably familiar with the dreaded “memory cliff.”
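
A back-of-the-envelope sketch of that cliff, in Python (the Linux-only sysconf calls and the example working-set sizes are assumptions for illustration only): once a working set outgrows physical RAM, the job doesn’t slow down gracefully, it starts paging.

    import os

    def fits_in_memory(working_set_gib: float, headroom: float = 0.8) -> bool:
        # Rough check: keep the working set under ~80% of physical RAM.
        total_gib = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 2**30
        return working_set_gib <= headroom * total_gib

    # Hypothetical working sets; on a 512 GiB machine, the last one goes over the cliff.
    for size_gib in (100, 500, 1500):
        verdict = "fits in memory" if fits_in_memory(size_gib) else "over the cliff (paging)"
        print(f"{size_gib:>5} GiB working set: {verdict}")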

Read More

For Cloud Infrastructure Providers, a Way to Do More than Ever Before

TidalScale, software-defined server, infrastructure, cloud

Guest Blog from Sheng Yeo, CEO of OrionVM, a partner of TidalScale

Cloud infrastructure providers today don’t have much flexibility when it comes to the systems they use. Resources devoted to running specific applications and workloads are generally confined to the limits of a single system, typically a “sweet spot” server that perhaps offers 24 cores and a few hundred gigs of memory. It's a matter of economics, really: 

Read More

Open Compute Rack & the Software-Defined Server

TidalScale, software-defined server, OCP

Let’s take a trip back in time. It’s 2009, and Facebook has just become the No. 1 social network in the United States.  In January of that year, Facebook reports it has 150 million users worldwide. Only eight months later, membership doubles to 300 million.

Read More

How To Avoid Writing Terrible Code

innovation, start ups

Writing quality code can be a challenge for any organization. At TidalScale, we go to great lengths not to write terrible code. And while that might seem absurdly obvious, in a fast-growth environment it doesn't exactly come easy. Here's some of what we do to make sure we come up with the good stuff.

Read More

8 More Benefits of Software-Defined Servers

TidalScale, software-defined server

In my last blog, I touched on some of the benefits that make Software-Defined Servers the crucial missing piece of the software-defined data center.  Now it’s time to look closer at why Software-Defined Servers are so beneficial, and to whom.

Read More

The One Word That Kills Startups

innovation, culture, success factors, start ups, team building, entrepreneurship

One word can kill a startup:

Read More

The One Thing Every Data Center Needs (But Doesn't Have)

TidalScale, software-defined server

Most data center managers – and even many end users – are familiar with Software-Defined Networking and Software-Defined Storage. These battle-tested approaches to virtualizing existing assets make it easier for resources to zig when workloads zag. They introduce significant flexibility into the data center, which is a win for practically everyone involved.

But one piece has been conspicuously missing from the software-defined puzzle: the server.

Read More

The Secret to Keeping Your R Code Simple

R

When the demands of Big Data analytics surpass the core count and memory available on your biggest server, you’re usually left with three dismal options:  spend money you don’t have on new hardware; devote time you can’t spare rewriting code to run across clusters; or delay insights you can’t put off by shrinking the size of your problems to fit the limits of your hardware.

Read More

3 Ways to Amplify Container Performance

TidalScale, linux containers, software-defined server

A recent survey of 310 IT professionals found that container production operations have nearly doubled in the past year. Container technology is popular because it provides efficient utilization of isolated resources without all the overhead of traditional virtualization.
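
As a small, hedged illustration of that resource isolation, the Python sketch below reads the memory limit a running container has been given from the Linux cgroup interface (the file paths cover cgroup v2 and v1; both the paths and the environment are assumptions for the example, not anything specific to the survey or to TidalScale).

    from pathlib import Path

    # Memory-limit files for cgroup v2 and v1, checked in that order.
    CANDIDATES = [
        Path("/sys/fs/cgroup/memory.max"),                    # cgroup v2
        Path("/sys/fs/cgroup/memory/memory.limit_in_bytes"),  # cgroup v1
    ]

    def container_memory_limit() -> str:
        for path in CANDIDATES:
            if path.exists():
                raw = path.read_text().strip()
                if raw == "max":  # cgroup v2 reports "max" when no limit is set
                    return "no limit set"
                return f"{int(raw) / 2**30:.2f} GiB"
        return "no cgroup memory controller found"

    print("Memory limit for this container:", container_memory_limit())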

Read More