The TidalScale Blog

Chuck Piercey

Recent Posts

At Oracle OpenWorld, See How You Can Optimize Oracle DB Performance

Oracle OpenWorld, memoptimize

For Oracle DBAs, the growth of data is proving to be an unrelenting problem, with data volumes persistently overwhelming the amount of memory available to process them. That impacts the performance of Oracle Database In-Memory, delaying insights and hampering productivity.

Read More

8 Reasons Software-Defined Servers are Great for DevOps

devops, software-defined server

New demands on DevOps environments—from soaring data volumes to increasingly unpredictable workloads—have IT managers searching for new ways to extract more utility and cost efficiency from their data centers and clouds.

Read More

TidalScale's Year in the News

TidalScale, software-defined server, infrastructure, in-memory performance, server, composable infrastructure

May (the month, not the guitarist from Queen) has always been synonymous with endings and beginnings. May marks the end of autumn in the Southern Hemisphere and the practical start of summer up north. It’s a big month for graduations (another ending), and just as big for weddings. You get the idea.

So with schools about to let out for summer, I thought we should look at our own report card.

Read More

Deploy a "Supercomputer" in Minutes, Pay by the Hour

hpc, supercomputing, software-defined data center, software-defined server, SC17, TidalScale, TidalScale WaveRunner, composable infrastructure

As the high-performance computing (HPC) community prepares to descend on Denver for SC17 next week, its members will arrive in the Mile High City with more baggage than the usual rolling carry-on. They'll also be packing some long-held expectations. One of these is that it's more or less impossible to create a real HPC system—a massive single system image—in the cloud. I fully anticipate they will leave Denver with the opposite expectation.

Read More

The Magic of Hardware that Isn't

virtualization, software-defined data center, software-defined server, prickett morgan

“We didn’t believe it either. But the TidalScale team is not fooling around here.”

These are two of my favorite sentences in Timothy Prickett Morgan's excellent recent piece for The Next Platform, in which he details the longtime quest to achieve "a big ole flat memory space that is as easy to program as a PC but brings to bear all that compute, memory and I/O of a cluster as a single system image."

Read More

4 Compelling DevOps Advantages of Software-Defined Servers

devops, CloudExpo, containers, docker, kubernetes, software-defined data center, software-defined server

Forrester recently called cloud computing "the most exciting and disruptive force in the tech market in the last decade." We would agree.

That's why we're exhibiting and presenting at CloudExpo, which runs through Nov. 2 at the Santa Clara Convention Center. 

Visit us in Booth #309

Come see us in Booth #309 (just behind the huge IBM booth at the entrance to the exhibit area) to see how you can right-size your cloud server resources to fit any data set or workload.

Software-Defined Servers are growing popular with large manufacturers, financial services firms and other innovators because of four key benefits to DevOps:

Read More

Why Software-Defined Servers & Storage are a Match Made in Data Center Heaven

storage visions, software-defined data center, software-defined server, software-defined storage

I presented at the recent Storage Visions conference, which gave me an opportunity to think about how rapidly Software-Defined Servers are evolving and how perfectly they fit with software-defined storage architectures.

Read More

4 Ways to Right-Size a Server Right Now

Cloud Computing, High Performance Computing, Oracle Bare Metal Cloud, Oracle Cloud Infrastructure, OrionVM, IBM BlueMix, software-defined server, right-sizing, TidalScale WaveRunner

Four Options for Harnessing the Flexibility & Power of Software-Defined Servers

You may have caught our big news at Oracle OpenWorld 2017 – that TidalScale has teamed up with Oracle Cloud Infrastructure to enable the world’s largest servers. It’s a big deal for us, but an even bigger deal for users of Oracle Cloud who have found they need more compute, memory, storage or I/O than any single bare metal server in the Oracle Cloud can provide.

Read More

How WaveRunner Puts You in Control of a Better SDDC

Large memory, virtualization, software-defined data center, in-memory computing, Amazon EC2, AWS, Mellanox, Cumulus, Ubiquity, Juniper

TidalScale’s WaveRunner – the point-and-click control panel that makes creating a right-sized Software-Defined Server fast, flexible and easy – isn’t just about creating one or more virtual servers from multiple commodity systems. It also puts you in control of all the software-defined pieces in the data center. So in addition to cores and memory, WaveRunner allows you to monitor and manage storage and networks. You simply pick the software-defined resources you need and plug them together.

Read More

3 Secrets to Right-Sizing a Server

software-defined server, in-memory performance

I’ve grown accustomed to the stares of disbelief. It usually starts like the conversation I had the other day with some folks from a leading North American insurance company. They were planning to roll out an advanced new analytic model. Trouble was, they had no way to predict how much compute or memory capacity they’d need.

Read More

9 Ways to Press the Easy Button for Scalability

software-defined server, big data, Multiprocessor, in-memory, TidalScale

In some recent blogs, we covered eight reasons why Software-Defined Servers can help reduce OpEx and CapEx, while helping data center managers extract maximum use and value from existing IT resources. And last week, we illustrated how you can achieve some startling real-world performance gains by implementing Software-Defined Servers.

Today, let’s look at how simple, straightforward and transparent Software-Defined Servers are. 

Read More

Focus on Possibilities, Not Limits

TidalScale, Amdahl, Gustafson, Multiprocessor, Large memory

Computer Science is obsessed with "negative results" and "limits." We seem to delight in pinpointing a terminus for virtually any technology or architecture – to map the place where the party ends.

Take, for example, Amdahl's Law, which seems to suggest that once you reach a certain point, parallelism doesn't help performance. Amdahl's Law has kept many from believing that a market exists for bigger single systems, since the law leads us to conclude that larger multicore systems won't solve today's problems any faster.
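For readers who want the formula behind this argument, here is the standard statement of Amdahl's Law, along with Gustafson's Law (the counterpoint hinted at in the tags above). These are textbook forms, not equations from the original post:

```latex
% Amdahl's Law: for a fixed-size problem with serial fraction s,
% the speedup on N processors is bounded by 1/s no matter how large N grows.
S_{\mathrm{Amdahl}}(N) \;=\; \frac{1}{\,s + \dfrac{1-s}{N}\,}
\;\xrightarrow{\;N \to \infty\;}\; \frac{1}{s}

% Gustafson's Law: if the problem size grows with N (the scaled-speedup view),
% the achievable speedup keeps growing with the processor count.
S_{\mathrm{Gustafson}}(N) \;=\; N - s\,(N - 1)
```

The difference in assumptions is the whole debate: Amdahl fixes the problem size as processors are added, while Gustafson lets the problem grow to fill the machine.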

Beware the Intuitively Obvious

The reasoning behind Amdahl’s Law turns on an assumption that all the parts of the problem must interact in such a way that

Read More

Test Drive TidalScale for Yourself

TidalScale, Test Drive

TidalScale has built and operates very large (1.5 TB to 3.0 TB RAM) TidalPods that customers can use to test and evaluate their applications on a single, virtualized, very large system. We call these proof-of-concept (PoC) systems, and we're making them available to you for a limited time, at no charge, to test your application(s) in a large-scale environment. The basic testing program:

Read More

Tips for Scaling Up Open Source R

R, TidalScale


When we started searching for large-scale open-source R benchmarks, we were surprised to find few good workloads for multi-terabyte TidalScale systems. We ended up writing our own R benchmark that let us scale R workloads to arbitrarily large in-memory sizes. In the process we learned a few tips and tricks for running large workloads with open-source R, which we thought we'd share.

Read More

Scale Up Open Source R on TidalScale

R, TidalScale

Like many statistical analytic tools, R can be incredibly memory intensive. A simple GAM (generalized additive model) or k-nearest-neighbor routine can devour many multiples of the starting dataset's size in memory. And R doesn't always behave nicely when it runs out of memory.
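To see why a routine can devour many multiples of its input, consider a naive k-nearest-neighbor implementation that materializes a full pairwise distance matrix: the dataset needs n×d values, but the distance matrix needs n×n. Here is a back-of-the-envelope sketch in Python (the row and column counts are illustrative, not from the post):

```python
# Rough memory arithmetic for a naive KNN that builds a dense
# n x n pairwise distance matrix of double-precision floats.

def dataset_bytes(n_rows, n_cols, bytes_per_value=8):
    """Memory for the raw dataset: one double per cell."""
    return n_rows * n_cols * bytes_per_value

def pairwise_distance_bytes(n_rows, bytes_per_value=8):
    """Memory for a dense n x n pairwise distance matrix."""
    return n_rows * n_rows * bytes_per_value

n, d = 1_000_000, 50                 # 1M rows, 50 features (hypothetical)
data = dataset_bytes(n, d)           # 400 MB for the dataset itself
dists = pairwise_distance_bytes(n)   # 8 TB for the distance matrix

print(f"dataset:         {data / 1e9:.1f} GB")
print(f"distance matrix: {dists / 1e12:.1f} TB")
print(f"blow-up factor:  {dists // data}x")
```

The quadratic n×n term is what makes the working set dwarf the input: here the intermediate structure is 20,000 times the size of the data it was computed from, which is exactly the kind of blow-up that exhausts memory on a conventionally sized server.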

Read More

Simulation Applications on TidalScale

TidalScale

Executing and analyzing large, complex models pose unique computing challenges. And these challenges are only growing as companies face the need to process, analyze, and act on ever-increasing amounts of data. Typically, IT professionals are faced with two options: scale up by purchasing extremely expensive specialized computers, or scale out by rewriting applications using complex distributed algorithms for running on clusters of standard hardware. One costs money, the other costs time. And in today's budget-conscious, real-time world, few organizations can afford either.

Read More
