Some developments challenge our accepted notions of what things are and how they’re created. Uber and Airbnb, for instance, caused us to rethink what transportation and lodging mean in the new sharing economy. Then there’s Amazon, one of the granddaddies of disruptors.
On Oct. 4, 1995, back when the Internet was still called the World Wide Web, this upstart Seattle bookseller welcomed online customers with a long letter explaining that if its inventory of 1 million books were sold in a store, the resulting shop would have been 40 times larger than a typical mall bookstore. To sell them by mail, you’d have needed a catalog the size of seven New York City phone books.
We all know where Amazon ended up (everywhere), and how it disrupted not just book and media sales, but sale of virtually any physical goods, from lawn mowers and mattresses to computers and TVs (today, visiting a store like Best Buy is known in the retail world as “showrooming” – kicking the tires of real-life products that you fully intend to go home and purchase from Amazon). Amazon caused us to reframe our understanding of shopping.
In the IT realm, disruptors like cloud computing have prompted us to recalibrate our understanding of computers themselves. Even here, Amazon continues to shake things up, despite its humble beginnings as a book peddler. I mean, it’s unlikely anyone would have looked at this web page captured from Amazon’s early days and thought, “One day, I’m totally going to entrust critical enterprise IT operations, services and data to these guys!”
And yet that’s what’s happened with Amazon Web Services, launched a few years after this snapshot was taken. Today AWS hosts websites, cloud services, ERP systems and more for over a million companies, non-profits, universities and even governments. And through its Elastic Compute Cloud service, AWS allows anyone to fashion a kind of software-defined data center from a limited inventory of servers, storage and network.
This wouldn’t have been possible without disruptive technologies like virtualization and containers. In fact, as a new TidalScale white paper explains, containers have some distinct advantages over traditional virtualization; in particular, because they share a common kernel, they can isolate and transport workloads far more efficiently than a full virtual machine can. This means memory management, the filesystem, networking and other core pieces of a server can easily be shared rather than duplicated. It also allows a server’s memory and CPU allocations to be set independently of the server hardware and mixed with one another to fit workloads together.
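The idea of sizing CPU and memory allocations independently and then packing them together can be sketched as simple first-fit placement of container resource requests onto hosts of fixed capacity. This is an illustrative model only; the `Host` and `place` names are invented for the example and are not part of any container platform’s API.

```python
# Illustrative sketch: first-fit packing of container resource requests
# (CPU cores, memory GB) onto hosts with fixed capacity. Host and
# place() are invented names for this example, not a real API.
from dataclasses import dataclass, field

@dataclass
class Host:
    cpus: float            # total CPU cores on this host
    mem_gb: float          # total memory in GB on this host
    placed: list = field(default_factory=list)

    def fits(self, cpus, mem_gb):
        used_cpu = sum(c for c, _ in self.placed)
        used_mem = sum(m for _, m in self.placed)
        return used_cpu + cpus <= self.cpus and used_mem + mem_gb <= self.mem_gb

def place(hosts, requests):
    """First-fit: put each (cpu, mem) request on the first host it fits on."""
    unplaced = []
    for req in requests:
        for host in hosts:
            if host.fits(*req):
                host.placed.append(req)
                break
        else:
            unplaced.append(req)
    return unplaced

hosts = [Host(cpus=8, mem_gb=32), Host(cpus=8, mem_gb=32)]
# Mixed shapes: CPU-heavy and memory-heavy containers can share a host,
# since each dimension is allocated independently.
requests = [(6, 8), (1, 20), (4, 4), (2, 16)]
leftover = place(hosts, requests)
print(leftover)  # [] — all four requests fit across the two hosts
```

The point of the toy is that CPU-hungry and memory-hungry containers complement each other on the same hardware, which is exactly the “fit workloads together” benefit described above.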
All of that is great news for modern data center administrators. But containers aren’t without their downsides. Three significant shortcomings stand out:
- Limited mobility. Containers don’t take advantage of the virtualization extensions in modern processors that facilitate efficient mobility. This means migrating running workloads from one machine to the next isn’t possible the way it is with traditional virtualization.
- Complex orchestration. Adding and removing containers while servers are running should be relatively simple, but it’s not.
- Less-than-stellar security. Separating physical resources is Security 101 in today’s data centers, but container platforms offer no hardware-enforced separation between multiple containers operating on the same server. And introducing containers into a data center can prevent IT from relying on some of its tried-and-true security techniques.
In this latest white paper, we trace how users and data center administrators can overcome the limitations of containers with an equally disruptive technology: Software-Defined Servers.
Software-Defined Servers combine all the resources of multiple servers – CPUs, memory, storage and network – into one or more virtual machines. In the data center, Software-Defined Servers complement other software-defined resources, including storage and networks, turning a previously fixed resource into one that can be “right-sized” on the fly.
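The pooling described above can be sketched as aggregating the resources of several physical nodes so they present as a single larger machine. The class and function names below are invented purely for illustration and are not TidalScale’s actual interface.

```python
# Conceptual sketch only: pooling the CPUs and memory of several
# physical nodes into one aggregate "software-defined" server.
# Node and software_defined_server are invented illustrative names.
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    cpus: int      # CPU cores on this physical node
    mem_gb: int    # memory in GB on this physical node

def software_defined_server(nodes):
    """Present a cluster of physical nodes as one aggregate machine."""
    return Node(
        cpus=sum(n.cpus for n in nodes),
        mem_gb=sum(n.mem_gb for n in nodes),
    )

cluster = [Node(16, 128), Node(16, 128), Node(32, 256)]
big = software_defined_server(cluster)
print(big)  # Node(cpus=64, mem_gb=512)
```

The “right-sizing” benefit follows naturally: adding or removing a node from the list changes the size of the aggregate machine without touching any single piece of hardware.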
As it happens, the very design of TidalScale’s Software-Defined Servers works to counteract the challenges of containers. To find out how, read our new complimentary white paper, “Using Containers to Deliver an Efficient Private Cloud.”
Take TidalScale out for a Test Drive to see for yourself with your own applications.