Last week at Silicon Valley’s storied Computer History Museum, some 200 people came together to see what’s next.
In January, I argued that 2018 is shaping up to be the year of the Software-Defined Server. I pointed to a number of reasons why:
- Data is growing rapidly, putting pressure on IT infrastructures that simply aren’t built to keep up.
- To act on all that data quickly, businesses need to analyze it entirely in memory, which can be roughly 1,000 times faster than flash storage.
- Today’s on-premises and cloud data centers typically aren’t equipped with servers that can provide a single instance of memory large enough to accommodate many data sets.
It has been over 50 years since Gordon Moore observed that transistor density doubles roughly every two years. Over the decades, the interpretation of “Moore’s Law” has evolved to mean that the performance of microprocessors, and of computers in general, doubles every 18 months.
With data volumes growing at a whopping 62 percent CAGR, it’s easy to see why some organizations are worried about keeping up. They’re trying to process and analyze bigger and more complicated problems, which increasingly stresses their computing resources. Eventually, those systems hit the limits of what they can handle.
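To see how quickly a 62 percent CAGR compounds, here is a quick back-of-the-envelope sketch. The 62 percent rate comes from the post above; everything else is plain arithmetic.

```python
import math

# At a 62% compound annual growth rate, data volume multiplies
# by 1.62 each year.
cagr = 0.62

for years in (1, 3, 5):
    growth = (1 + cagr) ** years
    print(f"{years} year(s): {growth:.1f}x the original volume")

# Doubling time: solve (1 + cagr)^t = 2  ->  t = ln 2 / ln(1 + cagr)
doubling_years = math.log(2) / math.log(1 + cagr)
print(f"Data doubles roughly every {doubling_years:.2f} years")
```

At that rate, data volume grows more than tenfold in five years and doubles in well under a year and a half, which is why infrastructures built for yesterday’s volumes fall behind so fast.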
As the high-performance computing (HPC) community prepares to descend on Denver for SC17 next week, its members will arrive in the Mile High City with more baggage than the usual rolling carry-on. They’ll also be packing some long-held expectations. One of these is that it’s more or less impossible to create a real HPC system—a massive single system image—in the cloud. I fully anticipate they will leave Denver with the opposite expectation.
“We didn’t believe it either. But the TidalScale team is not fooling around here.”
These are two of my favorite sentences from Timothy Prickett Morgan’s excellent recent piece for The Next Platform, in which he details the longtime quest to achieve “a big ole flat memory space that is as easy to program as a PC but brings to bear all that compute, memory and I/O of a cluster as a single system image.”
Forrester recently called cloud computing "the most exciting and disruptive force in the tech market in the last decade." We would agree.
Visit us in Booth #309
Come see us in Booth #309 (just behind the huge IBM booth in the entry of the exhibit area) to see how you can right-size your cloud server resources to fit any data set or workload.
Software-Defined Servers are growing in popularity with large manufacturers, financial services firms and other innovators because of four key benefits for DevOps…
How the WaveRunner API Enables Tomorrow’s SDDC Innovation, Today
Guest blog post by Chris Busse, CTO at APIvista
In my consulting work, I encourage enterprises of many sizes to use standardized APIs across their business areas. This means I’m often called upon to explain what an application programming interface is to non-technical stakeholders.
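For readers who want a concrete picture, here is a deliberately toy sketch of the idea behind an API: a published contract (name, inputs, outputs) that callers use without knowing the implementation behind it. The function name, data, and account IDs below are invented purely for illustration and are not part of any real system.

```python
# A hypothetical, minimal illustration of an API. The "interface" is
# the function signature; the internals stay hidden from callers.

def get_account_balance(account_id: str) -> float:
    """The published contract: callers know the name, the input
    (an account ID), and the output (a balance) -- nothing else."""
    # In real life this would be a database or service lookup;
    # here it's a stand-in table. Callers never see this detail.
    _fake_ledger = {"acct-001": 2500.00, "acct-002": 125.50}
    return _fake_ledger.get(account_id, 0.0)

# Any program, in any department, can call the same contract:
balance = get_account_balance("acct-001")
print(balance)
```

The value for non-technical stakeholders is the standardization: as long as the contract stays stable, the team behind it can change the internals without breaking anyone who depends on it.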
Some developments challenge our accepted notions of what things are and how they’re created. Uber and Airbnb, for instance, caused us to rethink what transportation and lodging mean in the new sharing economy. Then there’s Amazon, one of the granddaddies of disruption.
TidalScale’s WaveRunner – the point-and-click control panel that makes creating a right-sized Software-Defined Server fast, flexible and easy – isn’t just about creating one or more virtual servers from multiple commodity systems. It also puts you in control of all the software-defined pieces in the data center. So in addition to cores and memory, WaveRunner allows you to monitor and manage storage and networks. You simply pick the software-defined resources you need and plug them together.
I’ll be the first to acknowledge that there’s a lot to the TidalScale story. Our Software-Defined Servers enable organizations to right-size servers on the fly to fit any data set. Creating one is fast, flexible and easy. With TidalScale, you can…
Earlier this year, IDC surveyed 301 IT users from medium-sized and large enterprises, asking them questions that allowed the research firm to determine the relative efficiency of those data centers. (For reference, the average data center contained 386 blades and servers, while the largest third of those surveyed averaged 711 blades and servers.)
Last week, I explored some of the key issues and core benefits that are prompting enterprises to move to more flexible and cost-effective composable infrastructures. As I pointed out in Part 1 of this blog, composable infrastructure technologies from vendors like TidalScale are designed to address many of the most pressing issues in today’s data centers…
Part 1: The Need for Composable Infrastructure
Businesses need new approaches to infrastructure design to keep up with the volume of data being generated, data whose timely analysis is paramount to remaining competitive in the digital economy. Newer approaches to infrastructure must focus on efficiency…
A new report projects that the global Software-Defined Data Center (SDDC) market will grow at a 22 percent compound annual growth rate through 2021. The authors, an India-based outfit called Wise Guy Consultants, estimate that the total market for SDDC goods and services will reach $81.4 billion by the end of that period.