
The TidalScale Blog

    Meet your new database machine

    Authored by: Gary Smerdon

     


    Data growth is staggering and shows no signs of tapering off. This current environment of accelerating data velocity and volume is forcing companies to rethink not only how they operate and what they can support, but also their business strategy as a whole. 

    With data-driven organizations 23 times more likely to acquire customers than their peers, it’s no wonder more businesses are doing their best to plan for database sizes that are well beyond what they’ve seen in the past. 

    But how does an organization build an Oracle Database, SAP HANA, or Sybase IQ infrastructure that can support this changing environment? And how can it plan for unprecedented in-memory workloads? 

    It’s a dilemma every company must solve, or risk falling behind.

    Startup tech is shaking up the norm

    Traditionally, to outfit an in-memory database environment for the future, a company would spend several months and hundreds of thousands of dollars on an extensive system-sizing exercise to ensure its database environment has enough memory for the next three to five years, then invest in another large database system with all its associated infrastructure and maintenance expenses. It’s an arduous and costly process with shaky guarantees of long-term sustainability and a near certainty of repeating the exercise in just a few years. 

    However, more and more companies are looking for ways to optimize and streamline their current infrastructure without a major new purchase or database overhaul. And they’re looking to startups to help them out. According to the CIO Innovation Index, spending on disruptive technology solutions from emerging startups will increase 50% in the next 12 months. 

    The future is software-defined

    In the world of database machines, one of those disruptions is the Software-Defined Server.  

    The default for most companies is to invest hundreds of thousands, and often millions, of dollars in a large, costly database machine. TidalScale Software-Defined Servers provide an attractive and affordable alternative to this heavy lift, offering the same or better performance at a much lower cost with a solution that leverages existing industry-standard servers. 

    How? By combining all the resources of multiple x86 commodity servers into Software-Defined Servers of virtually any size to handle any in-memory database workload. 

    As many customers discover, the benefits are transformative. In one recent implementation for a multinational bank, TidalScale matched the capacity and capabilities of three large traditional Oracle Database systems (one dedicated to production, one to non-production, and a third to disaster recovery) at a fraction of the cost and deployment time of traditional database platforms. 

    Using commodity two-socket servers, TidalScale mirrored the bank’s three large 6TB Oracle racks in memory (18TB total) with fewer cores (between 48 and 144, vs. 576 on the traditional systems) and less than one-tenth the power consumption. In fact, TidalScale supported all three separate environments in just one-third of a single rack.

    For Oracle Database users in particular, TidalScale’s lower core count and grow-on-demand model translate to significant TCO savings, because organizations aren’t paying to support and operate their “three years from now” system until they actually need it three years from now.

    For details on this breakthrough implementation, read our solution brief.

    An array of businesses, from Fortune 500 manufacturers to leading data analytics firms, are discovering that Software-Defined Servers remove the constraints of existing infrastructure, delivering the benefits of an on-demand, composable environment without the burden of vendor lock-in and massive procurement budgets. The result: organizations can optimize their environment to meet data demands as they arise.

    Only companies that can harness the power of their data can truly future-proof their business. And as many businesses are realizing, old-school technology solutions are often a poor fit for new technology challenges.

     

    LEARN MORE >


    Topics: virtualization, software-defined server, in-memory computing, data growth, R & Python on Ubuntu Linux, in-memory analytics, data analytics