
The TidalScale Blog

    Data is more valuable than oil

    Authored by: Gary Smerdon


    It’s not surprising that data has surpassed the value of precious resources like gold or oil. After all, modern businesses run on data. So the need to understand and maximize its value will only grow more imperative.

    Consider how professional sports has been transformed--beginning, naturally, with Billy Beane. The former Oakland A’s general manager famously eschewed old-school scouting criteria (like batting average or a player’s dominant hand) in favor of analytics-driven metrics, such as on-base percentage and slugging percentage, that virtually no other club considered meaningful. A player’s physical attributes and power factored far less into his worth. By using this “Sabermetrics” system to acquire undervalued players for the cash-strapped A’s franchise, Beane built a winning team and revolutionized the business of sports.

    Fast forward to today, and we see how extensive that transformation has become. Take Draymond Green of the NBA's Golden State Warriors. If players were valued simply on the time-honored metric of points per game (PPG), then Green wouldn't be pulling down $100 million contracts with one of the most successful teams in the nation. But he does, because analysis of relatively new metrics like career win share uncovers previously overlooked yet lasting value to the team--and reveals how data has become just as important to the success of professional sports franchises as a state-of-the-art venue.


    Transformation across the board

    It’s not just happening in sports. Amazon, Facebook, Alphabet and Netflix all run businesses that essentially rely on data as a precious raw material.

    But other, more traditional businesses are also harnessing the value of data. It turns out data is the new oil even in the oil and gas industry, where companies are drilling into data to cut operating costs, do a better job identifying drill sites, and reduce business risk. But just how much is data worth to energy companies? Plenty. According to a 2017 World Economic Forum report, digital transformation will add $1.7 trillion of value to the oil and gas industry by 2025.

    Meanwhile, businesses in banking, brick-and-mortar retail, and even flooring manufacturing are finding new ways to monetize data. For example, supermarket giant The Kroger Co. generates $100 million in incremental revenue per year by selling its inventory and point-of-sale data and “making that available as a syndicated data provider."


    Where business and IT agendas align

    In a presentation at the upcoming Gartner Infrastructure, Operations and Cloud Strategies Conference this month in Las Vegas, we’ll be exploring how existing hardware limitations--exacerbated by the death of Moore’s Law--are preventing businesses from extracting the greatest value from their data on a timely basis. Big data workloads are overwhelming even the largest servers. IT departments waste money and time on elaborate sizing exercises to determine their needs three to five years from now, culminating in expensive server investments.

    “Gartner research and anecdotal evidence both point to a gap between near-term priorities of CIOs and VPs of infrastructure,” notes Daniel Bowers, Research Director at Gartner, Inc. “The CIO’s goal is to grow revenue, while the infrastructure group focuses on cutting costs. This gap is widening year after year.”

    So the CIO is caught between the goal to grow revenue and the need to cut costs, and is tasked with aligning these often conflicting agendas. 

    Sounds easy enough, right? Well, it actually can be. 


    A next-level approach to maximizing data value

    Both agendas can be addressed by removing the limitations of traditional hardware. Achieving that requires a flexible infrastructure that accommodates fluctuating and unpredictable workloads. Software-defined storage and networks are both well established in modern data centers and in the cloud, but servers remain fixed assets. To bring the same kind of on-demand flexibility to inflexible servers requires a next-level approach.

    TidalScale’s Software-Defined Server technology overcomes the constraints of today’s hardware by combining existing servers into virtual servers that can be sized on demand to match any workload. By achieving this with industry-standard, commodity servers on premises and in the cloud, TidalScale helps CIOs reduce time to insight for the business while helping IT keep operating costs down. Suddenly, those two agendas are no longer at odds.


    See us at #GartnerIO

    We’re excited to be bringing this groundbreaking solution to Gartner’s conference next week. And we’re even more excited to welcome a Fortune 500 CIO to the stage to share how his firm is using data to maximize revenue while cutting IT costs.

    If you’re headed to Vegas, be sure to visit us at Booth #366 and mark your calendars for our speaking session on Tuesday, Dec. 10th at 3:45 p.m. PT in Murano 3301A.

    Even if you don’t have plans to be at #GartnerIO, not to worry. Check out our latest paper featuring research from Gartner. It highlights how TidalScale is the missing piece in Gartner’s vision of an intelligent, flexible, on-demand infrastructure.




    Topics: TidalScale, big data, software-defined server, data growth