4 Observations About the EMC VFCache Announcement and the SanDisk Acquisition of FlashSoft

The SSD caching market is on fire. This week, EMC announced VFCache, IBM announced XIV Gen3 SSD Caching, FlashSoft was acquired by SanDisk, and many storage vendors acknowledged that flash figures in their long-term roadmaps. Activity has certainly picked up, and it is worth exploring what is driving the rapid development of the SSD caching market.

Here are my 4 key takeaways:

1. Flash has arrived

    Reliability: Flash technology (particularly in the SSD form factor) has matured enough to be used in 24/7 enterprise environments. Through a combination of more reliable SSDs and better management of data on SSD, suppliers have created solutions that, while expensive, provide the bullet-proof reliability required in enterprise environments.

    Real results: Businesses have experienced the benefits of higher SSD IOPS in production environments. Higher SSD performance has translated to up to 10x or more improvement for applications such as databases (Oracle, SQL Server, MySQL, etc.), financial analysis, research, simulation, modeling, enterprise search, e-commerce, and social networking.

    Enterprise budgets: 2012 is the first year in which IT teams have dedicated budgets for deploying SSD.

2. The storage industry has accepted that storage architectures will change

    EMC's launch of VFCache and similar announcements from other storage vendors show that the industry has accepted a paradigm shift in data storage. Gone are the days when all data was stored and served from fibre-channel SAN arrays. Storage arrays will continue to be used to consolidate data for backup and sharing, but performance will be driven by a layer of flash in the server that holds a copy of primary data. Chris Mellor's article in The Register, "Inside the Mind of EMC, Is Storage Just a Launchpad," and David Floyer's article on Wikibon, "Designing Systems and Infrastructure in the Big Data IO Centric Era," provide more detail on this shift in storage architectures.

3. SSD caching offers an easier path to SSD deployment

    SSDs offer many advantages over HDDs, including much higher I/O performance, lower power consumption, and non-volatile memory. However, SSDs are expensive, and deploying them often disrupts existing data protection and data management infrastructure. Deploying SSD as a cache overcomes both limitations.

    Lower cost: By caching only important data on SSD and keeping primary data on lower-cost HDD, application performance can be increased at lower cost.

    Non-disruptive deployment: When SSD is deployed as a cache, existing primary storage remains unchanged. There is no need to pause or change existing backup and data management infrastructure.

    Best of both worlds: Deploying SSD as a cache provides the low latency and high performance of server-based SSD, while preserving the ability to uniformly protect and share primary data stored in the storage array.
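The caching pattern described in these three points can be sketched in a few lines of Python. This is a toy model, not any vendor's implementation: `backing_store` is a hypothetical stand-in for the HDD array, and an OrderedDict models the limited SSD capacity with LRU eviction.

```python
from collections import OrderedDict

class SSDReadCache:
    """Toy sketch of an SSD read cache in front of slower primary storage.

    `backing_store` stands in for the HDD/SAN array; the OrderedDict
    models the limited SSD capacity with least-recently-used eviction.
    """

    def __init__(self, backing_store, capacity_blocks):
        self.backing = backing_store
        self.capacity = capacity_blocks
        self.cache = OrderedDict()  # block_id -> data, in LRU order

    def read(self, block_id):
        if block_id in self.cache:        # hit: served from fast SSD
            self.cache.move_to_end(block_id)
            return self.cache[block_id]
        data = self.backing[block_id]     # miss: fetch from slow HDD
        self._admit(block_id, data)
        return data

    def write(self, block_id, data):
        # Write-through: primary storage stays authoritative, so existing
        # backup and data-protection infrastructure is unaffected.
        self.backing[block_id] = data
        if block_id in self.cache:
            self.cache[block_id] = data
            self.cache.move_to_end(block_id)

    def _admit(self, block_id, data):
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        self.cache[block_id] = data
```

Because writes go through to the array, the array always holds the authoritative copy of the data, which is why backup and replication tooling keeps working unchanged.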

4. SSD cache solutions will soon pack more intelligence

    SSD is not just a faster disk: data is structured, accessed, and protected in a fundamentally different way on SSD. Unfortunately, applications today are optimized for working with HDD and do not take into account the asymmetric read/write operations and the SSD wear caused by write/erase operations. One can expect that over time applications will be rewritten to account for the specific properties of SSD; however, that process will be slow and is likely to drag on for 10-15 years. SSD caching software is a natural translation layer that can optimize application data for SSD.

    Data optimization for SSD: The table below illustrates the asymmetric nature of SSD. Read speed is 10x faster than write speed and 80x faster than erase speed. Furthermore, erase operations wear down the SSD and shorten its life. An SSD performs best if the caching software structures data in a way that maximizes read operations and minimizes write and erase operations. SSD performance is also affected by the structure and size of the data blocks stored on it. Intelligent SSD caching software structures data blocks in a way that optimizes SSD performance and reliability while minimizing cost and wear.

         

                     $ per GB    Latency (microseconds)                  Mean Time to Failure
    Flash Memory     $1-$6       Read: 25 / Write: 250 / Erase: 2,000    3K-50K erase cycles
    Hard Disk        $0.10       3,000                                   5 years

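One way caching software can exploit this read/write/erase asymmetry is to coalesce small random writes in RAM and push them to the SSD in full erase-block-sized units, so each flash block is written once rather than partially rewritten (and re-erased) many times. A minimal sketch, with purely illustrative sizes and a plain list standing in for the SSD:

```python
class CoalescingWriteBuffer:
    """Toy sketch: batch small writes in RAM and flush them to the SSD
    as large sequential writes, so each erase block is programmed once
    instead of being partially rewritten (and re-erased) many times.
    Sizes are illustrative, not tuned for any real device.
    """

    def __init__(self, ssd, erase_block_bytes=128 * 1024):
        self.ssd = ssd                      # list of erase-block-sized writes
        self.erase_block = erase_block_bytes
        self.pending = bytearray()          # RAM staging area

    def write(self, data: bytes):
        self.pending += data
        # Flush only in whole erase-block units.
        while len(self.pending) >= self.erase_block:
            chunk = bytes(self.pending[:self.erase_block])
            self.ssd.append(chunk)          # one full-block write
            del self.pending[:self.erase_block]

    def flush(self):
        # Force out any partial block (e.g. on shutdown).
        if self.pending:
            self.ssd.append(bytes(self.pending))
            self.pending.clear()
```

With 128 KiB erase blocks, 1,024 scattered 512-byte writes become just four large sequential writes to flash instead of 1,024 small ones, which is exactly the read-heavy, write-light profile the table favors.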

    Performance: Existing caching algorithms were developed for HDD and do not account for the asymmetric properties of SSD (the blog post "Why Standard Cache Algorithms Won't Work For SSDs" by Prof. Qing Yang, CTO of VeloBit, discusses this in further detail). As a result, when used with SSD, existing caching algorithms do not perform well and shorten the life of the SSD. New caching algorithms, designed with SSD in mind, can increase SSD performance by 3-5x relative to existing caching solutions and can enable the use of less expensive SSD.
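As one illustration of an SSD-aware policy (a generic technique, not VeloBit's actual algorithm), a cache can decline to write a block to flash on its first miss and admit it only on a second touch within a recent window. Data that is read once, such as a backup or table-scan stream, then never consumes SSD write/erase cycles:

```python
from collections import OrderedDict

class SecondHitAdmission:
    """Sketch of an SSD-aware admission filter: a block is written to the
    SSD cache only on its second miss within a recent window, so
    read-once streams never burn SSD write/erase cycles.
    """

    def __init__(self, window=1024):
        self.window = window
        self.seen = OrderedDict()  # "ghost list" of recently missed block ids

    def should_admit(self, block_id) -> bool:
        if block_id in self.seen:       # second touch: worth caching on SSD
            del self.seen[block_id]
            return True
        self.seen[block_id] = None      # first touch: remember, don't admit
        if len(self.seen) > self.window:
            self.seen.popitem(last=False)  # forget the oldest miss
        return False
```

A standard LRU cache would write every missed block to the SSD; a filter like this trades a slightly lower hit rate on cold data for far fewer flash writes and erases.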

    Data pattern intelligence: Tiering and caching solutions today prioritize data based on recency or frequency of use and, as a result, are not very effective at predicting future data use. The intelligent SSD caching software of the future will self-tune based on the data access pattern to maximize performance for every application.

What do you think about the recent wave of SSD announcements and acquisitions? Where do you think the industry is going?


More Stories By Peter Velikin

Peter Velikin has 12 years of experience creating new markets and commercializing products in multiple high tech industries. Prior to VeloBit, he was VP Marketing at Zmags, a SaaS-based digital content platform for e-commerce and mobile devices, where he managed all aspects of marketing, product management, and business development. Prior to that, Peter was Director of Product and Market Strategy at PTC, responsible for PTC's publishing, content management, and services solutions. Prior to PTC, Peter was at EMC Corporation, where he held roles in product management, business development, and engineering program management.

Peter has an MS in Electrical Engineering from Boston University and an MBA from Harvard Business School.
