Adoption of R by large Enterprise Software Vendors

By Uday Tennety, Director, Advanced Analytics Services at Revolution Analytics

The R ecosystem has become widely popular lately, with large players such as Pivotal, Tibco, Oracle, IBM, Teradata and SAP integrating R into their product suites. All of these big players are using value-chain integration and platform-envelopment strategies to build a network effect and gain maximum leverage against their competitors in the Big Data and Analytics space.

The Big Data movement has gained a lot of traction in the enterprise space, and the ecosystem is rapidly evolving. The end goal for most enterprises is not to collect, store and manage data, but to obtain new business insights through predictive modeling and analytics. With this objective in mind, many enterprise software vendors have embraced R for their analytics story. Below is my analysis of the various distributions of R provided by large enterprise software vendors, along with their integration strategies.

Oracle R Enterprise

Oracle R Distribution is Oracle's free distribution of open source R. Oracle R Enterprise integrates Oracle R Distribution/R, the open source scripting language and environment, with Oracle Database. Oracle R Enterprise introduces variants of many R data types, overloading them in order to integrate Oracle Database with R; the names of the Oracle R Enterprise data types are the same as the names of the corresponding R data types, prefixed with "ore". Oracle's strategy with Oracle R Enterprise is to provide in-database analytics capabilities for its widely adopted enterprise RDBMS and for its Exadata appliance.

Tibco's TERR

Tibco, with its acquisition of the S+ technology from Insightful in 2008, built its own distribution of R called TERR (Tibco Enterprise Runtime for R). TERR has been built from the ground up in C++, and the team has redesigned the data object representation by implementing those objects with abstract C++ classes.
Also, TERR claims to provide better performance and memory management than open source R. According to company sources, TERR is compatible with open source R and runs analytics by loading data into memory; however, it does not yet support data from disk, streaming or database sources, and the company plans to support these in the future. Tibco has recently integrated TERR with its data visualization tool, Spotfire, to make it easy for enterprises that choose Spotfire to run an R-based analytics tool.

PivotalR

Pivotal was officially launched on April 1, 2013, when EMC decided to group together a set of EMC, VMware and Pivotal Labs products to offer a differentiated enterprise-grade big data platform. Pivotal's strategy with R is very similar to Oracle's strategy with Oracle R Enterprise. PivotalR is a package that enables R users to interact with the Pivotal (Greenplum) Database, as well as Pivotal HD and HAWQ, for Big Data analytics. It does so by providing an interface to operations on tables and views in the database. PivotalR also claims to provide parallel and distributed computation on Pivotal for big data analytics, and it provides a wrapper for MADlib, an open-source library for parallel and scalable in-database analytics.

SAP

SAP has integrated R with its in-memory database, HANA, to allow the use of R for specific statistical functions. However, SAP neither ships the R environment with the SAP HANA database nor provides support for R. To use the SAP HANA integration with R, one needs to download R from CRAN and configure it; an Rserve configuration is also needed for the integration to work. SAP's strategy for integrating HANA with R is to provide a well-known and robust environment for advanced data analysis, while providing a support mechanism in HANA for specific statistical functions.
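The Rserve piece of SAP's integration can be sketched as follows. This is a minimal illustration, not SAP's official configuration: it assumes R and the Rserve package have been installed from CRAN on the host that HANA will call out to, and the port and connection settings shown must match whatever the HANA side is configured to use.

```r
# Minimal sketch: stand up an Rserve daemon that SAP HANA's R
# integration can connect to (HANA does not ship R; both R and
# Rserve come from CRAN and are configured separately).
install.packages("Rserve")   # one-time install from CRAN
library(Rserve)

# Start the daemon; by default it listens on TCP port 6311.
# The host/port must match the Rserve address configured on the
# HANA side for its R calculation engine.
Rserve(args = "--vanilla")
```

Once the daemon is running, HANA dispatches the bodies of RLANG procedures to it over the network and receives the results back as tables.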
Teradata

Teradata has partnered with Revolution Analytics to provide a platform that brings parallelized analytical algorithms to the data. Revolution R Enterprise 7 for Teradata includes a library of Parallel External Memory Algorithms (PEMAs) that run directly, in parallel, on the Teradata nodes. This strategy provides a scalable solution for running analytical algorithms in-database, in parallel, bringing analytics to the data in the truest sense.

IBM

Similar to Teradata, IBM has also partnered with Revolution Analytics to provide advanced data analysis capabilities for its PureData System for Analytics platform (formerly Netezza). Revolution R Enterprise for PureData System for Analytics enables the execution of advanced R computations for rapid analysis of petabyte-class data volumes.

Today, businesses are scrambling to build IT infrastructure to extract value from all the data available to them, afraid that their competitors might get there first and gain a competitive advantage. In short, enterprises are now in a Big Analytics arms race. With strong partners, a powerful community and the promise of an easy-to-integrate solution, R is in a great position to capitalize on the Big Data and Analytics revolution.
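To make the in-database pattern shared by several of these vendors concrete, here is a sketch of a PivotalR session. The connection details and table name are hypothetical, and it assumes access to a Greenplum/Pivotal database with MADlib installed; Oracle R Enterprise sessions follow an analogous shape with "ore"-prefixed types.

```r
library(PivotalR)

# Hypothetical connection details; db.connect returns a connection ID.
cid <- db.connect(host = "gpdb.example.com", dbname = "analytics",
                  user = "ruser", password = "secret")

# A db.data.frame is a lightweight R proxy for a database table:
# operations on it are translated to SQL and executed in the database,
# so the table itself never has to be pulled into R's memory.
sales <- db.data.frame("public.sales", conn.id = cid)

# madlib.lm fits a linear model in-database via the MADlib library;
# only the fitted model (coefficients, statistics) returns to R.
fit <- madlib.lm(revenue ~ units + region, data = sales)
summary(fit)

db.disconnect(cid)
```

The key design point is that the R objects are proxies: computation is pushed to where the data lives, and only small result objects cross the wire back to the R session.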


