The Ever-Changing Hadoop Ecosystem: What does it mean?

Aug 12th, 2015

Gary Allemann, Managing Director, Master Data Management

Big data is synonymous with Hadoop, the open-source software project that enables distributed processing of large data sets across clusters of commodity servers. Hadoop is designed to scale from a single server to thousands of machines, with a very high degree of fault tolerance.

“One of the challenges facing companies looking to adopt Hadoop is the rapid evolution of the Hadoop ecosystem,” says Gary Allemann, MD at Datameer partner Master Data Management. “This creates risk, as analytics built on today’s processing engines may not be a viable option in the future.”

To manage this risk, companies are looking to a broader ecosystem of tools that are compatible with Hadoop but add management capability and shield the user from changes to the underlying platform.

What do the ongoing changes to the Hadoop ecosystem mean to the corporate user?

Join big data specialist Datameer for a webinar on 20th August that will answer this thorny question.

Datameer CEO Stefan Groschupf and Andrew Brust, resident big data analyst and senior director of product marketing, will provide insights on:

  • The evolution of Hadoop, and subsequent changes in the Hadoop ecosystem.
  • Why the Hadoop Distributed File System may be Hadoop’s killer app.
  • Batch versus interactive; in-memory vs. on-disk.
  • Hadoop, the Apache Software Foundation and the Open Data Platform.
  • Open source projects you might not know about, but should.

Click here to obtain more information:

Date: Thursday August 20, 2015
Time: 20:00-21:00
Register here: