Fourth Annual Hadoop Summit: The Countdown Begins!

On June 29, Yahoo! will host the 4th annual Hadoop Summit at the Santa Clara Convention Center. Hadoop Summit 2011 brings together some of the most influential thought leaders in the space, from Yahoo!, Facebook, IBM, NetApp, and others.

Jay Rossiter, Senior Vice President of the Yahoo! Cloud Platform Group, will open the show with a keynote on how Yahoo! is developing the next generation of Hadoop applications to handle big data, the important role that Hadoop plays in Yahoo!’s integrated technology ecosystem, and how wide industry adoption of Hadoop is benefiting the entire community.

Also on the main stage, Facebook will discuss its use of Hadoop to power the Facebook Messages infrastructure, and IBM will discuss how it used Hadoop to power its Watson supercomputer.

Additional conference highlights include these key sessions:

* Next Generation Apache Hadoop MapReduce: Arun Murthy, Yahoo!’s lead architect on the Hadoop MapReduce development team, will lead a discussion on the next generation of Apache Hadoop MapReduce, which factors the framework into a generic resource scheduler and a per-job, user-defined component that manages the application execution.
* Introducing HCatalog (Hadoop Table Manager): Alan Gates, Yahoo! architect for Pig and Howl, will provide an overview of HCatalog as well as its release plans and roadmap.
* Automated Rolling OS Upgrades for Yahoo! Hadoop Grids: Dan Romike, Yahoo! Hadoop Data and Grid Systems engineer, will detail how we are upgrading thousands of servers, the problems of system state management, and the operational workflows specific to a Hadoop grid environment.
* Case Studies of Hadoop Operations at Yahoo!: The Grid Operations team at Yahoo! operates about 40,000 servers running Hadoop in clusters of up to 4,200 servers. Charles Wimmer, Yahoo! senior service engineer of grid computing, will provide a deep dive into a series of case studies drawn from operating Hadoop at this scale.
* Large Scale Math with Hadoop MapReduce: Nerd out and learn how Yahoo! established a new world record by computing the two-quadrillionth bits of pi using Hadoop in July 2010. Widely covered in the news, the world-record computation was composed of 35,000 MapReduce jobs, requiring 23 days of real time and 503 years of CPU time on Yahoo! clusters. In this session, led by Yahoo! Hadoop engineer Tsz-Wo Sze, attendees will learn MapReduce algorithms for large-scale mathematical calculation, their implementation, and our experience in running and tuning these computations in Hadoop clusters.

Check out the official conference agenda for a full preview of what’s to come and a look at the 32 different sessions, including best practice deep-dives and case studies on the Hadoop roadmap, operations and management, innovative Hadoop applications and research, and much more.

Space is limited, so don't miss this unique opportunity to hear directly from Hadoop thought leaders and pioneers by registering now.

And finally, a special thanks to the Hadoop Summit 2011 sponsors:

Platinum Sponsors:
* MapR Technologies
* NetApp

Gold Sponsors:
* Aster Data
* Cloudera
* DataStax
* Datameer

Silver Sponsors:
* Amazon Web Services
* Arista
* Impetus
* Pentaho
* Syncsort

* Dell
* Hadapt
* HStreaming
* Jive
* Karmasphere
* Mellanox
* Pervasive DataRush
* Quest Software
* Softlayer
* StackIQ
* ThinkBig Analytics

Stay up to date on Hadoop Summit buzz by following #hadoopsummit on Twitter.

If you are a Hadooper or would like to become one, come join the community for this one-day event.