
This NYU HPC (High Performance Computing) tutorial will provide a brief overview of what Hadoop is and the various components of the Hadoop ecosystem. There will be a hands-on demonstration of how to use the Dumbo (Hadoop) cluster to run basic MapReduce jobs, along with hands-on exercises to help participants build a better understanding.
For more information and to register: Big Data Tutorial 1: MapReduce
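
As a taste of the kind of basic MapReduce job the tutorial covers, here is a minimal word-count sketch written for Hadoop Streaming. The script name and the submission details are illustrative assumptions, not the tutorial's own materials; the exact Streaming jar path, module setup, and HDFS paths on the Dumbo cluster are covered in the session and in the NYU HPC documentation.

```python
#!/usr/bin/env python
# wordcount.py -- a minimal Hadoop Streaming word-count sketch (illustrative).
# Run as the mapper with:  python wordcount.py map
# Run as the reducer with: python wordcount.py reduce
# Hadoop Streaming pipes input lines into the mapper on stdin, sorts the
# tab-separated key/value output by key, then pipes it into the reducer.
import sys

def mapper():
    # Emit "word<TAB>1" for every word on every input line.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Input arrives sorted by key, so all counts for a word are contiguous.
    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            count += int(value)
        else:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

On a Hadoop cluster such a script is typically submitted with the Hadoop Streaming jar, passing it as both the -mapper and -reducer commands along with HDFS input and output paths; the jar location and any required environment modules vary by cluster, so consult the HPC documentation or the tutorial itself for the Dumbo-specific invocation.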