• Classroom
• Online, Instructor-Led
Course Description

Apache Hadoop is an open-source platform designed to query and analyze big data distributed across large clusters of servers with a very high degree of fault tolerance. In this course, students will learn to write applications on Hadoop. Topics include carrying a Hadoop project from creation through completion, writing MapReduce programs, working with the Hadoop APIs, and loading and processing data in HDFS through both the CLI and the API. Other topics include the MapReduce mapper, reducer, combiner, and partitioner, as well as Hadoop Streaming. Workflow implementations will also be introduced.
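For orientation only, the following minimal word-count job illustrates the mapper, reducer, and combiner roles named above. It is a sketch based on the standard Hadoop MapReduce Java API, not material taken from the course; the class name and input/output paths are assumptions chosen for the example.

// Illustrative sketch (not course material): a minimal Hadoop MapReduce word-count job.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in its input split.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer (also reusable as a combiner): sums the counts emitted for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // combiner performs map-side aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input path (assumed argument)
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output path (assumed argument)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A job like this would typically be packaged as a JAR and submitted with the hadoop jar command against input and output paths in HDFS; the course covers the equivalent loading and processing of data through both the CLI and the API.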

Learning Objectives

Learn to write applications on Hadoop, including writing MapReduce programs, working with the Hadoop APIs, and loading and processing data in HDFS through both the CLI and the API.

Framework Connections

The materials within this course focus on the Knowledge, Skills, and Abilities (KSAs) identified within the Specialty Areas listed below, as defined in the National Cybersecurity Workforce Framework.