HDP Operations: Install and Manage with Apache Ambari



This course is designed for administrators who will be managing the Hortonworks Data Platform (HDP) 2.2. It covers installation, configuration, maintenance, security and performance topics.


Target Audience

IT administrators and operators responsible for installing, configuring, and supporting an HDP 2.2 deployment in a Linux environment.


Prerequisites

Attendees should be familiar with Hadoop and Linux environments.


Course Objectives

  • Describe various tools and frameworks in the Hadoop 2.x ecosystem
  • Understand support for various types of cluster deployments
  • Understand storage, network, processing, and memory needs for a Hadoop cluster
  • Understand provisioning and post deployment requirements
  • Describe Ambari Stacks, Views, and Blueprints
  • Install and configure an HDP 2.2 cluster using Ambari
  • Understand the Hadoop Distributed File System (HDFS)
  • Describe how files are written to and stored in HDFS
  • Explain Heterogeneous Storage support for HDFS
  • Use HDFS commands
  • Perform a file system check using the command line
  • Mount HDFS to a local file system using the NFS Gateway
  • Understand and configure YARN on a cluster
  • Configure and troubleshoot MapReduce jobs
  • Understand how to use the Capacity Scheduler
  • Use cgroups and node labels
  • Understand how Slider, Kafka, Storm and Spark run on YARN
  • Use WebHDFS to access HDFS over HTTP
  • Understand how to optimize and configure Hive
  • Use Sqoop to transfer data between Hadoop and a relational database
  • Use Flume to ingest streaming data into HDFS
  • Understand how to use Oozie and Falcon
  • Commission and decommission worker nodes
  • Configure a cluster to be rack-aware
  • Understand NameNode HA and ResourceManager HA
  • Secure a Hadoop cluster
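As a taste of the Capacity Scheduler topic above, queues are defined in capacity-scheduler.xml. The fragment below is an illustrative sketch only: the queue names and percentages are assumptions for the example, not values from the course.

```xml
<!-- Illustrative capacity-scheduler.xml fragment: two queues under root.
     Queue names and capacities are example assumptions. -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>default,analytics</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.default.capacity</name>
  <value>70</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.analytics.capacity</name>
  <value>30</value>
</property>
<property>
  <!-- Let the analytics queue borrow idle capacity, up to 50% of the cluster -->
  <name>yarn.scheduler.capacity.root.analytics.maximum-capacity</name>
  <value>50</value>
</property>
```

Queue capacities under a parent must sum to 100; `maximum-capacity` caps how far a queue can grow into idle cluster resources.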

Hands-On Labs

  • Install an HDP 2.2 cluster using Ambari
  • Add new hosts to the cluster
  • Manage HDP services
  • Use HDFS commands
  • Verify data with the Block Scanner and fsck
  • Troubleshoot a MapReduce job
  • Configure the Capacity Scheduler
  • Use WebHDFS
  • Use Sqoop
  • Install and test Flume
  • Mount HDFS to a local file system
  • Use distcp to copy data from a remote cluster
  • Mirror datasets using Falcon
  • Commission and decommission services
  • Use HDFS snapshots
  • Configure rack awareness
  • Configure NameNode HA using Ambari
  • Set up the Knox Gateway
  • Secure an HDP cluster
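The WebHDFS lab above exercises HDFS's REST interface over HTTP. As a rough sketch of what that involves, the snippet below builds a WebHDFS read URL; the hostname and HDFS path are placeholder assumptions, and 50070 is the default NameNode HTTP port in HDP 2.x.

```shell
# Build a WebHDFS read URL (host and path below are example assumptions)
NAMENODE="namenode.example.com"        # assumed NameNode hostname
PORT=50070                             # default NameNode HTTP/WebHDFS port in HDP 2.x
HDFS_PATH="/user/student/data.csv"     # example HDFS path

URL="http://${NAMENODE}:${PORT}/webhdfs/v1${HDFS_PATH}?op=OPEN"
echo "$URL"

# On a live cluster you would fetch the file with:
#   curl -i -L "$URL"
# and list a directory with:
#   curl -i "http://${NAMENODE}:${PORT}/webhdfs/v1/user/student?op=LISTSTATUS"
```

Note the `-L` on the read request: the NameNode answers `OPEN` with a redirect to the DataNode that actually serves the block data.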

Training provider

Teaching mode: Classroom - Instructor Led
Duration: 4 days
Gooroo has partnered with the global leaders in IT training to give you access to quality training, personalised to you and targeted at increasing your job opportunities and salary.

Our pricing

We do not display pricing as Gooroo members qualify for special discounts not available elsewhere. You must enquire through Gooroo to get this benefit.

New courses are happening all the time

Our partner's expert training consultant will provide you with the times and all the details you need. Enquire today.

Top skills covered in this course

  • Apache Hadoop
  • Apache Spark
  • Apache Hive