The Cloudera Hadoop Administrator is a certified individual responsible for administering the full Hadoop stack, including application integration, performance management, security implementation, configuration management, and problem management across an array of services and functions at the platform and host level. The ideal candidate should be Cloudera certified and possess a minimum of 5 years of practical experience on enterprise platforms.
1. Experience with multiple large-scale enterprise Hadoop environment builds and operations, including design, capacity planning, cluster setup, security, performance tuning, and monitoring.
2. Experience with the full Cloudera CDH distribution: installing, configuring, and monitoring all services in the CDH stack.
3. Strong understanding of core Cloudera Hadoop services such as HDFS, MapReduce, Kafka, Spark and Spark Streaming, Hive, Impala, HBase, Kudu, Sqoop, and Oozie.
4. Experience in administering and supporting RHEL Linux operating systems, databases, and hardware in an enterprise environment.
5. Expertise in typical system administration and programming skills such as storage capacity management, debugging, and performance tuning.
6. Proficient in shell scripting (e.g., Bash, ksh).
7. Experience in the setup, configuration, and management of security for Hadoop clusters using Kerberos, integrated with LDAP/AD at an enterprise level.
8. Experience scaling enterprise data ingestion into the Hadoop ecosystem.
1. Expertise in writing Python scripts and debugging existing scripts.
2. Experience with enterprise database administration platforms.
3. Experience with large-scale analytics tools, including SAS, search, machine learning, and log aggregation.
4. Experience with Hadoop distributions in the cloud (AWS, Azure, Google Cloud) is a plus.
5. Experience with Apache NiFi is a plus.
Shift: 1st shift (United States of America)
Hours Per Week: 40