
Hadoop Admin

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Hadoop Admin with Cloudera in Dallas, TX. It is a contract position requiring strong Hadoop ecosystem experience, proficiency in Linux, scripting skills, and relevant certifications. Key responsibilities include cluster management and security implementation.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Linux #Impala #Computer Science #YARN (Yet Another Resource Negotiator) #Big Data #Spark (Apache Spark) #NiFi (Apache NiFi) #LDAP (Lightweight Directory Access Protocol) #Sqoop (Apache Sqoop) #Unix #Ambari #Python #Hadoop #Debugging #Security #Data Ingestion #Data Warehouse #Scala #Kudu #Programming #Apache Kafka #Bash #Databases #HBase #Automation #Kerberos #DevOps #HDFS (Hadoop Distributed File System) #Capacity Management #Data Science #Cloudera #SQL (Structured Query Language) #Apache NiFi #Data Bricks #Scripting #Shell Scripting #Kafka (Apache Kafka) #Cloud #Data Engineering #Storage
Role description

Hadoop Admin with Cloudera

Dallas, TX

Contract

Job Description:

   • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).

   • Strong experience in designing, implementing, and administering Hadoop clusters in a production environment.

   • Proficiency in Hadoop ecosystem components such as HDFS, YARN, MapReduce, Hive, Spark, and HBase.

   • Experience with cluster management tools like Apache Ambari or Cloudera Manager.

   • Solid understanding of Linux/Unix systems and networking concepts.

   • Strong scripting skills (e.g., Bash, Python) for automation and troubleshooting.

   • Knowledge of database concepts and SQL.

   • Experience with data ingestion tools like Apache Kafka or Apache NiFi.

   • Familiarity with data warehouse concepts and technologies.

   • Understanding of security principles and experience implementing security measures in Hadoop clusters.

   • Strong problem-solving and troubleshooting skills, with the ability to analyze and resolve complex issues.

   • Excellent communication and collaboration skills to work effectively with cross-functional teams.

   • Relevant certifications such as Cloudera Certified Administrator for Apache Hadoop (CCAH) or Hortonworks Certified Administrator (HCA) are a plus.

Skills:

Technical Proficiency:

Experience with Hadoop and Big Data technologies, including Cloudera CDH/CDP, Databricks, HDInsight, etc.

Strong understanding of core Hadoop services such as HDFS, MapReduce, Kafka, Spark, Hive, Impala, HBase, Kudu, Sqoop, and Oozie.

Proficiency in RHEL Linux operating systems, databases, and hardware administration.

Operations and Design:

Operations, design, capacity planning, cluster setup, security, and performance tuning in large-scale Enterprise Hadoop environments.
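
Capacity planning for HDFS usually starts from raw disk, the replication factor, and a non-DFS reserve. A minimal back-of-envelope sketch, with illustrative defaults that are not from this posting and not Cloudera guidance:

```python
def usable_hdfs_tb(datanodes, raw_tb_per_node, replication=3,
                   non_dfs_reserve=0.10, headroom=0.20):
    """Rough usable HDFS capacity in TB.

    Raw capacity, minus space reserved for the OS and non-DFS use,
    divided by the replication factor, minus operational headroom.
    All defaults here are hypothetical planning assumptions.
    """
    raw = datanodes * raw_tb_per_node          # total raw disk
    after_reserve = raw * (1 - non_dfs_reserve)  # space HDFS can use
    logical = after_reserve / replication        # logical (user-visible) data
    return logical * (1 - headroom)              # leave room for growth/rebalance

# 50 DataNodes with 48 TB raw each, 3x replication:
print(round(usable_hdfs_tb(50, 48), 1))  # → 576.0
```

The same arithmetic run in reverse (target logical TB × replication ÷ usable fraction) gives the node count a dataset requires.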

Scripting and Automation:

Proficient in shell scripting (e.g., Bash, KSH) for automation.

Security Implementation:

Experience in setting up, configuring, and managing security for Hadoop clusters using Kerberos with integration with LDAP/AD.
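
Kerberizing a cluster generally means pointing every node at the realm's KDC via krb5.conf. A minimal sketch; the realm and hostnames below are placeholders, not details from this posting:

```ini
[libdefaults]
  default_realm = EXAMPLE.COM
  ticket_lifetime = 24h
  renew_lifetime = 7d
  forwardable = true

[realms]
  EXAMPLE.COM = {
    kdc = kdc01.example.com
    admin_server = kdc01.example.com
  }

[domain_realm]
  .example.com = EXAMPLE.COM
  example.com = EXAMPLE.COM
```

With AD integration, the realm is typically the AD domain name and the KDCs are domain controllers; Cloudera Manager can distribute this file and manage service keytabs once the realm is reachable.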

Problem Solving and Troubleshooting:

Expertise in system administration and programming skills for storage capacity management, debugging, and performance tuning.

Collaboration and Communication:

Collaborate with cross-functional teams, including data engineers, data scientists, and DevOps teams.

Provide technical guidance and support to team members and stakeholders.

Skills:

On-prem instance

Hadoop config, performance, tuning

Ability to manage very large clusters and understand scalability

Interfacing with multiple teams

Many teams have self-service capabilities, so the candidate should have experience managing self-service access for multiple teams across large clusters.

Hands-on experience with, and strong understanding of, Hadoop architecture
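
Multi-tenant self-service on a shared cluster is typically carved out with per-team YARN queues. A capacity-scheduler.xml fragment as a sketch; queue names and percentages are illustrative, not from this posting:

```xml
<!-- Illustrative queue layout; names and capacities are hypothetical -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>etl,analytics,default</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.etl.capacity</name>
  <value>50</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.analytics.capacity</name>
  <value>30</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.default.capacity</name>
  <value>20</value>
</property>
<property>
  <!-- allow a busy queue to borrow idle capacity up to this ceiling -->
  <name>yarn.scheduler.capacity.root.analytics.maximum-capacity</name>
  <value>60</value>
</property>
```

Queue capacities at each level must sum to 100; `maximum-capacity` is what lets teams burst into idle capacity without starving each other.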

Experience with Hadoop ecosystem components (HDFS, YARN, MapReduce) and cluster management tools such as Ambari or Cloudera Manager, plus related technologies.

Proficiency in scripting, Linux system administration, and networking, with strong troubleshooting skills

Thanks

Lalit

Lalit@hirextra.com