Cloudera Hadoop Admin

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloudera Hadoop Admin on a contract basis for 6 months, paying $70-73/hour, located in Jersey City, NJ/Charlotte, NC/Plano, TX (Hybrid). Requires 5+ years in financial services, Cloudera certification, and expertise in Hadoop, Linux, and automation scripting.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$584
🗓️ - Date discovered
April 16, 2025
🕒 - Project duration
6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Contract
🔒 - Security clearance
Yes
📍 - Location detailed
Jersey City, NJ / Charlotte, NC / Plano, TX
🧠 - Skills detailed
#Capacity Management #Cloud #NiFi (Apache NiFi) #Cloudera #HTTP & HTTPS (Hypertext Transfer Protocol & Hypertext Transfer Protocol Secure) #Big Data #Kerberos #Monitoring #Hadoop #Spark (Apache Spark) #Kafka (Apache Kafka) #ML (Machine Learning) #AWS (Amazon Web Services) #Debugging #Shell Scripting #Databases #Linux #Scripting #Bash #Security #Sqoop (Apache Sqoop) #Azure #Automation #Impala #HDFS (Hadoop Distributed File System) #Database Administration #HBase #LDAP (Lightweight Directory Access Protocol) #SAS #Storage #Apache NiFi #Programming #Python
Role description

Akkodis is seeking a Cloudera Hadoop Admin for a contract position with a client in Jersey City, NJ / Charlotte, NC / Plano, TX (hybrid, 3 days on-site per week). We are ideally looking for applicants with a solid background in the financial services industry.

Pay: $70-$73/hour, all inclusive; the rate may be negotiable based on experience, education, geographic location, and other factors.

Job Description:

The Cloudera Hadoop Administrator partners with engineering and operational teams to implement the tools and technology needed to enhance and support the overall Big Data platforms. Key responsibilities include administering the full Hadoop stack, including application integration, performance management, security implementation, configuration management, and problem management across an array of services and functions at the platform and host level. The ideal candidate is Cloudera certified and has a minimum of 5 years of practical experience on enterprise platforms. Job expectations also include using coding skills to automate tasks that address gaps in efficiency and monitoring.
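
As a rough illustration of that last expectation, the sketch below polls `hdfs dfsadmin -report` (a standard Hadoop admin command) and flags capacity or dead-datanode problems. The threshold, output parsing, and alert handling are illustrative assumptions, not details from the posting.

```python
#!/usr/bin/env python3
"""Minimal sketch of a monitoring-automation script, assuming a host
with the Hadoop CLI on PATH and (on secured clusters) a valid Kerberos
ticket. Threshold and parsing details are hypothetical."""
import re
import subprocess
import sys

USED_PCT_THRESHOLD = 80.0  # hypothetical alerting threshold

def hdfs_report() -> str:
    # `hdfs dfsadmin -report` prints cluster capacity and datanode status.
    return subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout

def main() -> int:
    report = hdfs_report()
    used = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    dead = re.search(r"Dead datanodes\s*\((\d+)\)", report)
    problems = []
    if used and float(used.group(1)) > USED_PCT_THRESHOLD:
        problems.append(f"HDFS usage at {used.group(1)}%")
    if dead and int(dead.group(1)) > 0:
        problems.append(f"{dead.group(1)} dead datanode(s)")
    if problems:
        # A real deployment would page or post to a monitoring system here.
        print("ALERT: " + "; ".join(problems), file=sys.stderr)
        return 1
    print("HDFS healthy")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```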

Responsibilities:

   • Participates regularly in an on-call rotation with Production Support teammates

   • Develops and maintains automation and monitoring scripts and leverages them for common instrumentation and operational needs

   • Engages as a subject matter expert in major incident triage efforts and problem resolution with our Production Support teammates

   • Collaborates with our Security Team to identify, assess, and implement various security items from GIS and/or vendor CVEs

   • Identifies root causes during major incident/problem management investigations and partners with teams to implement code and/or configuration changes

   • Collaborates with Development and Infrastructure teams to understand technical solutions and implement monitoring capabilities outlined in the application and system monitoring designs put forward by the Technology Lead

Qualifications:

   • Experience with multiple large-scale enterprise Hadoop environment builds and operations, including design, capacity planning, cluster setup, security, performance tuning, and monitoring

   • Experience with the full Cloudera CDP distribution, installing, configuring, and monitoring all services in the CDP stack (a monitoring sketch follows this list)

   • Strong understanding of core Cloudera Hadoop services such as HDFS, Tez, Kafka, Spark and Spark Streaming, Hive, Impala, HBase, Sqoop, and Oozie

   • Experience administering and supporting RHEL Linux operating systems, databases, and hardware in an enterprise environment

   • Expertise in typical system administration and programming tasks such as storage capacity management, debugging, and performance tuning

   • Proficient in shell scripting (e.g., Bash, ksh)

   • Experience setting up, configuring, and managing Hadoop cluster security using Kerberos with LDAP/AD integration at an enterprise level

   • Experience scaling enterprise data into the Hadoop ecosystem

   • Expertise in writing Python scripts and debugging existing ones

   • Enterprise database administration experience

   • Experience with large-scale analytics tooling, including SAS, search, machine learning, and log aggregation

   • Experience with Hadoop distributions in the cloud (AWS, Azure, Google) is a plus

   • Experience with Apache NiFi is a plus
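
As the monitoring sketch referenced in the list above, here is one way an admin might automate a per-service health check against the Cloudera Manager REST API. The host, credentials, cluster name, and API version are placeholder assumptions; verify the endpoint shape against your CM version's API documentation.

```python
#!/usr/bin/env python3
"""Illustrative sketch, not a posting requirement: query the Cloudera
Manager REST API for per-service health. All connection details below
are hypothetical placeholders."""
import sys
import requests

CM_BASE = "https://cm.example.com:7183/api/v41"  # hypothetical CM host and API version
CLUSTER = "Cluster1"                             # hypothetical cluster name
AUTH = ("admin", "changeme")                     # placeholder credentials

def service_health() -> dict:
    resp = requests.get(
        f"{CM_BASE}/clusters/{CLUSTER}/services",
        auth=AUTH, timeout=30, verify=True,
    )
    resp.raise_for_status()
    # Each service item carries a healthSummary such as GOOD, CONCERNING, or BAD.
    return {svc["name"]: svc.get("healthSummary", "UNKNOWN")
            for svc in resp.json().get("items", [])}

if __name__ == "__main__":
    health = service_health()
    bad = {name: status for name, status in health.items() if status != "GOOD"}
    for name, status in sorted(health.items()):
        print(f"{name}: {status}")
    sys.exit(1 if bad else 0)
```

For an unattended check like this, a read-only CM account and TLS verification are sensible defaults.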

If you are interested in this role, please click APPLY NOW. For other opportunities available at Akkodis, or any questions, please contact Aditya Singh at aditya.singh@akkodisgroup.com

Equal Opportunity Employer/Veterans/Disabled

Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, an EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client.

To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit https://www.akkodis.com/en/privacy-policy.

The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:

   • The California Fair Chance Act

   • Los Angeles City Fair Chance Ordinance

   • Los Angeles County Fair Chance Ordinance for Employers

   • San Francisco Fair Chance Ordinance