
Hadoop Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Hadoop Developer in Charlotte, NC, on a W2 contract. Requires 5+ years in Software Engineering, strong Hadoop and Spark experience, Python, and Unix skills. Local candidates preferred; containerization and S3 experience are a plus.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Compliance #Spark SQL #Cloudera #SQL (Structured Query Language) #Unix #Python #Hadoop #Scripting #Consulting #Big Data #Consul #Kubernetes #Shell Scripting #Spark (Apache Spark) #Cloud
Role description

Greetings from 10Decoders!

This is Ruban Alwin, Recruitment Lead with 10Decoders. We are currently looking for experienced Hadoop / Big Data professionals to join one of our clients.

Job Information:
Position: Hadoop Developer
Location: Charlotte, NC (local candidates preferred)
Duration: Contract (W2)

Job Description:
In this contingent resource assignment, you may:
· Consult on complex initiatives with broad impact and large-scale planning for Software Engineering.
· Review and analyze complex, multi-faceted, larger-scale, or longer-term Software Engineering challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented factors.
· Contribute to the resolution of complex, multi-faceted situations requiring a solid understanding of the function, policies, procedures, and compliance requirements that meet deliverables.
· Strategically collaborate and consult with client personnel.

Required Qualifications:
· 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, education
· Strong relational database knowledge
· Understanding of relational database vs. big data concepts
· Python experience
· Spark & Spark SQL experience
· Unix & shell scripting experience
· Hadoop experience - MapR (Hortonworks/Cloudera experience in addition preferred)
· Containerization tools nice to have (OpenShift/Kubernetes)
· S3 experience nice to have