
Associate Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Associate Data Architect in Atlanta, GA, on a hybrid contract. Key skills include Spark, Scala, Python, SQL, and experience migrating Hadoop to AWS and Snowflake. Healthcare industry experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 2, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Atlanta, GA
🧠 - Skills detailed
#Python #AWS (Amazon Web Services) #Scala #Data Engineering #Data Pipeline #SQL (Structured Query Language) #AWS S3 (Amazon Simple Storage Service) #Cloud #Data Architecture #Big Data #Snowflake #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #Hadoop
Role description

Role: Associate Data Architect

Location: Atlanta, GA - Hybrid - 3 days onsite

Contract

Must-Have:

   • Spark, Scala, Python, SQL

   • Hands-on experience migrating on-prem Hadoop to cloud platforms (AWS, S3, Snowflake)

   • Experience analyzing data on Big Data platforms


Description

   • The client is looking for a Big Data Cloud Architect to work with one of the leading healthcare providers in the US.

   • The ideal candidate should have a strong background in the healthcare business.

Responsibilities

   • Provide architecture and hands-on support for the activities below.

   • Lead data engineers and analysts to deliver data sets and analysis results per business requirements.

   • Assemble large, complex data sets that meet functional and non-functional business requirements.

   • Automate manual processes, optimize data delivery, and recommend platform improvements for greater scalability.

   • Collaborate with initiative leads to optimize and enhance new capabilities.

   • Mentor the team in migrating on-prem Hadoop to AWS and Snowflake.

   • Create and maintain optimal data pipeline architecture.

   • Present analysis results and recommendations using PowerPoint.

Requirements

Mandatory skills:

   • Hands-on experience migrating on-prem Hadoop to cloud platforms (AWS, S3, Snowflake)

   • Experience analyzing data on Big Data platforms