

Associate Data Architect
Role: Associate Data Architect
Location: Atlanta, GA - Hybrid - 3 days onsite
Contract
Must-Have:
• Spark, Scala, Python, SQL
• Hands-on experience migrating on-prem Hadoop to cloud platforms (AWS, S3, Snowflake)
• Experience analyzing data on big-data platforms
Description
• Our client is looking for a Big Data Cloud Architect to work with one of the leading healthcare providers in the US.
• The ideal candidate should have a strong background in the healthcare business.
Responsibilities
• Provide architecture and hands-on support for the activities below.
• Lead data engineers and analysts to deliver data sets and analysis results per business requirements.
• Assemble large, complex data sets that meet functional and non-functional business requirements.
• Automate manual processes, optimize data delivery, and recommend improvements for greater platform scalability.
• Collaborate with initiative leads to optimize and enhance new capabilities.
• Mentor the team in migrating on-prem Hadoop to AWS and Snowflake.
• Create and maintain optimal data pipeline architecture.
• Present analysis results and recommendations using PowerPoint.
Requirements
Mandatory skills:
• Hands-on experience migrating on-prem Hadoop to cloud platforms (AWS, S3, Snowflake)
• Experience analyzing data on big-data platforms