
AWS with Advanced Databricks

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS with Advanced Databricks Engineer in Atlanta, Georgia, or Hartford, Connecticut. Contract length is unspecified, with a pay rate of "TBD." Requires 5+ years in cloud data engineering, advanced Databricks expertise, and relevant certifications preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 2, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Atlanta, GA
🧠 - Skills detailed
#AWS (Amazon Web Services) #IAM (Identity and Access Management) #Data Lake #Compliance #Agile #Databricks #Python #Data Pipeline #Big Data #EC2 #Monitoring #Terraform #Data Governance #SQL (Structured Query Language) #ML (Machine Learning) #Cloud #Data Security #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #Kubernetes #Data Integrity #Data Science #Data Warehouse #Scala #Data Engineering #Infrastructure as Code (IaC) #Lambda (AWS Lambda) #Security #ETL (Extract, Transform, Load) #Redshift #Delta Lake #Data Processing
Role description

ONLY FOR W2

NO C2C

Job Title: AWS with Advanced Databricks Engineer (12+ years of overall experience)

Job Location: Atlanta, Georgia, or Hartford, Connecticut (the role is open in both locations)

Job Summary:

We are seeking a skilled AWS Engineer with advanced expertise in Databricks to join our team. The ideal candidate will have hands-on experience in designing, implementing, and optimizing cloud-based data solutions on AWS using Databricks. You will be responsible for building scalable data pipelines, ensuring data integrity, and optimizing performance for large-scale data processing and analytics.

Key Responsibilities:

   • Design, develop, and deploy scalable data pipelines on AWS using Databricks.

   • Implement best practices for data engineering, ETL processes, and data governance.

   • Optimize Databricks workflows for performance, cost-efficiency, and scalability.

   • Integrate Databricks with various AWS services such as S3, Lambda, Glue, and Redshift.

   • Develop and maintain data lake and data warehouse solutions.

   • Collaborate with data scientists, analysts, and business teams to enable data-driven decision-making.

   • Ensure security and compliance best practices in cloud data solutions.

   • Troubleshoot and resolve issues related to data pipelines and Databricks infrastructure.

Required Skills & Experience:

   • 5+ years of experience in cloud-based data engineering.

   • Expertise in AWS services, including S3, EC2, Lambda, Glue, Redshift, IAM, and CloudFormation.

   • Advanced experience with Databricks, including Spark optimizations and performance tuning.

   • Proficiency in Python, SQL, and Scala for data processing and transformations.

   • Strong knowledge of data lake architectures, Delta Lake, and data warehouse solutions.

   • Experience with CI/CD pipelines and infrastructure as code (IaC) tools like Terraform.

   • Knowledge of data security, compliance, and monitoring best practices.

   • Excellent problem-solving skills and ability to work in an agile environment.

Preferred Qualifications:

   • Databricks Certification (e.g., Databricks Certified Data Engineer Professional).

   • AWS Certified Solutions Architect or AWS Certified Data Analytics certification.

   • Experience with machine learning workflows in Databricks.

   • Hands-on experience with Kubernetes and containerization on AWS.

Why Join Us?

   • Work with cutting-edge cloud technologies and big data solutions.

   • Opportunity for professional growth and certifications.

   • Collaborative and innovative work environment.

   • Competitive salary and benefits package.

If you are passionate about cloud data engineering and have a deep understanding of AWS and Databricks, we would love to hear from you! Apply now to join our dynamic team.