Senior Data Engineer

This role is for a Senior Data Engineer on a 12-month contract in Boston, MA. Key skills include Snowflake, ETL processes, Kubernetes, and AWS. Familiarity with CI/CD, Terraform, and containerization is required. The hybrid arrangement requires three days per week in the office.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 13, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Boston, MA
🧠 - Skills detailed
#Apache Kafka #Snowflake #Deployment #Kafka (Apache Kafka) #Consul #Data Processing #C# #Scripting #Airflow #Infrastructure as Code (IaC) #Data Engineering #Terraform #AWS (Amazon Web Services) #Python #Continuous Deployment #Microservices #Tableau #Linux #Kubernetes #Cloud #ETL (Extract, Transform, Load) #Docker #SQL Server #DataOps #SQL (Structured Query Language) #RDBMS (Relational Database Management System) #Bash #Data Pipeline
Role description

Senior Data Engineer

This role is with a Maris Financial Services Partner

Boston, MA - Hybrid Role - We are targeting local candidates who can be in the Boston office 3 days per week.

12-month+ contract (or contract-to-hire, if desired)

This team oversees critical systems including Snowflake, Tableau, and RDBMS technologies like SQL Server and Postgres. This role will focus on automating database deployments and creating efficient patterns and practices that enhance our data processing capabilities.
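
As a concrete illustration of that deployment-automation focus, here is a minimal sketch of one common pattern: versioned SQL migration files applied exactly once, tracked in a metadata table. The `migrations/` directory, the DSN, and the `schema_migrations` table are illustrative assumptions, not details from the posting; psycopg2 stands in for the Postgres side of the stack.

```python
# Minimal sketch of automated database deployment: apply versioned SQL
# migration files exactly once, recorded in a metadata table so reruns
# are idempotent. Directory, DSN, and table name are hypothetical.
import os
import psycopg2

MIGRATIONS_DIR = "migrations"  # hypothetical folder of 001_xxx.sql files

def apply_pending_migrations(dsn: str) -> None:
    conn = psycopg2.connect(dsn)
    try:
        with conn.cursor() as cur:
            # Track applied migrations so the runner can be re-executed safely.
            cur.execute(
                "CREATE TABLE IF NOT EXISTS schema_migrations ("
                "filename TEXT PRIMARY KEY, applied_at TIMESTAMP DEFAULT now())"
            )
            cur.execute("SELECT filename FROM schema_migrations")
            applied = {row[0] for row in cur.fetchall()}
            for fname in sorted(os.listdir(MIGRATIONS_DIR)):
                if not fname.endswith(".sql") or fname in applied:
                    continue
                with open(os.path.join(MIGRATIONS_DIR, fname)) as f:
                    cur.execute(f.read())
                cur.execute(
                    "INSERT INTO schema_migrations (filename) VALUES (%s)",
                    (fname,),
                )
        conn.commit()  # all-or-nothing: a failed file rolls back the batch
    except Exception:
        conn.rollback()
        raise
    finally:
        conn.close()
```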

Key Responsibilities:
• Design, enhance, and manage DataOps tools and services to support cloud initiatives.
• Develop and maintain scheduled workflows using Airflow (a minimal DAG sketch follows this list).
• Create containerized applications for deployment with ECS, Fargate, and EKS.
• Build data pipelines to extract, transform, and load (ETL) data from various sources into Apache Kafka, ultimately feeding into Snowflake.
• Provide consultation for infrastructure projects to ensure alignment with technical architecture and end-user needs.
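
To make the Airflow and Kafka responsibilities above concrete, here is a minimal sketch of a daily-scheduled DAG (Airflow 2.x with the kafka-python client) whose single task publishes extracted records to a Kafka topic for downstream loading into Snowflake. The DAG id, broker address, topic name, and sample records are hypothetical placeholders, not details from the posting.

```python
# Minimal scheduled-workflow sketch: an Airflow 2.x DAG with one task
# that extracts records and publishes them to Kafka. All names below
# (DAG id, broker, topic, sample data) are illustrative assumptions.
import json
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from kafka import KafkaProducer  # kafka-python client

def publish_to_kafka():
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",  # hypothetical broker address
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    # Stand-in for a real extract step against a source system.
    for record in [{"id": 1, "amount": 42.0}]:
        producer.send("raw-events", record)  # hypothetical topic
    producer.flush()

with DAG(
    dag_id="etl_to_kafka",  # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="publish", python_callable=publish_to_kafka)
```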

Qualifications:
• Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices and tools.
• Understanding of application stack architectures (e.g., microservices), PaaS development, and AWS environments.
• Proficiency in scripting languages such as Bash.
• Experience with Python, Go, or C#.
• Hands-on experience with Terraform or other Infrastructure as Code (IaC) tools, such as CloudFormation.
• Preferred experience with Apache Kafka and Flink.
• Proven experience working with Kubernetes.
• Strong knowledge of Linux and Docker environments.
• Excellent communication and interpersonal skills.
• Strong analytical and problem-solving abilities.
• Ability to manage multiple tasks and projects concurrently.
• Expertise with SQL Server, Postgres, and Snowflake.
• In-depth experience with ETL/ELT processes (see the load sketch below).
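
As a rough illustration of the ETL/ELT and database expertise listed above, the sketch below pulls rows from Postgres, stages them in Snowflake, and MERGEs them into the target table. All connection parameters, table names, and columns are hypothetical placeholders.

```python
# Minimal ELT sketch under the stack named above: extract from Postgres,
# stage into Snowflake, MERGE into the target so reruns upsert rather
# than duplicate. Every identifier and credential below is hypothetical.
import psycopg2
import snowflake.connector

def sync_customers():
    src = psycopg2.connect("dbname=app user=etl")  # hypothetical DSN
    sf = snowflake.connector.connect(
        account="myorg-myaccount", user="ETL", password="...",  # placeholders
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    try:
        with src.cursor() as cur:
            cur.execute("SELECT id, email, updated_at FROM customers")
            rows = cur.fetchall()
        sf_cur = sf.cursor()
        sf_cur.execute("CREATE TEMPORARY TABLE customers_stage LIKE customers")
        sf_cur.executemany(
            "INSERT INTO customers_stage (id, email, updated_at) "
            "VALUES (%s, %s, %s)",
            rows,
        )
        # MERGE keeps the load idempotent: update existing ids, insert new ones.
        sf_cur.execute("""
            MERGE INTO customers t USING customers_stage s ON t.id = s.id
            WHEN MATCHED THEN UPDATE SET
                t.email = s.email, t.updated_at = s.updated_at
            WHEN NOT MATCHED THEN INSERT (id, email, updated_at)
                VALUES (s.id, s.email, s.updated_at)
        """)
    finally:
        src.close()
        sf.close()
```

The MERGE step is what makes the load safe to re-run: a failed or repeated job converges on the same final state instead of inserting duplicate rows.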