
Senior Data & Analytics Engineer

This role is for a "Senior Data & Analytics Engineer" on a 6-month contract-to-hire, paying $65+ per hour. The position is remote but requires working EST hours. Key skills include Snowflake, SQL, Python, and ETL processes, and 5+ years of data engineering experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
520
🗓️ - Date discovered
February 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Automation #Snowflake #Data Warehouse #SNS (Simple Notification Service) #Data Engineering #Scala #Azure #Lambda (AWS Lambda) #SQL (Structured Query Language) #Data Quality #Data Ingestion #Data Modeling #AWS Lambda #Data Extraction #Azure DevOps #Data Integrity #dbt (data build tool) #AWS (Amazon Web Services) #Airflow #Fivetran #DevOps #S3 (Amazon Simple Storage Service) #Data Pipeline #ETL (Extract, Transform, Load) #Python
Role description

Senior Data & Analytics Engineer

Location: Remote, must work EST hours

Duration: 6-month contract-to-hire

Pay: $65+ per hour

JOB DESCRIPTION

Our global Fortune 500 client, with U.S. headquarters in Charlotte, NC, is a world-class food service provider with a strong presence across the nation. Celebrating almost 30 years in North America, this employee-focused company has received honors for diversity and inclusion, innovation, health and wellness, and company culture. CRG has successfully placed over 220 employees with this organization within the last 7 years; it is known for its continuous growth opportunities, fantastic benefits package, innovative technology, flexible work environment, and collaborative culture.

We are looking for a hands-on Senior Data Engineer/Analytics Engineer with expertise in developing data pipelines and transforming data for downstream consumption. This role is crucial in designing, building, and maintaining our data infrastructure, focusing on creating scalable pipelines, ensuring data integrity, and optimizing performance. Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using AWS Airflow is essential, and experience with Fivetran and dbt or similar tools is also a must-have.
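
To give candidates a concrete sense of the "data extraction from APIs using Python and AWS Lambda" skill named above, here is a minimal illustrative sketch. The handler shape follows the standard Lambda `(event, context)` convention; the payload fields and the flattening logic are hypothetical assumptions, not the client's actual code (a real pipeline would land the rows in S3 for Snowflake ingestion rather than return them):

```python
import json


def transform_records(raw_records):
    """Flatten raw API payloads into tabular rows for warehouse ingestion.
    Field names ("id", "amount", "source") are illustrative assumptions."""
    rows = []
    for rec in raw_records:
        rows.append({
            "id": rec["id"],
            "amount": float(rec.get("amount", 0)),  # normalize to numeric
            "source": rec.get("source", "api"),     # default provenance tag
        })
    return rows


def handler(event, context):
    """AWS Lambda entry point: parse the JSON payload carried in the event,
    transform it, and return the rows with a status code."""
    raw = json.loads(event["body"])
    return {"statusCode": 200, "rows": transform_records(raw)}
```

In practice the transformed rows would be written to an S3 stage and loaded into Snowflake (e.g. via Snowpipe or a COPY task orchestrated by Airflow), but that wiring depends on the client's environment.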

RESPONSIBILITIES
• Design, build, test, and implement scalable data pipelines using Python and SQL.
• Maintain and optimize our Snowflake data warehouse’s performance, including data ingestion and query optimization.
• Design and implement analytical data models using SQL in dbt and Snowflake, focusing on accuracy, performance, and scalability.
• Own and maintain the semantic layer of our data modeling, defining and managing metrics, dimensions, and joins to ensure consistent and accurate reporting across the organization.
• Collaborate with internal stakeholders to understand their data needs and translate them into effective data models and metrics.
• Perform analysis and critical thinking to troubleshoot data-related issues and implement checks/scripts to enhance data quality.
• Collaborate with other data engineers and architects to develop new pipelines and/or optimize existing ones.
• Maintain code via CI/CD processes as defined in our Azure DevOps platform.
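
The "implement checks/scripts to enhance data quality" responsibility above can be sketched as a simple batch validator; the check types (required fields, duplicate keys) and field names are illustrative assumptions, not the client's actual rules:

```python
def check_data_quality(rows, required_fields, unique_key):
    """Scan a batch of row dicts and return a list of human-readable issues:
    missing/empty required fields and duplicate values of the unique key.
    An empty return list means the batch passed these checks."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")
        key = row.get(unique_key)
        if key in seen:
            issues.append(f"row {i}: duplicate {unique_key}={key}")
        seen.add(key)
    return issues
```

A check like this would typically run as a pipeline step (e.g. an Airflow task or a dbt test) that fails the load when issues are found.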

QUALIFICATIONS
• Highly self-motivated and detail-oriented with strong communication skills.
• 5+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion and data transformation.
• Expertise in Snowflake, including data ingestion and performance optimization.
• Strong experience using ETL software (Fivetran, dbt, Airflow, etc.)
• Strong SQL skills for writing efficient queries and optimizing existing ones.
• Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
• Experience with AWS services such as Lambda, Airflow, Glue, S3, SNS, etc.

Category Code: JN008