
Snowflake Data Analytics Lead [W2 ONLY]

This role is for a "Snowflake Data Analytics Lead" on a remote contract lasting more than 6 months, hired as a W2 contractor (rate not disclosed). Key skills required include Snowflake, SQL, Python, AWS Lambda, and ETL/ELT experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 15, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Python #S3 (Amazon Simple Storage Service) #AWS Lambda #Data Quality #dbt (data build tool) #Data Warehouse #Deployment #Snowflake #Data Ingestion #Data Engineering #Lambda (AWS Lambda) #Airflow #Data Integrity #Scala #Cloud #Consul #Azure #SNS (Simple Notification Service) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Fivetran #Automation #Data Extraction #DevOps #Azure DevOps #AWS (Amazon Web Services) #Data Pipeline
Role description

Sr Data Analytics Consultant (Snowflake, SQL) / Remote / Contract or Contract-to-Hire

About our Customer:

Our DIRECT customer, a global leader in the food services industry, is seeking an experienced Sr. Data Engineer to work remotely. This is a long-term contract position, with an option for contract-to-hire (CTH).

About the Sr. Data Engineer role:

The Senior Data Engineer is a hands-on role requiring expertise in developing data ingestion pipelines. The role is crucial in designing, building, and maintaining our data infrastructure, with a focus on creating scalable pipelines, ensuring data integrity, and optimizing performance.

Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using AWS Airflow is essential, and experience with Fivetran and DBT is a plus.
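
To make the extraction pattern concrete, here is a minimal sketch (not this employer's actual code) of an AWS Lambda handler in Python that pulls JSON from a REST API and stages it in S3 for downstream Snowflake ingestion. The endpoint URL, bucket, and prefix are placeholders, not details from this posting.

```python
import json
import datetime
import urllib.request

import boto3  # bundled with the AWS Lambda Python runtime

# Placeholder values -- substitute your real endpoint and bucket.
API_URL = "https://api.example.com/v1/orders"
BUCKET = "my-raw-data-bucket"
PREFIX = "orders"

s3 = boto3.client("s3")

def lambda_handler(event, context):
    """Pull one page of records from a REST API and stage them in S3
    as newline-delimited JSON, ready for a Snowflake COPY INTO."""
    with urllib.request.urlopen(API_URL, timeout=30) as resp:
        records = json.loads(resp.read())

    # Partition the landing path by extraction timestamp.
    now = datetime.datetime.utcnow()
    key = f"{PREFIX}/{now:%Y/%m/%d}/extract_{now:%H%M%S}.json"

    body = "\n".join(json.dumps(r) for r in records)
    s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))

    return {"staged_key": key, "record_count": len(records)}
```

A production handler would add pagination, retries, and secrets management (for example, an API key from AWS Secrets Manager), but this stage-to-S3 shape is the common pattern for Snowflake-bound pipelines.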

Responsibilities:
• Design, build, test, and implement scalable data pipelines using Python and SQL.
• Maintain and optimize our Snowflake data warehouse’s performance, including data ingestion and query optimization.
• Extract data from APIs using Python and AWS Lambda and automate workflows with AWS Airflow (see the sketch after this list).
• Perform analysis and critical thinking to troubleshoot data-related issues and implement checks/scripts to enhance data quality.
• Collaborate with other data engineers and architects to develop new pipelines and/or optimize existing ones.

• Maintain code via CI/CD processes as defined in our Azure DevOps platform.
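
As an illustration of the Airflow and Snowflake responsibilities above, here is a minimal DAG sketch assuming Airflow 2.x and the Snowflake provider package. The Lambda function name, connection ID, table, and stage are placeholders; the posting does not specify them.

```python
import json
import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

def trigger_extract(**_):
    """Invoke the extraction Lambda synchronously (function name is a placeholder)."""
    client = boto3.client("lambda")
    resp = client.invoke(
        FunctionName="orders-api-extract",
        Payload=json.dumps({}).encode("utf-8"),
    )
    print(json.loads(resp["Payload"].read()))

with DAG(
    dag_id="orders_ingest",
    start_date=datetime.datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_from_api",
        python_callable=trigger_extract,
    )

    # Load staged JSON from an external stage into the warehouse.
    # RAW.ORDERS and @raw_stage are placeholder names.
    load = SnowflakeOperator(
        task_id="load_into_snowflake",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO RAW.ORDERS FROM @raw_stage/orders/ FILE_FORMAT = (TYPE = JSON)",
    )

    extract >> load
```

Because Snowflake's COPY INTO skips files it has already loaded, daily reruns of a DAG like this stay idempotent without extra bookkeeping.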

Qualifications:
• 7+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion.
• Expertise in Snowflake, including data ingestion and performance optimization.
• Strong SQL skills for writing efficient queries and optimizing existing ones.
• Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
• Experience with AWS services such as Lambda, Airflow, Glue, S3, SNS, etc.
• Highly self-motivated and detail-oriented with strong communication skills.
• Familiarity with ETL/ELT processes.
• Familiarity with cloud development/deployment (AWS preferred).
• Experience with Fivetran and DBT is a plus.