Principal Data Engineer

This role is for a Principal Data Engineer (Snowflake) on a 6-month remote contract at $85-$100 per hour. Requires expertise in Snowflake, dbt, GCP, and BigQuery, along with strong data engineering experience and leadership skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Rate
$85-$100 per hour
🗓️ - Date discovered
February 8, 2025
🕒 - Project duration
6 months (initial)
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Data Analysis #dbt (data build tool) #Programming #Strategy #Leadership #Data Engineering #Data Modeling #Data Ingestion #BigQuery #Scala #Data Architecture #Java #Data Pipeline #Computer Science #Data Quality #Cloud #SQL (Structured Query Language) #Data Science #Snowflake #Storage #Data Strategy #Python #GCP (Google Cloud Platform)
Role description

Principal Snowflake Engineer - W2

Contract: 6 months initial

Location: Remote

Rate: $85-$100 per hour

We are seeking a highly skilled Principal Snowflake Engineer to lead our data engineering initiatives. The ideal candidate will have extensive experience in designing and implementing data pipelines, utilizing dbt for data transformation, and working with Google Cloud Platform (GCP) and BigQuery. This role requires collaboration with cross-functional teams to drive data strategy and optimize our data architecture.

Key Responsibilities:
• Data Pipeline Development: Design, build, and maintain robust data pipelines for efficient data ingestion, transformation, and storage in Snowflake.
• dbt Implementation: Lead the implementation of dbt for data modeling and transformation, ensuring best practices in software development and data quality (a minimal model sketch follows this list).
• GCP and BigQuery Integration: Optimize data workflows in GCP and leverage BigQuery for analytics and reporting, ensuring seamless integration with Snowflake (see the handoff sketch after this list).
• Architectural Leadership: Provide architectural guidance and establish standards for data engineering practices across the organization.
• Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable insights.
• Performance Tuning: Monitor and optimize the performance of data pipelines and queries, ensuring high availability and scalability.
• Mentorship: Mentor and train junior data engineers, fostering a culture of continuous learning and improvement within the team.
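To ground the dbt expectations above, here is a minimal staging-model sketch in dbt-flavored SQL. The source name, table, and columns (raw, orders, order_total, and so on) are illustrative assumptions, not details from this posting.

```sql
-- Hypothetical dbt staging model: models/staging/stg_orders.sql
-- The 'raw' source and every column name here are placeholders.

with source as (

    -- {{ source() }} resolves to the raw table registered in sources.yml
    select * from {{ source('raw', 'orders') }}

),

renamed as (

    -- Rename and cast raw columns once, as close to the source as possible
    select
        order_id,
        customer_id,
        cast(order_total as number(12, 2)) as order_total_usd,
        order_created_at::timestamp_ntz   as created_at
    from source

)

select * from renamed
```

Downstream marts would then reference this model with {{ ref('stg_orders') }}, which is how dbt builds its dependency graph and run order.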
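For the GCP and BigQuery integration point, one common handoff pattern is to export BigQuery results as Parquet to Cloud Storage and load them into Snowflake through an external stage. This is a sketch only; the bucket, dataset, stage, and table names are assumptions.

```sql
-- 1) In BigQuery: export a result set as Parquet files to GCS
export data options (
  uri = 'gs://example-bucket/orders/*.parquet',
  format = 'PARQUET',
  overwrite = true
) as
select order_id, customer_id, order_total
from analytics_dataset.orders
where order_date >= '2025-01-01';

-- 2) In Snowflake: load those files via an external stage over the same bucket
--    (assumes @gcs_orders_stage already exists, backed by a storage integration)
copy into analytics.raw_orders
from @gcs_orders_stage
file_format = (type = parquet)
match_by_column_name = case_insensitive;
```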

Qualifications:
• Education: Bachelor’s degree in Computer Science, Engineering, or a related field; Master’s degree preferred.
• Experience: Strong experience in data engineering, with a focus on Snowflake and cloud-based data solutions.

Technical Skills:
• Expertise in Snowflake architecture and data warehousing concepts (see the tuning sketch after this list).
• Proficient in dbt for data transformation and modeling.
• Strong experience with GCP services, particularly BigQuery.
• Familiarity with ETL tools and frameworks.
• Knowledge of SQL and programming languages such as Python or Java.
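As one concrete flavor of the Snowflake expertise and performance tuning this role calls for, the sketch below clusters a large fact table and checks pruning and query cost. The database, table, and column names are hypothetical.

```sql
-- Hypothetical tuning pass on a large Snowflake fact table

-- Cluster on the columns most queries filter by, so micro-partitions prune well
alter table analytics.fct_orders
    cluster by (order_date, customer_id);

-- Inspect how closely the physical clustering matches that key
select system$clustering_information(
    'analytics.fct_orders', '(order_date, customer_id)');

-- Find the most expensive recent queries to target for rewrites
select query_id, total_elapsed_time, bytes_scanned
from table(information_schema.query_history())
order by total_elapsed_time desc
limit 10;
```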

Soft Skills:
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration abilities.
• Proven leadership experience in a data-driven environment.