Principal Snowflake Engineer - W2
Contract: 6 months initial
Location: Remote
Rate: $85-$100 per hour
We are seeking a highly skilled Principal Snowflake Engineer to lead our data engineering initiatives. The ideal candidate will have extensive experience in designing and implementing data pipelines, utilizing dbt for data transformation, and working with Google Cloud Platform (GCP) and BigQuery. This role requires collaboration with cross-functional teams to drive data strategy and optimize our data architecture.
Key Responsibilities:
• Data Pipeline Development: Design, build, and maintain robust data pipelines for efficient data ingestion, transformation, and storage in Snowflake.
• dbt Implementation: Lead the implementation of dbt for data modeling and transformation, ensuring best practices in software development and data quality.
• GCP and BigQuery Integration: Optimize data workflows in GCP and leverage BigQuery for analytics and reporting, ensuring seamless integration with Snowflake.
• Architectural Leadership: Provide architectural guidance and establish standards for data engineering practices across the organization.
• Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable insights.
• Performance Tuning: Monitor and optimize the performance of data pipelines and queries, ensuring high availability and scalability.
• Mentorship: Mentor and train junior data engineers, fostering a culture of continuous learning and improvement within the team.
Qualifications:
• Education: Bachelor’s degree in Computer Science, Engineering, or a related field; Master’s degree preferred.
• Experience: Extensive hands-on experience in data engineering, with a focus on Snowflake and cloud-based data solutions.
Technical Skills:
• Expertise in Snowflake architecture and data warehousing concepts.
• Proficient in dbt for data transformation and modeling.
• Strong experience with GCP services, particularly BigQuery.
• Familiarity with ETL tools and frameworks.
• Knowledge of SQL and programming languages such as Python or Java.
Soft Skills:
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration abilities.
• Proven leadership experience in a data-driven environment.