Sr Snowflake Data Engineer

This role is for a Sr Snowflake Data Engineer in Glendale, CA, on an 18-month W2 contract. It requires 7+ years of data engineering experience, proficiency with Airflow, Apache Spark, and Snowflake or Databricks, and API development skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Glendale, CA
🧠 - Skills detailed
#Data Science #Snowflake #PySpark #Data Modeling #Datasets #AWS (Amazon Web Services) #Scala #Apache Spark #Data Pipeline #Programming #Java #Airflow #Python #API (Application Programming Interface) #Cloud #GraphQL #Databricks #Data Engineering #Delta Lake
Role description

We are hiring for YO HR CONSULTANCY.

Sr Data Engineer

Experience: 7-25 years

Location: Glendale, CA (on-site 3-4 days a week, W2)

Contract duration: 18 months (W2)

Mandatory skills (hands-on experience required):
• Airflow
• Apache Spark
• Snowflake OR Databricks
• Data Modeling
• Candidates should have deep experience with PySpark and Databricks (see the sketch after this list)
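
To gauge the expected depth, here is a minimal PySpark sketch of the kind of batch transformation work this role involves. The bucket paths, column names, and app name are hypothetical, and Delta Lake support is assumed to be available on the cluster (as it is by default on Databricks).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; this line is for standalone runs.
spark = SparkSession.builder.appName("events-batch-clean").getOrCreate()

# Read raw JSON events (path and schema are hypothetical).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Basic cleanup: drop duplicate events, filter null event types, stamp the load date.
cleaned = (
    raw.dropDuplicates(["event_id"])
    .filter(F.col("event_type").isNotNull())
    .withColumn("load_date", F.current_date())
)

# Append to a curated Delta table (path hypothetical; requires Delta Lake on the cluster).
cleaned.write.format("delta").mode("append").save("s3://example-bucket/curated/events/")
```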

The Company

Headquartered in Los Angeles, this leader in the Entertainment & Media space is focused on delivering world-class stories and experiences to its global audience. To offer the best entertainment experiences, their technology teams focus on continued innovation and the use of cutting-edge technology.

Platform / Stack

You will work with technologies that include Python, AWS, Snowflake, Databricks, and Airflow.

What You'll Do As a Sr Data Engineer
• Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
• Build and maintain APIs to expose data to downstream applications
• Develop real-time streaming data pipelines
• Tech stack includes Airflow, Spark, Databricks, Delta Lake, and Snowflake (a minimal orchestration sketch follows this list)
• Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
• Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
• Ensure high operational efficiency and quality of the Core Data platform datasets, so that our solutions meet SLAs and convey reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
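
As a rough illustration of the orchestration work described above, here is a minimal Airflow DAG sketch. The DAG id, task names, and function bodies are hypothetical placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events():
    # Placeholder: pull raw events from a source system (hypothetical).
    pass


def load_to_snowflake():
    # Placeholder: write transformed data to Snowflake (hypothetical).
    pass


with DAG(
    dag_id="core_data_events_daily",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load  # extract must finish before load starts
```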

Qualifications

You could be a great fit if you have:
• 7+ years of data engineering experience developing large data pipelines
• Proficiency in at least one major programming language (e.g., Python, Java, Scala)
• Hands-on production environment experience with distributed processing systems such as Spark
• Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
• Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
• Experience developing APIs with GraphQL (a minimal sketch follows this list)
• Advanced understanding of OLTP vs OLAP environments
• Graph database experience a plus
• Real-time event streaming experience a plus
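
Since GraphQL API development is called out explicitly, here is a minimal sketch using the Strawberry library, one of several Python GraphQL options. The `Event` type and resolver are illustrative assumptions, not the company's actual schema.

```python
from typing import List

import strawberry


@strawberry.type
class Event:
    event_id: str
    event_type: str


@strawberry.type
class Query:
    @strawberry.field
    def events(self, event_type: str) -> List[Event]:
        # Placeholder resolver: in practice this would query Snowflake or Databricks.
        return [Event(event_id="1", event_type=event_type)]


schema = strawberry.Schema(query=Query)

# Smoke-test the schema directly; no web server needed.
# Strawberry exposes snake_case fields as camelCase by default.
result = schema.execute_sync('{ events(eventType: "play") { eventId eventType } }')
print(result.data)
```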

Skills: Snowflake, Databricks, data modeling, data pipelines, Spark, GraphQL, Python, AWS, API development, Apache Spark, OLAP, PySpark, Airflow, Delta Lake, OLTP, data pipeline orchestration