Sr Data Engineer - Snowflake/Databricks

This role is for a Sr Data Engineer with 7-25 years of experience, onsite in Glendale, CA, on an 18-month W2 contract. Key skills include Snowflake or Databricks, Airflow, Apache Spark, and PySpark.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Glendale, CA
🧠 - Skills detailed
#Data Science #Snowflake #PySpark #Data Modeling #Datasets #AWS (Amazon Web Services) #Scala #Apache Spark #Data Pipeline #Programming #Java #Airflow #Spark (Apache Spark) #Python #API (Application Programming Interface) #Data Access #Cloud #GraphQL #Databricks #Data Engineering #Delta Lake
Role description

Join Our Team at YO HR CONSULTANCY!

Senior Data Engineer

Experience: 7-25 years

Location: Glendale, CA – onsite 3-4 days a week (W2)

Contract Duration: 18 months (W2)

Required Skills
• Airflow
• Apache Spark
• Snowflake OR Databricks
• Data Modeling
• In-depth experience with PySpark and Databricks is required (see the sketch after this list)
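
As a rough illustration of the PySpark/Databricks work named above, here is a minimal batch sketch. The table paths and column names (raw_events, event_ts, title_id) are hypothetical, not taken from the posting, and it assumes a Databricks-style environment with Delta Lake available.

```python
# Minimal PySpark sketch, for illustration only: paths and columns
# (raw_events, event_ts, title_id) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-rollup").getOrCreate()

# Read a Delta table and build a simple daily aggregate (a common
# data-modeling task on Databricks).
events = spark.read.format("delta").load("/mnt/core/raw_events")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "title_id")
    .agg(F.count("*").alias("event_count"))
)

# Write the rollup back as a partitioned Delta table.
(daily.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/core/daily_title_rollup"))
```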

About Us

Based in Los Angeles, we are a top player in the Entertainment & Media industry, dedicated to providing exceptional stories and experiences to audiences worldwide. Our technology teams are committed to constant innovation and the application of advanced technology to enhance entertainment experiences.

Technology Stack

You will engage with tools such as Python, AWS, Snowflake, Databricks, and Airflow.

Your Responsibilities as a Senior Data Engineer
• Assist in maintaining, upgrading, and extending current Core Data platform data pipelines
• Create and manage APIs for data accessibility in downstream applications
• Develop data pipelines for real-time streaming
• Utilize a tech stack including Airflow, Spark, Databricks, Delta Lake, and Snowflake (an Airflow sketch follows this list)
• Work alongside product managers, architects, and fellow engineers to ensure the Core Data platform's success
• Help establish and document standards and best practices for pipeline configurations, naming conventions, etc.
• Ensure efficient operation and high-quality standards for datasets within the Core Data platform to meet SLAs and provide reliable solutions to stakeholders (Engineering, Data Science, Operations, and Analytics teams)
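
To illustrate the orchestration responsibility above, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+ (for the unified schedule parameter). The DAG id, schedule, and task bodies are hypothetical placeholders, not taken from the posting.

```python
# Minimal Airflow DAG sketch, assuming Airflow 2.4+. The DAG id,
# schedule, and task callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    ...  # e.g., land raw data in cloud storage


def transform():
    ...  # e.g., trigger a Spark/Databricks job


with DAG(
    dag_id="core_data_daily",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Explicit dependency ordering: extract must finish before transform.
    extract_task >> transform_task
```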

What We Are Looking For

A strong candidate will have:
• 7+ years of experience in data engineering with a focus on large data pipelines
• Strong proficiency in at least one programming language (e.g., Python, Java, Scala)
• Practical experience in production environments using distributed processing systems like Spark
• Experience with data pipeline orchestration systems like Airflow for creating and maintaining data workflows
• Familiarity with at least one cloud database or MPP technology (Snowflake, Databricks, BigQuery)
• Experience in API development, particularly with GraphQL
• Advanced knowledge of OLTP vs. OLAP systems
• Graph Database skills are advantageous
• Experience with real-time event streaming is a bonus (a streaming sketch follows this list)
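
As a sketch of the real-time streaming item above, here is a minimal Spark Structured Streaming job that reads from Kafka and appends to a Delta table. It assumes the spark-sql-kafka and Delta Lake packages are available; the broker address, topic, and paths are hypothetical, not from the posting.

```python
# Minimal Structured Streaming sketch: broker, topic, and paths are
# hypothetical placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-stream").getOrCreate()

# Read events from Kafka as an unbounded streaming DataFrame.
stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical
    .option("subscribe", "core-events")                # hypothetical topic
    .load()
)

# Kafka values arrive as bytes; cast to string before downstream parsing.
events = stream.select(F.col("value").cast("string").alias("payload"))

# Continuously append to a Delta table; the checkpoint directory gives
# the stream fault tolerance across restarts.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/core/checkpoints/events")
    .outputMode("append")
    .start("/mnt/core/streaming_events")
)
```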

Skills: Snowflake, Databricks, data modeling, pipelines, core data, Spark, GraphQL, Python, AWS, data, APIs, Apache Spark, OLAP, PySpark, Airflow, database, Delta Lake, OLTP, data pipeline orchestration