Sr Snowflake Data Engineer

This role is for a "Sr Snowflake Data Engineer" with 7-25 years of experience, located in Glendale, CA. The 18-month W2 contract requires expertise in Airflow, Apache Spark, Snowflake or Databricks, and data modeling.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Glendale, CA
🧠 - Skills detailed
#Data Science #Snowflake #PySpark #Data Modeling #Graph Databases #Databases #Datasets #AWS (Amazon Web Services) #Scala #Apache Spark #Data Pipeline #Programming #Java #Airflow #Spark (Apache Spark) #Python #API (Application Programming Interface) #Cloud #GraphQL #Databricks #Data Engineering #Delta Lake
Role description

We are looking for talent to join YO HR CONSULTANCY.

Senior Data Engineer

Experience Required: 7-25 years

Location: Glendale, CA – On-site 3-4 days a week (W2)

Contract Duration: 18 months W2

Essential Skills Required
• Airflow
• Apache Spark
• Snowflake OR Databricks
• Data Modeling
• Candidates must have significant expertise in PySpark and Databricks (see the sketch after this list)
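
To ground these requirements, here is a minimal sketch of how the named pieces commonly fit together: an Airflow DAG scheduling a PySpark task. It assumes Airflow 2.4+ and PySpark are installed; the DAG id, schedule, and task logic are hypothetical illustrations, not details from the posting.

```python
# Hypothetical Airflow 2.4+ DAG sketch; dag_id, schedule, and the
# transform logic are illustrative, not taken from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_events():
    # A production task would more likely submit a Databricks or
    # spark-submit job; a tiny in-process PySpark job stands in here.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("transform-events").getOrCreate()
    spark.range(10).show()  # stand-in for reading and transforming a table


with DAG(
    dag_id="core_data_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_events", python_callable=transform_events)
```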

About The Company

Based in Los Angeles, this prominent organization in the Entertainment & Media sector is dedicated to creating exceptional stories and experiences for its worldwide audience. Their tech teams emphasize ongoing innovation and the use of advanced technology to enhance entertainment offerings.

Technology Stack

You will engage with technologies such as Python, AWS, Snowflake, Databricks, and Airflow.

Responsibilities as a Senior Data Engineer
• Assist in the upkeep, enhancement, and expansion of current Core Data platform data pipelines
• Design and maintain APIs to make data available to downstream applications
• Create real-time streaming data infrastructures (see the streaming sketch after this list)
• Your tech stack includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
• Collaborate with product managers, architects, and other engineers to ensure the success of the Core Data platform
• Help develop and document internal and external standards and best practices for pipeline configurations, naming conventions, etc.
• Ensure high operational efficiency and quality of Core Data platform datasets so that solutions meet SLAs, keeping the platform reliable and accurate for all stakeholders (Engineering, Data Science, Operations, and Analytics teams)
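
As a rough illustration of the real-time streaming responsibility above, here is a minimal PySpark Structured Streaming sketch that lands Kafka events in a Delta Lake table. It assumes Spark 3.x with the Kafka connector and delta-spark configured; the broker address, topic, and paths are hypothetical.

```python
# Hypothetical streaming sketch: Kafka -> Delta Lake with PySpark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Read a stream of events from Kafka (broker and topic are placeholders).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Append the raw stream to a Delta table with checkpointing, so both
# batch and streaming consumers downstream share one source of truth.
query = (
    events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    .writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .outputMode("append")
    .start("/tmp/tables/events")
)
query.awaitTermination()
```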

Qualifications

You may be an excellent match if you possess:
• 7+ years of experience in data engineering with a focus on creating extensive data pipelines
• Expertise in at least one primary programming language (e.g. Python, Java, Scala)
• Hands-on experience in production environments with distributed processing systems like Spark
• Practical experience with data pipeline orchestration tools like Airflow to develop and maintain data pipelines
• Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
• Proficiency in developing APIs using GraphQL (a minimal schema sketch follows this list)
• Advanced comprehension of OLTP vs OLAP systems
• Experience with Graph Databases is a plus
• Real-time Event Streaming exposure is advantageous
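
For the GraphQL point above, here is a minimal schema sketch using graphene, one common Python GraphQL library (the posting does not name a framework). The field, argument, and returned values are hypothetical.

```python
# Hypothetical GraphQL schema exposing a dataset row count.
import graphene


class Query(graphene.ObjectType):
    # `row_count` is auto-camelcased to `rowCount` in the schema.
    row_count = graphene.Int(table=graphene.String(required=True))

    def resolve_row_count(root, info, table):
        # A real resolver would query Snowflake or Databricks;
        # a hard-coded lookup stands in for illustration.
        return {"events": 42}.get(table, 0)


schema = graphene.Schema(query=Query)
result = schema.execute('{ rowCount(table: "events") }')
print(result.data)  # {'rowCount': 42}
```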

Skills: Snowflake, Databricks, data modeling, pipelines, core data, Spark, GraphQL, Python, AWS, data, API development, APIs, Apache Spark, OLAP, PySpark, Airflow, database, Delta Lake, OLTP, data pipeline orchestration