

Sr Data Engineer - Snowflake/Databricks
Join Our Team at YO HR CONSULTANCY!
Senior Data Engineer
Experience: 7 - 25 Years
Location: Glendale, CA – Onsite 3–4 days a week
Contract Role: 18-month W2 contract
Required Skills
• Airflow
• Apache Spark
• Snowflake OR Databricks
• Data Modeling
• In-depth experience with PySpark and Databricks required
About Us
Based in Los Angeles, we are a top player in the Entertainment & Media industry, dedicated to providing exceptional stories and experiences to audiences worldwide. Our technology teams are committed to constant innovation and the application of advanced technology to enhance entertainment experiences.
Technology Stack
You will engage with tools such as Python, AWS, Snowflake, Databricks, and Airflow.
Your Responsibilities As a Senior Data Engineer
• Assist in maintaining, upgrading, and extending the Core Data platform's existing data pipelines
• Create and manage APIs for data accessibility in downstream applications
• Develop data pipelines for real-time streaming
• Utilize a tech stack including Airflow, Spark, Databricks, Delta Lake, and Snowflake
• Work alongside product managers, architects, and fellow engineers to ensure the Core Data platform's success
• Help establish and document standards and best practices for pipeline configurations, naming conventions, etc.
• Ensure efficient operation and high-quality standards for datasets within the Core Data platform to meet SLAs and provide reliable solutions to stakeholders (Engineering, Data Science, Operations, and Analytics teams)
What We Are Looking For
A strong candidate will have:
• 7+ years of experience in data engineering with a focus on large data pipelines
• Strong proficiency in at least one programming language (e.g., Python, Java, Scala)
• Practical experience in production environments using distributed processing systems like Spark
• Experience with data pipeline orchestration systems like Airflow for creating and maintaining data workflows
• Familiarity with at least one cloud database or MPP technology (Snowflake, Databricks, BigQuery)
• Experience in API development, particularly with GraphQL
• Advanced understanding of OLTP vs. OLAP systems
• Graph database skills are advantageous
• Experience with real-time event streaming is a bonus
Skills: Snowflake, Databricks, Data Modeling, Pipelines, Core Data, Spark, GraphQL, Python, AWS, Data, APIs, Apache Spark, OLAP, PySpark, Airflow, Database, Delta Lake, OLTP, Data Pipeline Orchestration