

Senior Data Engineer
Join the Team at YO HR CONSULTANCY!
Experience: 7–25 years
Location: Glendale, CA (onsite 3–4 days a week)
Contract Duration: 18 months (W2)
Essential Skills Required
• Airflow
• Apache Spark
• Snowflake or Databricks
• Data Modeling
• In-depth knowledge and hands-on experience with PySpark and Databricks
About Us
Based in Los Angeles, our company is a frontrunner in the Entertainment & Media sector, dedicated to providing top-notch storytelling and experiences for a worldwide audience. Our technology teams prioritize innovation and leverage advanced technology to deliver the best entertainment.
Technology Stack
Your work will involve technologies such as Python, AWS, Snowflake, Databricks, and Airflow.
Your Role As a Senior Data Engineer
• Assist in maintaining, updating, and enhancing current Core Data platform data pipelines
• Design and manage APIs to make data accessible for downstream applications
• Create and sustain real-time streaming data pipelines
• Utilize a tech stack that includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
• Work alongside product managers, architects, and fellow engineers to ensure the effectiveness of the Core Data platform
• Help in the creation and documentation of internal and external standards and best practices related to pipeline configurations and naming conventions
• Maintain high operational efficiency and quality for Core Data platform datasets, ensuring solutions meet SLAs and remain reliable and accurate for all stakeholders (Engineering, Data Science, Operations, and Analytics teams)
Qualifications
We would love to have you on board if you possess:
• A minimum of 7 years in data engineering, specifically in developing large-scale data pipelines
• Expertise in at least one significant programming language (e.g., Python, Java, Scala)
• Practical production experience with distributed processing frameworks such as Spark
• Production experience using data pipeline orchestration tools like Airflow for designing and managing data pipelines
• Experience with at least one primary Massively Parallel Processing (MPP) or cloud database solution (Snowflake, Databricks, BigQuery)
• Background in API development with GraphQL
• Strong comprehension of OLTP versus OLAP systems
• Experience with Graph Databases is a plus
• Knowledge in Real-time Event Streaming is an added advantage
Skills: Core Data, AWS, APIs, Apache Spark, Python, PySpark, API Development, Data Modeling, Snowflake, Airflow, Pipelines, Databases, Delta Lake, GraphQL, Databricks