

Senior Snowflake Data Engineer
We are looking for talent to join YO HR CONSULTANCY.
Senior Data Engineer
Experience Required: 7-25 years
Location: Glendale, CA (onsite 3-4 days a week)
Contract Duration: 18 months, W2
Essential Skills
• Airflow
• Apache Spark
• Snowflake OR Databricks
• Data Modeling
• Candidates must have significant expertise in PySpark and Databricks
About The Company
Based in Los Angeles, this prominent organization in the Entertainment & Media sector is dedicated to creating exceptional stories and experiences for its worldwide audience. Their tech teams emphasize ongoing innovation and the use of advanced technology to enhance entertainment offerings.
Technology Stack
You will engage with technologies such as Python, AWS, Snowflake, Databricks, and Airflow.
Responsibilities as a Senior Data Engineer
• Assist in the upkeep, enhancement, and expansion of current Core Data platform data pipelines
• Design and maintain APIs to make data available to downstream applications
• Build real-time streaming data infrastructure
• Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
• Collaborate with product managers, architects, and other engineers to ensure the success of the Core Data platform
• Help develop and document internal and external standards and best practices for pipeline configurations, naming conventions, etc.
• Ensure high operational efficiency and quality of Core Data platform datasets so that solutions meet SLAs and remain reliable and accurate for all stakeholders (Engineering, Data Science, Operations, and Analytics teams)
Qualifications
You may be an excellent match if you possess:
• 7+ years of experience in data engineering with a focus on building large-scale data pipelines
• Expertise in at least one primary programming language (e.g., Python, Java, Scala)
• Hands-on experience in production environments with distributed processing systems like Spark
• Practical experience developing and maintaining data pipelines with orchestration tools such as Airflow
• Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
• Proficiency in developing APIs utilizing GraphQL
• Advanced understanding of OLTP vs. OLAP systems
• Experience with Graph Databases is a plus
• Real-time Event Streaming exposure is advantageous
Skills: Snowflake, Databricks, Data Modeling, Pipelines, Core Data, Spark, GraphQL, Python, AWS, Data, API Development, APIs, Apache Spark, OLAP, PySpark, Airflow, Database, Delta Lake, OLTP, Data Pipeline Orchestration