Job Title: Lead Operations Engineer
Location: Beaverton, OR
Duration: 11 Months
Job Description:
• As a Lead Operations Engineer, you will need to be a self-starter with a proven sense of ownership, organization, and follow-through, and will often operate as a Subject Matter Expert (SME) on Support Operations.
• You will be responsible for developing solutions that improve the reliability, maintainability, availability, and performance of the team's systems, and will ensure that product and technical defects are communicated to and accounted for by Product and Engineering leaders.
• Be a key contributor to the overall framework, organization, and design of D&AI Support processes
• Develop and maintain complex data visualizations and reports using Tableau and/or other reporting solutions
• Take initiative to identify and prioritize projects and tasks, and develop processes to improve efficiency and effectiveness
• Demonstrate strong analytical and problem-solving skills, with the ability to "read between the lines" and identify key issues and opportunities
• Collaborate with stakeholders to gather requirements and develop solutions that meet their needs
• Develop and maintain strong relationships with stakeholders, including supervisors, colleagues, and external partners
• Anticipate and respond to questions and requests from stakeholders, even when not specifically delegated or asked
• Proactively seek out opportunities to contribute to the team's success
• Work with minimal supervision, demonstrating a high level of independence and self-motivation
Minimum Qualifications
• 5+ years of experience developing data products that operate at scale
• 2+ years of experience developing solutions with Python
• 2+ years of experience architecting and developing in Airflow (see the sketch after this list)
• 2+ years of experience developing solutions on a commercial cloud (AWS, Azure, or GCP)
• 2+ years of experience developing and supporting database systems such as Databricks, Snowflake, or Teradata
• 1+ years of experience onboarding or mentoring new team members and peers
• Exposure to Agile, ideally with knowledge of the SAFe methodology
• BS/MS in CS or a related field, or equivalent experience
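For illustration, here is a minimal sketch of the kind of Airflow/Python pipeline work referenced above. It is an assumption-laden example, not part of any actual codebase for this role: the DAG name, task names, and callables are all hypothetical.

    # Minimal Airflow DAG sketch (Airflow 2.x). All names are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_metrics():
        # Placeholder: pull support-operations metrics from an upstream source.
        pass

    def load_to_warehouse():
        # Placeholder: write metrics to a warehouse (e.g., Databricks or Snowflake)
        # where Tableau or another reporting tool can read them.
        pass

    with DAG(
        dag_id="nightly_support_metrics",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_metrics)
        load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
        extract >> load  # extract runs before load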
Ideal Technical Skills
Databricks / Snowflake / RDBMS
Tableau / Cognos / DOMO / Power BI
Apache Airflow / Spark / Hive
Amazon AWS / S3
Python
Okta / SSO
Confluence
Jira
Smartsheet