Big Data Engineer

This role is for a Big Data Engineer based in Beaverton, OR, for 11 months at a competitive pay rate. Requires 5+ years in data product development, expertise in Python, Airflow, and cloud platforms (AWS, Azure, GCP), plus experience with Databricks or Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
480
🗓️ - Date discovered
January 15, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Portland, Oregon Metropolitan Area
🧠 - Skills detailed
#Cloud #GCP (Google Cloud Platform) #BI (Business Intelligence) #Agile #AWS (Amazon Web Services) #Airflow #S3 (Amazon Simple Storage Service) #RDBMS (Relational Database Management System) #Azure #Database Systems #Databricks #Snowflake #Data Engineering #Apache Airflow #Big Data #Spark (Apache Spark) #Jira #Microsoft Power BI #Tableau #Visualization #Python #Teradata #AI (Artificial Intelligence)
Role description

Job Title: Lead Operations Engineer

Location: Beaverton, OR

Duration: 11 Months

Job Description:
• As a Lead Operations Engineer, you will need to be a self-starter with a proven sense of ownership, organization, and follow-through, and will often operate as a Subject Matter Expert (SME) on Support Operations.
• You will be responsible for developing solutions to improve the reliability, maintainability, availability, and performance of the team's systems, and will ensure that product and technical defects are communicated and accounted for with Product and Engineering leaders.
• Be a key contributor to the overall framework, organization, and design of D&AI Support processes
• Develop and maintain complex data visualizations and reports using Tableau and/or other reporting solutions
• Take initiative to identify and prioritize projects and tasks, and develop processes to improve efficiency and effectiveness
• Demonstrate strong analytical and problem-solving skills, with the ability to "read between the lines" and identify key issues and opportunities
• Collaborate with stakeholders to gather requirements and develop solutions that meet their needs
• Develop and maintain strong relationships with stakeholders, including supervisors, colleagues, and external partners
• Anticipate and respond to questions and requests from stakeholders, even when not specifically delegated or asked
• Proactively seek out opportunities to contribute to the team's success
• Work with minimal supervision, demonstrating a high level of independence and self-motivation

Minimum Qualifications
• 5+ years of experience developing data products that operate at scale
• 2+ years developing solutions with Python
• 2+ years of experience architecting and developing in Airflow
• 2+ years developing solutions on a commercial cloud (AWS, Azure, or GCP)
• 2+ years developing and supporting database systems such as Databricks, Snowflake, or Teradata
• 1+ years of experience onboarding or mentoring new team members and peers
• Exposure to Agile, ideally with knowledge of the SAFe methodology
• BS/MS in CS or a related field, or equivalent experience

Ideal Technical Skills

• Databricks / Snowflake / RDBMS
• Tableau / Cognos / DOMO / Power BI
• Apache Airflow / Spark / Hive
• Amazon AWS / S3
• Python
• Okta / SSO
• Confluence
• Jira
• Smartsheets