

Senior Data Engineer
Title: Senior Data Engineer
Location: Portland, OR / Remote PST Hours
Contract
Preference: Candidates with prior long-term Nike project experience (ex-Nike).
Responsibilities:
• Design and implement data products and features in collaboration with product owners, data analysts, and business partners.
• Work with a variety of teammates to build first-class solutions for Client technology and its business partners, working on development projects related to supply chain, commerce, consumer behavior, and web analytics, among others.
• Contribute to overall architecture, frameworks and patterns for processing and storing large data volumes.
• Research, evaluate, and utilize new technologies/tools/frameworks centered around high-volume data processing.
• Evaluate technical feasibility and risks and convey that information to the team.
• Translate backlog items into engineering designs and logical units of work. Profile and analyze data to design scalable solutions.
• Define and apply appropriate data acquisition and consumption strategies for given technical scenarios.
• Design and implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem.
• Build utilities, user defined functions, libraries, and frameworks to better enable data flow patterns.
• Implement complex automated routines using workflow orchestration tools. Work with architecture, engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and adhered to.
• Anticipate, identify and solve issues concerning data management to improve data quality.
• Build and incorporate automated unit tests and participate in integration testing efforts.
• Utilize and advance continuous integration and deployment frameworks.
• Troubleshoot data issues and perform root cause analysis.
• Applicant must have a Bachelor's or Master's degree in Computer Science, Computer Information Systems, or Information Management and 10+ years of experience in the job offered or a computer-related occupation.
Experience must include:
• Databricks
• Databricks SQL
• Snowflake
• SQL
• EMR
• Spark
• DynamoDB
• Data Pipelines
• Python programming
• Airflow
• AWS (S3, SQS, Lambda, Athena, OpenSearch, Glue Data Catalog, CloudWatch)
• RDS
• Logging (Splunk, Slack)
• Hive
• Hive Metastore
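As a flavor of the automated data-quality checks described in the responsibilities above, here is a minimal, illustrative sketch in plain Python (the function and field names are hypothetical, not taken from this posting):

```python
# Illustrative data-quality profiling of the kind described above:
# count records missing required fields and duplicated records.
# Function and field names are hypothetical examples only.

def profile_records(records, required_fields):
    """Return counts of total, incomplete, and duplicate records."""
    seen = set()
    missing = 0
    duplicates = 0
    for rec in records:
        # A record is incomplete if any required field is absent or None.
        if any(rec.get(f) is None for f in required_fields):
            missing += 1
        # Duplicates are detected by the tuple of required-field values.
        key = tuple(rec.get(f) for f in required_fields)
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"total": len(records), "missing": missing, "duplicates": duplicates}

if __name__ == "__main__":
    rows = [
        {"order_id": 1, "sku": "A"},
        {"order_id": 2, "sku": None},   # missing required field
        {"order_id": 1, "sku": "A"},    # duplicate of the first row
    ]
    report = profile_records(rows, ["order_id", "sku"])
    assert report == {"total": 3, "missing": 1, "duplicates": 1}
```

In practice a check like this would run inside a pipeline task (e.g., an Airflow operator or a Spark job) rather than as a standalone script; this sketch only shows the shape of the logic.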