
Data Engineer - Expert

This role is for a "Data Engineer - Expert" in Beaverton, OR, on a 2-year hybrid contract. Key requirements include 6+ years of Data Engineering experience, proficiency in Python and Data Warehouse technologies like Databricks and Snowflake, and strong SQL skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 12, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Fixed Term
🔒 - Security clearance
Unknown
📍 - Location detailed
Beaverton, OR
🧠 - Skills detailed
#ADF (Azure Data Factory) #Airflow #Azure DevOps #Computer Science #Libraries #Jira #Apache Spark #Compliance #DynamoDB #JSON (JavaScript Object Notation) #Delta Lake #Cloud #Data Engineering #Database Design #GitLab #Migration #Redis #NoSQL #Pandas #Azure Data Factory #Schema Design #Triggers #Database Management #Jenkins #Agile #Data Migration #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Apache Airflow #Databricks #SQL (Structured Query Language) #Quality Assurance #Talend #Collibra #Data Processing #Storage #Data Warehouse #MySQL #MongoDB #Alteryx #Strategy #NumPy #Security #Snowflake #AWS Glue #DevOps #PySpark #Python #Azure #RDBMS (Relational Database Management System) #Spark (Apache Spark) #Git
Role description

Work options: Hybrid

Onsite: Mon-Thurs

Title: Data Engineer - Expert

Location: Beaverton, OR

Duration: 2 year contract

NIKE, Inc. does more than outfit the world's best athletes. It is a place to explore potential, obliterate boundaries and push out the edges of what can be. The company looks for people who can grow, think, dream and create. Its culture thrives by embracing diversity and rewarding imagination. The brand seeks achievers, leaders and visionaries. At Nike, it’s about each person bringing skills and passion to a challenging and constantly evolving game.

What You Will Work On
• Establishes database management systems, standards, guidelines and quality assurance for database deliverables, such as conceptual design, logical database, capacity planning, external data interface specification, data loading plan, data maintenance plan and security policy.
• Documents and communicates database design. Evaluates and installs database management systems.
• Codes complex programs and derives logical processes on technical platforms.
• Builds windows, screens and reports. Assists in the design of user interface and business application prototypes.
• Participates in quality assurance and develops test application code in client server environment.
• Provides expertise in devising, negotiating and defending the tables and fields provided in the database.
• Adapts business requirements, developed by modeling/development staff and systems engineers, and develops the data, database specifications, and table and element attributes for an application.
• At more experienced levels, helps to develop an understanding of client's original data and storage mechanisms.
• Determines appropriateness of data for storage and optimum storage organization. Determines how tables relate to each other and how fields interact within the tables for a relational model.
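The relational-model duties above (how tables relate to each other and how fields interact within them) can be sketched with a minimal, self-contained example. The schema and table names here are purely illustrative, not taken from the role:

```python
import sqlite3

# Hypothetical two-table schema showing a foreign-key relationship
# between tables in a relational model (all names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("""
    CREATE TABLE product (
        product_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE order_line (
        line_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES product(product_id),
        quantity   INTEGER NOT NULL CHECK (quantity > 0)
    )
""")
conn.execute("INSERT INTO product VALUES (1, 'Running Shoe')")
conn.execute("INSERT INTO order_line VALUES (10, 1, 2)")

# The shared product_id field is how the two tables interact.
row = conn.execute("""
    SELECT p.name, ol.quantity
    FROM order_line ol JOIN product p USING (product_id)
""").fetchone()
print(row)  # ('Running Shoe', 2)
```

The `REFERENCES` and `CHECK` constraints are one way the "tables and fields provided in the database" get defended at the schema level rather than in application code.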

What You Bring
• Bachelor’s degree or higher in Computer Science, or an equivalent combination of relevant education, experience, and training.
• 6+ years of experience in Data Engineering.
• 4+ years of experience with Python for data processing, with proficiency in Python libraries such as pandas, NumPy, PySpark, pyodbc, pymssql, Requests, Boto3, simple-salesforce, and json.
• 3+ years of experience in Data Warehouse technologies – Databricks and Snowflake.
• Strong Data Engineering Fundamentals (ETL, Modelling, Lineage, Governance, Partitioning & Optimization, Migration).
• Strong Databricks-specific skills (Apache Spark, DB SQL, Delta Lake, Delta Share, Notebooks, Workflows, RBAC, Unity Catalog, Encryption & Compliance).
• Strong SQL skills (query performance tuning, Stored Procedures, Triggers, schema design) and knowledge of one or more RDBMS and NoSQL databases, such as MSSQL/MySQL and DynamoDB/MongoDB/Redis.
• Cloud Platform Expertise: AWS and/or Azure.
• Experience in one or more ETL tools like Apache Airflow/AWS Glue/Azure Data Factory/Talend/Alteryx.
• Excellent knowledge of coding and architectural design patterns.
• Passion for troubleshooting, investigation and performing root-cause analysis.
• Excellent written and verbal communication skills.
• Ability to multitask in a high-energy environment.
• Experience with Agile methodologies and knowledge of Git, Jenkins, GitLab, Azure DevOps, and tools like Jira/Confluence.
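A minimal sketch of the kind of pandas/NumPy data processing the requirements above call for; the column names and values are hypothetical, not from the role description:

```python
import numpy as np
import pandas as pd

# Illustrative cleaning-and-aggregation step: drop incomplete rows,
# derive a column, and aggregate by a grouping key.
raw = pd.DataFrame({
    "region": ["west", "west", "east", "east"],
    "units":  [3, np.nan, 5, 2],
    "price":  [10.0, 12.0, 9.5, 11.0],
})
clean = raw.dropna(subset=["units"]).assign(
    revenue=lambda df: df["units"] * df["price"]
)
by_region = clean.groupby("region", as_index=False)["revenue"].sum()
print(by_region)
```

In a Databricks context the same shape of pipeline would typically run through PySpark DataFrames instead, but the dropna/derive/groupby pattern carries over.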

Additional Experience
• Experience with tools like: Collibra, Hackolade.
• Migration Strategy and Tooling.
• Data Migration Tools: Experience with migration tools and frameworks, or custom-built solutions, to automate moving data from Snowflake to Databricks.
• Testing and Validation: Ensuring data consistency post-migration with testing strategies like checksums, row counts, and query performance benchmarks.
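The row-count and checksum validation strategies mentioned above can be sketched as follows. Plain DataFrames stand in for the Snowflake (source) and Databricks (target) query results, and all names here are hypothetical:

```python
import hashlib
import pandas as pd

def table_checksum(df: pd.DataFrame) -> str:
    """Order-insensitive content checksum: hash each row, sort the
    row hashes, then digest the sorted list."""
    canon = df.sort_index(axis=1).astype(str)  # normalize column order/types
    row_hashes = sorted(
        hashlib.sha256("|".join(r).encode()).hexdigest()
        for r in canon.itertuples(index=False, name=None)
    )
    return hashlib.sha256("".join(row_hashes).encode()).hexdigest()

# Stand-ins for source (Snowflake) and target (Databricks) extracts;
# the target holds the same rows in a different physical order.
source = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
target = source.sample(frac=1, random_state=0)

assert len(source) == len(target)                        # row-count check
assert table_checksum(source) == table_checksum(target)  # content check
print("row counts and checksums match")
```

In practice the checksum would usually be computed inside each warehouse (e.g. via an aggregate over per-row hashes) rather than after pulling full tables client-side; this sketch just shows the comparison logic.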