
Data Engineer

This role is for a Data Engineer in Beaverton, OR, lasting 24+ months at $55-$60/hour. Requires a Bachelor's degree, 6+ years in Data Engineering, 4+ years in Python, and 3+ years with Databricks and Snowflake. Strong SQL and cloud expertise needed.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$480
🗓️ - Date discovered
February 13, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Beaverton, OR
🧠 - Skills detailed
#Migration #Snowflake #Libraries #Data Processing #Database Management #Apache Spark #Database Design #Delta Lake #Security #Jenkins #Storage #Azure #Quality Assurance #GitLab #Airflow #RDBMS (Relational Database Management System) #Data Engineering #NumPy #Redis #AWS (Amazon Web Services) #Spark (Apache Spark) #Python #Triggers #Data Warehouse #Databricks #Alteryx #Git #Pandas #ADF (Azure Data Factory) #Agile #DevOps #Jira #Strategy #AWS Glue #Apache Airflow #Data Migration #Collibra #JSON (JavaScript Object Notation) #Azure DevOps #DynamoDB #Cloud #MongoDB #ETL (Extract, Transform, Load) #Talend #MySQL #Schema Design #NoSQL #PySpark #SQL (Structured Query Language) #Computer Science #Azure Data Factory #Compliance
Role description

Job Description:

Job Title: Data Engineer

Duration: 24+ months (with possible extension or conversion)

Location: Beaverton, OR

Pay rate: $55-$60/hour (depending on experience)

Responsibilities:
• Establishes database management systems, standards, guidelines, and quality assurance for database deliverables such as conceptual design, logical database design, capacity planning, external data interface specifications, data loading plans, data maintenance plans, and security policy.
• Documents and communicates database design. Evaluates and installs database management systems. Codes complex programs and derives logical processes on technical platforms.
• Builds windows, screens, and reports. Assists in the design of user interfaces and business application prototypes.
• Participates in quality assurance and develops test application code in a client-server environment.
• Provides expertise in devising, negotiating, and defending the tables and fields in the database.
• Adapts business requirements, developed by modeling/development staff and systems engineers, into data and database specifications and table and element attributes for an application.
• At more experienced levels, helps develop an understanding of the client's original data and storage mechanisms. Determines the appropriateness of data for storage and the optimum storage organization. Determines how tables relate to each other and how fields interact within the tables for a relational model.

Requirements:
• Bachelor's degree or higher in Computer Science, or an equivalent combination of relevant education, experience, and training.
• 6+ years of experience in Data Engineering.
• 4+ years of experience working with Python for data processing, with proficiency in Python libraries such as Pandas, NumPy, PySpark, pyodbc, pymssql, Requests, boto3, simple-salesforce, and json (a brief sketch follows this list).
• 3+ years of experience in Data Warehouse technologies – Databricks and Snowflake.
• Strong Data Engineering fundamentals (ETL, Modeling, Lineage, Governance, Partitioning & Optimization, Migration).
• Strong Databricks-specific skills (Apache Spark, DB SQL, Delta Lake, Delta Share, Notebooks, Workflows, RBAC, Unity Catalog, Encryption & Compliance).
• Strong SQL skills (query performance, stored procedures, triggers, schema design) and knowledge of one or more RDBMS and NoSQL databases, such as MSSQL/MySQL and DynamoDB/MongoDB/Redis.
• Cloud Platform Expertise: AWS and/or Azure.
• Experience in one or more ETL tools like Apache Airflow/AWS Glue/Azure Data Factory/Talend/Alteryx.
• Excellent knowledge of coding and architectural design patterns.
• Passion for troubleshooting, investigation, and root-cause analysis.
• Excellent written and verbal communication skills.
• Ability to multitask in a high-energy environment.
• Experience with Agile methodologies and knowledge of Git, Jenkins, GitLab, Azure DevOps, and tools like Jira/Confluence.
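
Not part of the original posting, but as a rough illustration of the Python data-processing stack named above: a minimal sketch, assuming a hypothetical S3 bucket, key, and schema, that combines boto3, Pandas, and PySpark.

```python
# Hypothetical sketch only: bucket, key, and column names are invented.
import json

import boto3
import pandas as pd
from pyspark.sql import SparkSession, functions as F

# Fetch a small JSON extract from S3 with boto3.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-raw-zone", Key="orders/2025-02-13.json")
records = json.loads(obj["Body"].read())

# Light cleanup in Pandas before handing off to Spark.
pdf = pd.DataFrame(records)
pdf["order_ts"] = pd.to_datetime(pdf["order_ts"], utc=True)

# Distribute the aggregation with PySpark.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()
sdf = spark.createDataFrame(pdf)
daily = sdf.groupBy(F.col("order_ts").cast("date").alias("order_date")).count()
daily.show()
```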

Nice to have:
• Tools like Collibra and Hackolade.
• Migration strategy and tooling:
• Data Migration Tools: experience with migration tools and frameworks, or custom-built solutions, to automate moving data from Snowflake to Databricks (see the first sketch after this list).
• Testing and Validation: ensuring data consistency post-migration with testing strategies like checksums, row counts, and query performance benchmarks (see the second sketch after this list).
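
First, a minimal PySpark sketch of automating a Snowflake-to-Databricks table copy via the Spark Snowflake connector and Delta Lake. This is an illustration, not the posting's process; every connection option, table name, and catalog path here is a placeholder.

```python
# Hypothetical sketch only: all connection options and names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sf-to-dbx-migration").getOrCreate()

sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "migration_user",
    "sfPassword": "***",  # in practice, pull this from a secret scope
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MIGRATION_WH",
}

# Read the source table through the Snowflake connector
# ("snowflake" is the Databricks shorthand for the connector format).
src = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS")
    .load()
)

# Land it as a Delta table under a Unity Catalog three-level name.
src.write.format("delta").mode("overwrite").saveAsTable("main.analytics.orders")
```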
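Second, continuing the same hypothetical session (`spark`, `src`, and the `order_id` key column carry over from the sketch above), a minimal version of the row-count and checksum validation the last bullet describes:

```python
# Hypothetical sketch only: continues the migration example above.
from pyspark.sql import functions as F

dst = spark.table("main.analytics.orders")

# Row-count comparison between source and migrated table.
src_count, dst_count = src.count(), dst.count()
assert src_count == dst_count, f"row count mismatch: {src_count} vs {dst_count}"

# Cheap content check: sum a 64-bit hash of the key column on both sides.
def key_checksum(df):
    return df.select(F.sum(F.xxhash64("order_id")).alias("cs")).first()["cs"]

assert key_checksum(src) == key_checksum(dst), "checksum mismatch on order_id"
print(f"validated {dst_count} rows")
```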