
Data Engineer

This role is for a Data Engineer in Beaverton, OR (Hybrid); the contract length and pay rate are unspecified. It requires a Bachelor's degree, 6+ years in Data Engineering, 4+ years in Python, and 3+ years with Databricks and Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 13, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Beaverton, OR
🧠 - Skills detailed
#Migration #Snowflake #Consul #Data Processing #Database Management #Database Design #Security #AI (Artificial Intelligence) #Jenkins #Azure #Quality Assurance #GitLab #RDBMS (Relational Database Management System) #Data Engineering #Database Systems #Python #Triggers #Data Warehouse #Databricks #GIT #Jira #Strategy #DevOps #Data Migration #Collibra #Azure DevOps #ETL (Extract, Transform, Load) #Schema Design #NoSQL #SQL (Structured Query Language) #Computer Science #Agile
Role description

Job Title: Data Engineer

Location: Beaverton, OR (Hybrid)

Job Responsibilities:
• Establish and manage database systems, ensuring quality assurance for database deliverables such as conceptual design, logical database, capacity planning, and security policy.
• Document and communicate database design, and evaluate and install database management systems.
• Code complex programs and develop logical processes on technical platforms.
• Assist in designing user interface and business application prototypes.
• Participate in quality assurance and develop test application code in a client-server environment.
• Translate business requirements into data and database specifications, table definitions, and element attributes for an application.

Skills Required:
• Bachelor’s degree or higher in Computer Science or a relevant field, or a combination of education, experience, and training.
• 6+ years of experience in Data Engineering.
• 4+ years of experience with Python, specifically related to data processing.
• 3+ years of experience in Data Warehouse technologies – Databricks and Snowflake.
• Strong knowledge of ETL, data modelling, lineage, governance, partitioning and optimization, and migration.
• Proficiency in SQL, including performance tuning, stored procedures, triggers, and schema design, plus knowledge of RDBMS and NoSQL databases.

Preferred Skills:
• Experience with tools like Collibra and Hackolade.
• Knowledge of migration strategy and tooling.
• Experience with data migration tools and ensuring data consistency and validation post-migration.
• Familiarity with Agile methodologies and tools like Git, Jenkins, GitLab, Azure DevOps, Jira/Confluence.

About SSi People:

With over 26 years of industry experience, SSi People has built its reputation and expertise on putting people first. Everything we do works toward delivering an exceptional experience for our consultants, our clients, and our internal team. Through a genuine commitment to people in everything we do, we have developed refined processes and a stellar internal team to deliver talent quickly. More importantly, we focus on building long-term relationships, not transactions. Putting people first is just what we do well.

By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from SSi People, its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here: SSi People Privacy Policy