
Data Engineer

This role is for a Data Engineer with 6+ years of experience in Data Engineering, proficient in Python and Data Warehouse technologies such as Databricks and Snowflake. Contract length, pay rate, and location are unknown.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 12, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Beaverton, OR
🧠 - Skills detailed
#Database Systems #ADF (Azure Data Factory) #Airflow #Automation #Azure DevOps #Computer Science #Libraries #Data Storage #Jira #Consul #Apache Spark #Compliance #DynamoDB #Database Schema #JSON (JavaScript Object Notation) #Delta Lake #Cloud #Data Engineering #Data Integrity #Database Design #GitLab #Migration #Redis #NoSQL #Pandas #Consulting #Azure Data Factory #Data Architecture #Database Management #Jenkins #Agile #Data Migration #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Apache Airflow #Databricks #SQL (Structured Query Language) #Quality Assurance #Databases #Collibra #Talend #Data Processing #Storage #Data Warehouse #MySQL #MongoDB #Alteryx #NumPy #Security #Snowflake #AWS Glue #DevOps #PySpark #Python #Azure #RDBMS (Relational Database Management System) #Spark (Apache Spark) #GIT
Role description

We are seeking a Data Engineer to join our team. In this role, you will establish and manage database systems, ensuring the highest quality standards for data deliverables. Your expertise will guide database design, capacity planning, and security policies, and you will be responsible for designing data architecture that optimally supports our systems.

Responsibilities:
• Establish and document database management systems, including conceptual design, logical database structure, and data maintenance plans.
• Define and enforce guidelines and quality assurance standards for database deliverables.
• Code complex programs and develop logical processes across technical platforms.
• Build windows, screens, and reports while assisting with user interface and business application prototype design.
• Participate in quality assurance, developing test application code for client-server environments.
• Provide expertise in defining, negotiating, and defending database schema, tables, and field structures.
• Adapt business requirements and develop database specifications, tables, and element attributes for applications.
• Evaluate, install, and optimize database management systems, and help develop data storage strategies.
• Work with cross-functional teams to ensure data integrity, performance, and compliance.
• Troubleshoot and perform root-cause analysis for complex data issues.
• Work with stakeholders to ensure the appropriate organization of relational and NoSQL databases.

Required Skills and Qualifications:
• Bachelor's degree in Computer Science or related field (or equivalent experience).
• 6+ years of experience in Data Engineering.
• 4+ years of experience with Python for data processing, including proficiency with libraries like Pandas, NumPy, PySpark, PyOdbc, PyMsSQL, Requests, Boto3, SimpleSalesforce, and JSON.
• 3+ years of experience with Data Warehouse technologies (Databricks and Snowflake).
• Strong Data Engineering Fundamentals, including ETL, Modeling, Lineage, Governance, Partitioning, and Optimization.
• Deep knowledge of Databricks, including Apache Spark, DB SQL, Delta Lake, Unity Catalog, RBAC, Workflows, and compliance frameworks.
• Experience with RDBMS (e.g., MSSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB, Redis).
• Cloud Platform Expertise in AWS and/or Azure.
• Experience with ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, and Alteryx.
• Experience with Agile methodologies, Git, Jenkins, GitLab, Azure DevOps, Jira, and Confluence.
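As a rough illustration of the Python data-processing skills listed above (Pandas, NumPy, JSON handling), a minimal transform step might look like the following sketch; the record fields and function name are hypothetical, not taken from the role:

```python
import json

import numpy as np
import pandas as pd


def transform_orders(raw_json: str) -> pd.DataFrame:
    """Parse a JSON payload of order records, clean them, and derive a
    total column. Field names here are hypothetical examples."""
    records = json.loads(raw_json)
    df = pd.DataFrame(records)
    # Coerce the key column to numeric and drop rows missing it.
    df["order_id"] = pd.to_numeric(df["order_id"], errors="coerce")
    df = df.dropna(subset=["order_id"])
    # Derive a line total with vectorized NumPy arithmetic.
    df["total"] = np.round(df["quantity"] * df["unit_price"], 2)
    return df


payload = (
    '[{"order_id": 1, "quantity": 3, "unit_price": 9.99},'
    ' {"order_id": null, "quantity": 1, "unit_price": 5.0}]'
)
clean = transform_orders(payload)
print(len(clean), float(clean["total"].iloc[0]))  # 1 29.97
```

In a real pipeline the same pattern would typically run inside PySpark or an orchestrator such as Airflow rather than plain Pandas.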

Nice to Have:
• Familiarity with Collibra and Hackolade tools.
• Experience with Data Migration Tools and frameworks, particularly for automating data migrations from Snowflake to Databricks.
• Testing and validation experience to ensure data consistency post-migration using strategies like checksums, row counts, and query performance benchmarks.
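The row-count and checksum validation strategy mentioned above can be sketched as follows; in-memory tuples stand in for real Snowflake and Databricks extracts, which are assumptions here:

```python
import hashlib


def table_checksum(rows) -> str:
    """Order-independent table checksum: serialize each row, sort the
    serialized rows, then hash the concatenation."""
    serialized = sorted(repr(tuple(r)) for r in rows)
    return hashlib.md5("\n".join(serialized).encode()).hexdigest()


def validate_migration(source_rows, target_rows):
    """Compare row counts first (cheap), then checksums (thorough)."""
    if len(source_rows) != len(target_rows):
        return False, "row count mismatch"
    if table_checksum(source_rows) != table_checksum(target_rows):
        return False, "checksum mismatch"
    return True, "ok"


# Simulated source (e.g. Snowflake) and target (e.g. Databricks) extracts;
# same data, different physical order, so validation should pass.
source = [(1, "a"), (2, "b"), (3, "c")]
target = [(3, "c"), (1, "a"), (2, "b")]
print(validate_migration(source, target))  # (True, 'ok')
```

Sorting before hashing makes the checksum insensitive to row order, which matters because the two engines will generally not return rows in the same order; query performance benchmarks would be layered on separately.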

About BrickRed Systems:

BrickRed Systems is a global leader in next-generation technology, consulting, and business process services. We enable clients to navigate their digital transformation. BrickRed Systems delivers a range of consulting services to clients across multiple industries around the world. Our practices employ highly skilled and experienced individuals with a client-centric passion for innovation and delivery excellence.

With ISO 27001 and ISO 9001 certifications and over a decade of experience managing the systems and workings of global enterprises, we harness the power of cognitive computing, hyper-automation, robotics, cloud, analytics, and emerging technologies to help our clients adapt to the digital world and succeed. Our always-on learning agenda drives their continuous improvement by building and transferring digital skills, expertise, and ideas from our innovation ecosystem.