

Databricks Engineer
NO C2C; W2 ONLY!
MUST HAVE DATABRICKS MIGRATION EXPERIENCE!
ONSITE IN BEAVERTON, OREGON, 3 DAYS A WEEK!
Requirements:
• Bachelor’s degree or equivalent education, experience, and training in Computer Science.
• 6+ years of experience in Data Engineering.
• 4+ years working with Python for data processing, with proficiency in libraries such as Pandas, NumPy, PySpark, pyodbc, pymssql, Requests, Boto3, simple-salesforce, and the standard json module.
• 3+ years of experience with Data Warehouse technologies, including Databricks and Snowflake.
• Strong fundamentals in Data Engineering (ETL, modeling, lineage, governance, partitioning & optimization, migration).
• Expertise in Databricks, including Apache Spark, Databricks SQL, Delta Lake, Delta Sharing, Notebooks, Workflows, RBAC, Unity Catalog, and encryption & compliance (see the brief sketch after this list).
• Proficient in SQL, with a focus on performance, stored procedures, triggers, and schema design; experienced with both RDBMS (MSSQL, MySQL) and NoSQL databases (DynamoDB, MongoDB, Redis).
• Cloud platform expertise in AWS and/or Azure.
• Experience with ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, or Alteryx.
• Strong coding skills and knowledge of architectural design patterns.
• Passion for troubleshooting, investigation, and root-cause analysis.
• Excellent written and verbal communication skills.
• Ability to multitask in a high-energy environment.
• Familiarity with Agile methodologies and with tools such as Git, Jenkins, GitLab, Azure DevOps, Jira, and Confluence.
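For illustration, a minimal sketch of the kind of Python and Delta Lake work described above. It assumes a Databricks cluster (or local PySpark with the delta-spark package) and a Unity Catalog three-level namespace; the S3 path, table name, and column names are hypothetical.

```python
# Minimal sketch only: the path, table, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw order events and deduplicate on the business key.
raw = spark.read.json("s3://example-bucket/raw/orders/")  # hypothetical path
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write a partitioned Delta table registered in Unity Catalog
# (catalog.schema.table namespace assumed).
(
    orders.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("main.sales.orders")
)
```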
Preferred Skills:
• Experience with tools like Collibra and Hackolade.
• Familiarity with data migration strategies and tooling.
• Experience with data migration tools or custom-built solutions for moving data from Snowflake to Databricks.
• Experience with post-migration testing and validation, including strategies such as checksums, row counts, and query performance benchmarks (see the sketch below).
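For the validation bullet above, a minimal sketch of a row-count and checksum comparison in PySpark. How the source and target DataFrames are loaded (Snowflake connector options, table names) is environment-specific, and the table and column names here are hypothetical.

```python
# Minimal post-migration validation sketch: compares row counts and an
# order-independent checksum between source and target tables.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("migration_validation").getOrCreate()

source_df = spark.table("snowflake_staging.sales.orders")  # hypothetical source
target_df = spark.table("main.sales.orders")               # hypothetical target

def summarize(df, key_cols):
    """Row count plus a checksum built from a 64-bit hash of the key columns."""
    return df.select(
        F.count(F.lit(1)).alias("row_count"),
        F.sum(F.xxhash64(*key_cols).cast("decimal(38,0)")).alias("checksum"),
    ).first()

cols = ["order_id", "order_ts", "amount"]  # hypothetical comparison columns
src, tgt = summarize(source_df, cols), summarize(target_df, cols)

assert src["row_count"] == tgt["row_count"], "row counts differ"
assert src["checksum"] == tgt["checksum"], "checksums differ"
print("row counts and checksums match")
```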