

Data Engineer
Important Notes:
• Due to the nature of this opportunity, visa sponsorship and applications through a third party/employer will not be considered (e.g., H-1B visa status)
Data Engineer (Databricks, Unity Catalog)
Optomi, in partnership with a leading management consulting firm, is seeking a Senior Data Engineer with a strong focus on Databricks and Unity Catalog to support a data modernization initiative. This role requires hands-on experience with Databricks and Unity Catalog, along with a solid background in data quality frameworks. The ideal candidate will play a pivotal role in driving data governance and quality initiatives while remaining involved in hands-on ETL development and implementation.
What the right candidate will enjoy!
• Fully remote work arrangement!
• Major career growth and professional development opportunities!
• A collaborative and innovative work environment!
• The chance to work on cutting-edge data projects that drive business value!
Responsibilities of the Right Candidate:
• Lead the development and implementation of Databricks Unity Catalog for data governance and cataloging.
• Build and deploy data quality frameworks to ensure data integrity and compliance during migration.
• Collaborate with the team on data modernization efforts, migrating data from Amazon Redshift to Databricks.
• Provide hands-on expertise in Python, SQL, and Spark for data engineering tasks.
• Ensure alignment with data governance, cataloging, and quality standards using Informatica tools (IDQ, EDC, AXON).
• Work closely with business and technical teams to deliver high-quality, scalable data solutions.
Experience of the Right Candidate:
• 7+ years of overall data engineering experience.
• 3-4 years of hands-on experience with Databricks and Unity Catalog.
• Strong understanding of data quality and governance frameworks.
• Experience in building and implementing data catalog and data quality solutions using Informatica (IDQ, EDC, AXON).
• Proficiency in Python, SQL, and Spark, plus experience working in an AWS environment.
• Ability to work both independently and as part of a team on a fast-moving project.
• Databricks certification (preferred).