

Data Engineer
Experience: 8 - 11 Years
Location: Pittsburgh
Duration: 12 Months
Notice Period: 0 - 30 Days
Job Purpose
Design, build, and maintain scalable data management systems on Azure Databricks that meet end-user expectations. Oversee the upkeep of existing data infrastructure workflows and develop data processing pipelines using Databricks Notebooks, Spark SQL, Python, and related tools.
Key Responsibilities
• Interpret business requirements and collaborate with internal resources and application vendors.
• Design, develop, and maintain Databricks solutions and data quality rules.
• Troubleshoot and resolve data-related issues.
• Create and configure data models and data quality rules to meet customer needs.
• Work with multiple database platforms, including MSSQL and Oracle.
• Optimize PySpark/Python code and SQL queries to improve performance.
• Design and implement ETL pipelines and effective data models.
• Document data pipeline architecture and processes.
• Communicate effectively with business and technology stakeholders.
• Deliver results under tight deadlines while adhering to quality standards.
Key Competencies
• Experience: 8+ years in data engineering with expertise in Azure Databricks, MSSQL, LakeFlow, and Python.
• Proficient in creating and optimizing data pipelines using Databricks Notebooks, Spark SQL, and PySpark.
• Knowledge of Azure services like Azure Data Lake Storage and Azure SQL Data Warehouse.
• Expertise in data warehousing, ETL pipeline development, and data governance.
• Hands-on experience with data quality rules using Databricks and platforms like IDQ.
• Strong problem-solving, analytical, and organizational skills.
• Ability to work independently and collaboratively in cross-functional teams.
Skills & Requirements
• Technical Skills: Azure Databricks, PySpark, SQL (MSSQL, Spark SQL), Azure Data Lake Storage, ETL, Data Modeling and Governance, Data Warehousing, Python.
• Soft Skills: Strong communication, problem-solving, and attention to detail.