
(12+) Sr. Data Engineer with Informatica

This role is for a "Sr. Data Engineer with Informatica" in San Ramon, CA, requiring 12+ years of experience. The contract is offered on W2 or C2C terms. Key skills include Python, PySpark, SQL, and Informatica. A Bachelor’s degree and experience in regulated utilities are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 15, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site (once a week)
📄 - Contract type
W2 or C2C
🔒 - Security clearance
Unknown
📍 - Location detailed
San Ramon, CA
🧠 - Skills detailed
#Project Management #Python #Data Quality #SQL Queries #Data Governance #Spark (Apache Spark) #Spark SQL #Data Engineering #Microsoft Power BI #Strategy #Data Analysis #Data Integrity #Collibra #PowerApps #BI (Business Intelligence) #Computer Science #Agile #PySpark #Data Science #Data Management #Palantir Foundry #Kanban #SQL (Structured Query Language) #Metadata #Complex Queries #SharePoint #Informatica
Role description

Job Title: Sr. Data Engineer with Informatica

Location: San Ramon, CA (Onsite – Once a Week)

Experience: 12+ Years

Tax Terms: W2 / C2C only

Job Overview

We are seeking a Sr. Data Engineer with expertise in Python, PySpark, SQL, and Informatica to support data quality improvement initiatives. The ideal candidate will work closely with business and IT teams to ensure data integrity, develop complex queries, and enhance reporting dashboards in Palantir Foundry (preferred but not mandatory).
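
For orientation only, here is a minimal, hypothetical sketch (not part of the original posting) of the kind of PySpark/SQL data quality check this role describes. The file path, table name, and column names are illustrative assumptions, not details from the job description.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("asset_data_quality_sketch").getOrCreate()

    # Hypothetical extract of electric asset records; path and columns are illustrative only
    assets = spark.read.parquet("/data/electric_assets/")
    assets.createOrReplaceTempView("assets")

    # Example data quality rule expressed in Spark SQL:
    # count records with missing identifiers or implausible install years
    failures = spark.sql("""
        SELECT
            SUM(CASE WHEN asset_id IS NULL THEN 1 ELSE 0 END) AS missing_asset_id,
            SUM(CASE WHEN install_year < 1900
                       OR install_year > year(current_date()) THEN 1 ELSE 0 END) AS bad_install_year
        FROM assets
    """)
    failures.show()

In practice, counts like these would feed the KPI dashboards mentioned below, with failing records routed back to Business Data Stewards for remediation.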

Key Responsibilities
• Partner with the Lead Program Manager to implement Data Quality Improvement Programs for PG&E’s critical electric asset data.
• Build relationships with Business Data Stewards, IT teams, and Data Quality Specialists to facilitate program execution.
• Use Informatica in the backend to pull and process data, ensuring high data quality.
• Write complex SQL queries for incoming data analysis.
• Document metadata in Collibra and integrate data into Foundry.
• Develop data quality rules and dashboards to monitor KPIs and performance goals.
• Continuously monitor and remediate data quality rule errors to maintain accuracy.
• Collaborate with Business Data Stewards to update or modify data quality rules.
• Ensure data governance standards are met by implementing data quality action plans.
• Advise Program Managers on strategy and roadmap improvements for data governance and expansion.

Required Qualifications
• Bachelor’s Degree in Data Science, Computer Science, Engineering, or related field (or equivalent work experience).
• 10+ years of relevant work experience.
• Strong Python, PySpark, SQL, and Informatica skills.
• Proficiency in MS Excel, Power BI, PowerApps, SharePoint Online.
• Excellent communication and problem-solving skills.

Preferred Qualifications
• Experience in regulated electric or gas utility asset management/operations.
• Hands-on experience with data management and data quality tools.
• Detail-oriented with the ability to manage multiple reports and processes.
• Strong organizational and executive engagement skills.
• Experience with project management tools and methodologies (Agile, Kanban, Design Thinking).
• Demonstrated ability to present to executive stakeholders.