
Data Quality Automation Engineer (DWBI & Cloud Testing)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Quality Automation Engineer focused on DWBI and Cloud Testing, based remotely or in Pleasanton, CA. Contract length and pay rate are unspecified. Key skills include GCP, SQL, automation, and Agile experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
March 31, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Strategy #QualityAssurance #Postman #ETL #Scripting #Automation #DataWarehouse #Agile #API #DataQuality #Cloud #Programming #DataIntegrity #DataEngineering #BusinessAnalysis #Python #SQL #BI #GCP
Role description

Job Title: Data Quality Automation Engineer (DWBI & Cloud Testing)

Location: Remote / Pleasanton, CA

Job Summary

Albertsons is seeking a Data Quality Automation Engineer to drive quality assurance initiatives for our Data Warehouse (DWBI) and Cloud-based solutions. The ideal candidate will have hands-on experience in GCP, SQL scripting, and automation of data validations, ensuring seamless data integrity across our platforms. This role involves developing test strategies, automating data validation processes, and working in an Agile environment to support business intelligence and data analytics solutions.

Key Responsibilities

   • Cloud Testing & Data Validation: Perform end-to-end Data Warehouse QA testing using SQL scripts for data validations on GCP.

   • SQL & Automation: Develop SQL scripts to validate data integrity, transformations, and ETL processes; automate validation tasks to improve efficiency.

   • Test Strategy & Planning: Design and implement test strategies, test plans, and test cases for data warehouse and cloud-based applications.

   • GCP Testing: Leverage Google Cloud Platform (GCP) for data validation and testing.

   • STLC & Agile: Ensure adherence to Software Testing Life Cycle (STLC) best practices and actively participate in Agile development cycles.

   • iCEDQ (if applicable): Use iCEDQ for data quality validations, particularly valuable for candidates with prior ACI project experience.

   • API & Postman Testing (Added Advantage): Conduct API validations using tools like Postman, ensuring smooth integrations between data sources.

   • Programming Knowledge (Added Advantage): Utilize Python or other programming languages to develop test automation frameworks for data validation.

   • Collaboration & Communication: Work closely with developers, data engineers, business analysts, and stakeholders to resolve data quality issues.
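The SQL and automation responsibilities above can be sketched in Python, the language the listing names for test automation. This is a minimal, hypothetical example of automating two common data warehouse checks (row-count reconciliation and NULL detection); the table names are invented, and an in-memory SQLite database stands in for the GCP warehouse a real project would target.

```python
import sqlite3

def validate_row_counts(conn, source_table, target_table):
    """Reconcile row counts between a staging table and its warehouse target."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src == tgt

def validate_no_nulls(conn, table, column):
    """Flag NULLs in a column the target schema requires to be populated."""
    cur = conn.cursor()
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return nulls == 0

# Demo: an in-memory database stands in for staging and warehouse layers.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.5), (3, 7.25);
""")

src, tgt, counts_match = validate_row_counts(conn, "stg_orders", "dw_orders")
print(f"row counts: source={src} target={tgt} match={counts_match}")
print("no NULL order_ids:", validate_no_nulls(conn, "dw_orders", "order_id"))
```

In practice, checks like these would run against GCP (for example via a BigQuery client) and be wired into a scheduled test suite, but the reconciliation logic is the same.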

Required Skills & Qualifications

   Mandatory:

   • GCP experience with hands-on testing and validation.

   • Strong SQL scripting skills for data validation.

   • Deep understanding of Database concepts & STLC principles.

   • End-to-end Data Warehousing QA expertise, including ETL testing.

   • Experience in automation of data validations.

   • Test Strategy & Test Plan creation skills.

   • Agile methodology experience.

   • Strong communication skills to interact with cross-functional teams.

   Preferred / Good to Have:

   • Experience with iCEDQ, especially for candidates who have worked with ACI.

   • API testing and validation knowledge.

   • Postman validation experience.

   • Proficiency in Python or another programming language for automation.