
Data Quality Analyst

This role is for a Data Quality Analyst with 3-5 years of experience in validating ETL pipelines and data warehouses. Located in LA or Seattle, it offers a hybrid work model and requires expert SQL, strong programming skills (Scala/Java, Python), and familiarity with big data tools.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
672
🗓️ - Date discovered
February 22, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Greater Seattle Area
🧠 - Skills detailed
#Snowflake #Scrum #Data Modeling #Data Analysis #Databases #Datasets #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Scala #Data Quality #Scripting #SQL (Structured Query Language) #Data Pipeline #Programming #Java #Airflow #Big Data #Data Warehouse #Automation #Spark (Apache Spark) #Python #JUnit #TestNG #Databricks #Data Engineering #Regression
Role description

Required Skills & Experience
• This role requires candidates to be located in LA or Seattle
• 3-5 years' experience in quality validation of ETL pipelines and data warehouses
• Expert SQL knowledge and experience working with relational databases, as well as working familiarity with a variety of databases
• Solid experience in data analytics, data engineering, data modeling, data warehousing, and big data platforms
• Strong programming (Scala/Java) or scripting (Python) skills
• Experience with JUnit, TestNG, BDD, or similar testing tools
• Experience with a range of common big data tools and technologies such as Airflow, Hive, Snowflake, Databricks, and Spark
• Experience working with large datasets (terabytes or more)
• Ability to operate effectively in a team-oriented and collaborative environment
• Excellent communication skills and ability to interact with all levels of end users and technical resources

Nice to Have Skills & Experience

• AWS experience

Job Description
• This role requires candidates to be located in LA or Seattle and to be comfortable working a hybrid model
• A streaming client is looking to hire a Data Quality Engineer who not only understands data and has the technical background, but who also has a quality engineering and testing mindset and is passionate about testing. You will be responsible for building strong SQL test cases to test data, and will use Python for testing as well. This is a technical role in which you will help build the automation process for data validation and testing while working through a backlog of work. You will be responsible for validating all streaming-related data (subscriber info, etc.) from the client's multiple streaming platforms. You will use automation while performing ETL validations (the ETL itself is built by a different team), so you will be responsible for validating the ETL and ensuring it adheres to standards. You must also know how to write queries and test your own scripts/queries. You will be required to attend team/scrum meetings, sprint stand-ups, and planning sessions. In addition, this team is looking for a candidate who is passionate about learning, is collaborative, and has excellent communication skills.
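To make the day-to-day concrete, here is a minimal, hypothetical sketch of the kind of automated data-validation test described above, written with Python's built-in unittest framework (the Python analog of the JUnit/TestNG tools listed in the requirements). The table and column names (raw_subscribers, dw_subscribers) are invented for illustration, and an in-memory SQLite database stands in for the actual warehouse.

```python
import sqlite3
import unittest


class SubscriberEtlValidation(unittest.TestCase):
    """Hypothetical data-quality checks comparing an ETL source and target.

    Table/column names are invented; SQLite stands in for the real warehouse.
    """

    @classmethod
    def setUpClass(cls):
        cls.db = sqlite3.connect(":memory:")
        cls.db.executescript("""
            CREATE TABLE raw_subscribers (id INTEGER, plan TEXT, country TEXT);
            CREATE TABLE dw_subscribers  (id INTEGER, plan TEXT, country TEXT);
            INSERT INTO raw_subscribers VALUES (1, 'premium', 'US'), (2, 'basic', 'US');
            INSERT INTO dw_subscribers  VALUES (1, 'premium', 'US'), (2, 'basic', 'US');
        """)

    def scalar(self, sql):
        return self.db.execute(sql).fetchone()[0]

    def test_row_counts_reconcile(self):
        # The load should neither drop nor duplicate subscriber rows.
        self.assertEqual(
            self.scalar("SELECT COUNT(*) FROM raw_subscribers"),
            self.scalar("SELECT COUNT(*) FROM dw_subscribers"),
        )

    def test_no_null_business_keys(self):
        # Every loaded row must carry a subscriber id.
        self.assertEqual(
            self.scalar("SELECT COUNT(*) FROM dw_subscribers WHERE id IS NULL"), 0
        )

    def test_plan_values_in_expected_domain(self):
        # Guard against unexpected plan codes sneaking through the ETL.
        bad = self.scalar(
            "SELECT COUNT(*) FROM dw_subscribers WHERE plan NOT IN ('premium', 'basic')"
        )
        self.assertEqual(bad, 0)


if __name__ == "__main__":
    unittest.main()
```

In practice, checks like these would run against the real warehouse as part of the release pipeline rather than against seeded fixtures.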

Additional Responsibilities:
• Test and validate ETL logic and data pipelines
• Own the quality of every release into production with a data-driven approach
• Create, deliver, and continuously improve our quality processes for delivering operational data to address all types of subscriber and commerce operations for our stakeholders
• Partner with Data Analysts, Product, and Engineering teams to deeply understand the underlying transactional systems' behavior and business use cases
• Translate reporting and operational technical specifications, including calculations, custom groups, parameters, filtering criteria, and/or aggregations into test requirements
• Build automated and reusable tests for data stores to improve quality and development velocity (see the sketch after this list)
• Diagnose issues, report defects, and propose regression tests to catch recurring bugs
• Mentor fellow Test Engineers on the team to ramp up on automation concepts and contribute towards expanding test coverage
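As a rough illustration of the "automated and reusable tests for data stores" responsibility above, here is a hypothetical sketch of a table-agnostic check runner: each check is just a name plus a SQL query that returns a count of violating rows, so the same suite can be pointed at any table. The check names, table names, and queries are invented, and sqlite3 again stands in for a real warehouse connection.

```python
import sqlite3

# Each check: (name, SQL template returning a count of violating rows).
# {table} is filled in per target table, so one suite covers many data stores.
# Note: format() interpolation is only safe for trusted, hard-coded table names.
CHECKS = [
    ("no_null_ids",   "SELECT COUNT(*) FROM {table} WHERE id IS NULL"),
    ("no_dup_ids",    "SELECT COUNT(*) - COUNT(DISTINCT id) FROM {table}"),
    ("nonempty_load", "SELECT CASE WHEN COUNT(*) = 0 THEN 1 ELSE 0 END FROM {table}"),
]


def run_checks(conn, table):
    """Run every check against one table; return a list of (name, violations)."""
    results = []
    for name, sql in CHECKS:
        violations = conn.execute(sql.format(table=table)).fetchone()[0]
        results.append((name, violations))
    return results


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dw_subscribers (id INTEGER, plan TEXT);
        INSERT INTO dw_subscribers VALUES (1, 'premium'), (2, 'basic');
    """)
    for name, violations in run_checks(conn, "dw_subscribers"):
        status = "PASS" if violations == 0 else f"FAIL ({violations} violations)"
        print(f"{name}: {status}")
```

Keeping each check as data rather than code is one way to make a backlog of validations easy to extend and to reuse across stores.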

Job Qualifications