

Data Quality Analyst
Required Skills & Experience
• This role requires candidates to be located in Los Angeles or Seattle.
• 3-5 years' experience in quality validation of ETL pipelines and data warehouses
• Expert SQL knowledge and experience working with relational databases, as well as working familiarity with a variety of databases
• Solid experience in data analytics, data engineering, data modeling, data warehousing, and big data platforms
• Strong programming (Scala/Java) or scripting (Python) skills
• Experience with testing frameworks such as JUnit, TestNG, BDD, or similar tools
• Experience with a range of common big data tools and technologies, such as Airflow, Hive, Snowflake, Databricks, and Spark
• Experience working with large datasets (terabytes or more)
• Ability to operate effectively in a team-oriented and collaborative environment
• Excellent communication skills and the ability to interact with all levels of end users and technical resources
Nice to Have Skills & Experience
• AWS experience
Job Description
• This role requires candidates to be located in Los Angeles or Seattle and to be comfortable working a hybrid model.
• A streaming client is hiring a Data Quality Engineer who not only understands data and has the technical background, but who also brings a quality-engineering and testing mindset and is passionate about testing. You will be responsible for building strong SQL test cases to test data, and you will use Python for testing as well. This is a technical role in which you will help build the automation process for data validation and testing while working through a backlog of work. You will be responsible for validating all streaming-related data, such as subscriber information, from the client's multiple streaming platforms. You will use automation while performing ETL validations (the ETL itself is built by a different team), so you will be responsible for validating the ETL and adhering to standards. You must also know how to write queries and test your own scripts and queries. You will also be required to attend team/scrum meetings, sprint stand-ups, and planning sessions. In addition, this team is looking for a candidate who is passionate about learning, is collaborative, and has excellent communication skills.
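For illustration only, the sketch below shows the kind of automated SQL data-validation test this role involves, written in Python with pytest. All table and column names are hypothetical, and an in-memory SQLite database stands in for the actual warehouse (e.g. Snowflake or Databricks):

    import sqlite3
    import pytest

    @pytest.fixture
    def conn():
        # Stand-in for a warehouse connection, seeded with sample subscriber rows.
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE subscribers (id INTEGER, plan TEXT, signup_date TEXT)")
        db.executemany(
            "INSERT INTO subscribers VALUES (?, ?, ?)",
            [(1, "basic", "2024-01-02"), (2, "premium", "2024-01-03")],
        )
        yield db
        db.close()

    def test_no_duplicate_subscriber_ids(conn):
        # The ETL output should contain exactly one row per subscriber id.
        dupes = conn.execute(
            "SELECT id, COUNT(*) FROM subscribers GROUP BY id HAVING COUNT(*) > 1"
        ).fetchall()
        assert dupes == [], f"duplicate subscriber ids: {dupes}"

    def test_required_fields_not_null(conn):
        # Core fields must be populated for every row the ETL produces.
        missing = conn.execute(
            "SELECT COUNT(*) FROM subscribers WHERE plan IS NULL OR signup_date IS NULL"
        ).fetchone()[0]
        assert missing == 0, f"{missing} rows with missing required fields"

Tests like these would be run against each ETL release as part of the automated validation backlog described above.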
Additional Responsibilities:
• Test and validate ETL logic and data pipelines
• Own the quality of every release into production with a data-driven approach
• Create, deliver, and continuously improve our quality processes for delivering operational data that addresses all types of subscriber and commerce operations for our stakeholders
• Partner with Data Analysts, Product, and Engineering teams to deeply understand the underlying transactional systems' behavior and business use cases
• Translate reporting and operational technical specifications, including calculations, custom groups, parameters, filtering criteria, and/or aggregations, into test requirements
• Build automated and reusable tests for data stores to improve quality and development velocity
• Diagnose issues, report defects, and propose regression tests to catch recurring bugs
• Mentor fellow Test Engineers on the team to ramp up on automation concepts and contribute toward expanding test coverage
Job Qualifications