

Senior Data Analyst
Build the Future of Data – Join our Team!
Our Data Cloud is transforming how businesses discover, integrate, and leverage data products. Through our Marketplace, organizations can seamlessly access valuable datasets, applications, and AI models.
At the heart of this ecosystem is Secure Data Sharing, enabling providers and consumers to publish, discover, and monetize data and services. Instead of relying on traditional delivery methods like files and APIs, providers leverage Data Sharing to distribute their data efficiently. Marketplace and Data Sharing are central to our vision, fueling the network effect of our Data Cloud.
As a temporary worker on the Collaboration & Marketplace team, you will play a key role in shaping our internal content arm—curating public data into scalable, high-quality products that enhance our customers’ analytics capabilities. Public datasets—spanning finance, economics, government, and weather—are critical for businesses across industries.
As a Data Analyst, you will be responsible for monitoring the public data pipelines daily and helping address and fix errors promptly, so customers can rely on this data for their business operations.
Responsibilities:
• Monitor data pipelines for failures (e.g., formatting errors, source outages, duplicate data).
• Investigate errors by analyzing logs, GitHub repositories, provider websites, and SQL queries to pinpoint issues.
• Engage with public data providers when issues arise on their end, ensuring swift resolution.
• Proactively enhance data pipelines to strengthen data integrity and reduce errors.
• Maintain strict SLAs (e.g., 4-hour response time) and aim for a 0% error rate at the end of each day.
• Collaborate with the Data Engineering team in daily scrum meetings.
• Identify and implement process improvements to enhance efficiency and reliability.
Requirements:
Education & Experience:
• Bachelor’s degree or equivalent practical experience.
• 5+ years of experience in Data Analysis, Software Engineering, Data Engineering, or Data Quality Operations.
Technical / Analytical Skills:
• Advanced SQL proficiency (joins, subqueries, constraints), used daily.
• Experience creating, maintaining, and troubleshooting data pipelines written in Python and SQL, leveraging GitHub, dbt, Spark, or equivalent tools.
• Familiarity with Cloud and SaaS environments is a plus.
Communication & Problem-Solving:
• Strong written and verbal communication skills.
• Experience working directly with customers to resolve data issues.
• Proactive problem solver who takes initiative, identifies solutions, and navigates fast-paced environments effectively.
Additional Preferences:
• Proven track record of meeting SLAs and quality targets.
• Experience or familiarity with the financial services industry or related domains is a plus.
Compensation:
• Up to $79.31/hr. (W2)