
Business Intelligence Developer

This is a 3-month Business Intelligence Developer contract offering competitive pay. Key skills include Google BigQuery, SQL, data pipeline construction, and GCP services; Google Cloud certifications are highly desirable. The role is remote.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 12, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Downey, CA
🧠 - Skills detailed
#Data Quality #Airflow #Automation #BI (Business Intelligence) #Data Integration #Data Security #GCP (Google Cloud Platform) #Visualization #Compliance #Data Ingestion #Cloud #Data Engineering #Data Integrity #Data Extraction #Data Analysis #Data Accuracy #Documentation #Consulting #Schema Design #Data Pipeline #Data Architecture #Version Control #Programming #ETL (Extract, Transform, Load) #Dataflow #Apache Airflow #SQL (Structured Query Language) #Data Modeling #Data Manipulation #Data Processing #Storage #BigQuery #dbt (data build tool) #Security #Python
Role description

Quantam Solutions provides IT solutions and consulting for various clients. We offer a competitive hourly wage, health benefits, paid time off, and a 401(k) plan. We're currently seeking a Business Intelligence Developer for a 3-month contract engagement.

JOB DESCRIPTION:

Our client is looking to hire a Business Intelligence Developer to design, develop, and implement data-driven solutions on Google Cloud Platform (GCP), with a strong focus on BigQuery for large-scale data analysis and warehousing. The individual will work under the direction of our client’s development staff to understand business requirements, architect scalable data pipelines, optimize queries, and ensure data integrity within the GCP environment. The engagement is planned for three months and can start immediately.

KEY RESPONSIBILITIES:

Solution Architecture:

Evaluate data ingestion methods, data transformation pipelines, and data visualization techniques to optimize data analysis workflows.

Create data models and schema designs that facilitate efficient querying and data retrieval (a minimal sketch follows this list).

Ensure data security and compliance with relevant regulations by implementing appropriate access controls and encryption strategies.

Assist in reviewing and enhancing data architectures that leverage GCP services such as BigQuery, Dataflow, Cloud Storage, Cloud SQL, and Cloud Functions to meet complex data processing needs.
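
As a minimal sketch of the schema-design work above (the project, dataset, and column names are hypothetical, not taken from this posting), a partitioned and clustered BigQuery table might be created with the google-cloud-bigquery Python client so that common queries scan less data:

```python
from google.cloud import bigquery

# Hypothetical project and table names, for illustration only.
client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("amount_usd", "NUMERIC"),
]

table = bigquery.Table("example-project.sales.orders", schema=schema)

# Partition on the event timestamp and cluster on the most common filter
# column, so typical queries read only the partitions and blocks they need.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_ts",
)
table.clustering_fields = ["customer_id"]

created = client.create_table(table)
print(f"Created {created.full_table_id}")
```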

Data Pipelines & Orchestration:

dbt (Data Build Tool): Proficiency in writing dbt models (sources, models, tests, snapshots), managing dbt projects, and utilizing dbt's features (version control, documentation, testing).

Data Pipelines: Building and maintaining robust data pipelines using tools like Apache Airflow, Prefect, or similar (a sketch follows this list).

ETL/ELT Processes: Understanding and implementing data extraction, transformation, and loading processes.
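
Tying the three items above together, here is a hedged sketch of what such orchestration might look like as an Airflow DAG that stages data in BigQuery and then runs dbt. It assumes the apache-airflow-providers-google package is installed, and every ID, path, and table name is hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical pipeline
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Extract/load step: materialize one day of raw rows into a staging table.
    stage_orders = BigQueryInsertJobOperator(
        task_id="stage_orders",
        configuration={
            "query": {
                "query": (
                    "SELECT * FROM `example-project.raw.orders` "
                    "WHERE DATE(order_ts) = '{{ ds }}'"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "staging",
                    "tableId": "orders_{{ ds_nodash }}",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    # Transform step: run the dbt models and their tests for this area.
    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="cd /opt/dbt/analytics && dbt build --select orders",
    )

    stage_orders >> run_dbt
```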

Data Quality & Testing:

Data Validation: Implementing data quality checks and validations within dbt models (e.g., data type checks, uniqueness constraints, null checks).

Testing Frameworks: Utilizing dbt's testing framework or other testing tools to ensure data accuracy and integrity.
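
In dbt itself these checks are typically declared in a model's YAML (tests such as unique and not_null); as a language-consistent sketch, the same two assertions can be expressed directly in Python against BigQuery (table and column names are hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project
TABLE = "example-project.sales.orders"               # hypothetical table

# Each check returns offending rows; an empty result means the check passes,
# mirroring dbt's built-in not_null and unique tests.
CHECKS = {
    "order_id_not_null": f"SELECT order_id FROM `{TABLE}` WHERE order_id IS NULL",
    "order_id_unique": (
        f"SELECT order_id, COUNT(*) AS n FROM `{TABLE}` "
        "GROUP BY order_id HAVING n > 1"
    ),
}

failures = []
for name, sql in CHECKS.items():
    rows = list(client.query(sql).result(max_results=5))
    if rows:
        failures.append(f"{name}: sample offending row {dict(rows[0])}")

if failures:
    raise AssertionError("Data quality checks failed:\n" + "\n".join(failures))
print("All data quality checks passed.")
```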

Cloud Integration:

Integrate data from diverse sources (on-premise systems, cloud-based applications, APIs) into BigQuery using data ingestion tools and techniques (sketched after this list).

Develop data integration workflows to ensure data consistency and quality across different systems.

Leverage other GCP services like Cloud Dataflow for real-time data processing and stream analytics.
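
As one concrete (and entirely hypothetical) instance of the ingestion work above, a batch of newline-delimited JSON exported by an upstream system could be appended to BigQuery with a load job; the bucket, file, and table names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema; pin an explicit schema in production
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load an export file from Cloud Storage into a raw landing table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/orders.json",  # hypothetical URI
    "example-project.raw.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows.")
```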

Collaboration and Communication:

Work closely with data analysts, business stakeholders, and software engineers to understand data requirements and translate them into actionable insights.

Document technical design decisions and provide clear explanations of complex data concepts.

Proactively identify potential issues and propose solutions to ensure data quality and system reliability.

Technical Expertise:

Deep understanding of data warehousing principles, data modeling, and dimensional design.

Expertise in SQL and proficiency in other programming languages like Python for data manipulation, automation, and interacting with APIs (illustrated after this list).

Familiarity with data visualization tools to present insights effectively.

Knowledge of cloud computing concepts and best practices on Google Cloud Platform.
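
To ground the SQL-plus-Python expectation above, a hedged example of the day-to-day pattern: a parameterized BigQuery query issued from Python (the table and threshold are invented for illustration):

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Query parameters keep values out of the SQL string, avoiding injection
# risks and awkward quoting when scripts pass values in from elsewhere.
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("min_total", "NUMERIC", 100),
    ]
)
sql = """
    SELECT customer_id, SUM(amount_usd) AS total
    FROM `example-project.sales.orders`
    GROUP BY customer_id
    HAVING total >= @min_total
"""
for row in client.query(sql, job_config=job_config):
    print(row.customer_id, row.total)
```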

REQUIRED SKILLS & QUALIFICATIONS:
• Proven experience in designing and implementing data solutions using Google BigQuery and other GCP services.
• Strong SQL skills with expertise in complex query optimization.
• Experience with data ingestion and transformation techniques.
• Excellent problem-solving and analytical abilities.
• Ability to work independently and as part of a team.
• Google Cloud Platform certifications (e.g., Certified Professional Data Engineer, Certified Professional Cloud Architect) are highly desirable.