

Data Engineer
CTG is seeking to fill a Data Engineer – Data Modeling opening for our client in Santa Clara, CA.
Location: Santa Clara, CA
Duration: 12+ month contract
Duties:
• Design, implement, and maintain robust data models to support business intelligence, analytics, and data science applications.
• Collaborate with cross-functional teams to gather requirements and translate them into scalable data solutions.
• Optimize data pipelines and workflows using Python/PySpark, ensuring efficiency and data quality.
• Work extensively with Google Cloud Platform (GCP), with a focus on BigQuery for data storage, querying, and processing.
• Develop and maintain ETL/ELT processes and ensure data consistency across multiple systems.
• Leverage advanced SQL to manipulate, analyze, and model large datasets.
• Provide technical expertise and support for projects integrating data from ERP systems.
• Monitor and troubleshoot data systems and pipelines, ensuring reliability and performance.
Skills:
• Advanced SQL and data modeling skills (8+ years’ experience).
• Proficient in Python and PySpark, particularly for data engineering applications.
• Strong experience with Google Cloud Platform, especially BigQuery.
• Working knowledge of ERP systems and data structures.
• Strong problem-solving skills and attention to detail.
• Ability to work independently and in a team-oriented environment.
Experience:
• More than 8 years of experience in data engineering, with a strong focus on data modeling and architecture.
• Proven track record of designing scalable data models and integrating complex data systems.
• Hands-on experience working with cloud-based data platforms and tools (preferably GCP).
Education:
• Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field (Master’s preferred).