

Sr. Data Engineer with GCP and Vertex AI
Role: Sr Data Engineer (GCP and Vertex AI)
Location: Remote (EST candidates preferred)
Job Type: Contract
Job Description:
• As a Sr. Data Engineer, you will have the opportunity to lead the development of innovative data solutions, enabling the effective use of data across the organization.
• You will be responsible for designing, building, and maintaining robust data pipelines and platforms to meet business objectives, focusing on data as a strategic asset.
• Your role will involve collaboration with cross-functional teams, leveraging cutting-edge technologies, and ensuring scalable, efficient, and secure data engineering practices.
• A strong emphasis will be placed on expertise in GCP, Vertex AI, and advanced feature engineering techniques.
Key Responsibilities:
• Provide Technical Leadership
• Build and Maintain Data Pipelines: Design, build, and maintain scalable, efficient, and reliable data pipelines to support data ingestion, transformation, and integration across diverse sources and destinations, using tools such as Kafka and Databricks.
• Drive Digital Innovation: Leverage innovative technologies and approaches to modernize and extend core data assets, including SQL-based, NoSQL-based, cloud-based, and real-time streaming data platforms.
• Implement Feature Engineering: Develop and manage feature engineering pipelines for machine learning workflows, utilizing tools like Vertex AI, BigQuery ML, and custom Python libraries.
• Implement Automated Testing
• Optimize Data Workflows
• Mentor Team Members
• Draft and Review Documentation
• Perform Cost/Benefit Analysis
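To illustrate the feature engineering responsibility above: in practice such transformations would run inside Vertex AI Pipelines or BigQuery ML, but the core step can be sketched as a small, standalone Python function. All names here (`Purchase`, `engineer_features`, the feature keys) are hypothetical and for illustration only.

```python
from dataclasses import dataclass
from statistics import mean
from typing import Dict, Iterable, List

# Hypothetical raw event record; field names are illustrative, not from the posting.
@dataclass
class Purchase:
    user_id: str
    amount: float

def engineer_features(events: Iterable[Purchase]) -> Dict[str, Dict[str, float]]:
    """Aggregate raw events into per-user ML features: count, total, and average spend."""
    by_user: Dict[str, List[float]] = {}
    for e in events:
        by_user.setdefault(e.user_id, []).append(e.amount)
    return {
        user: {
            "purchase_count": float(len(amounts)),
            "total_spend": sum(amounts),
            "avg_spend": mean(amounts),
        }
        for user, amounts in by_user.items()
    }

events = [Purchase("u1", 10.0), Purchase("u1", 30.0), Purchase("u2", 5.0)]
features = engineer_features(events)
print(features["u1"]["avg_spend"])  # 20.0
```

In a production pipeline the same aggregation would typically be expressed as SQL in BigQuery and the resulting features registered in a feature store for serving.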