

Data/ML Engineer
Hiring: Data/ML Engineer – San Francisco, CA (Hybrid, 2 Days Onsite) | 12-Month Contract
We are looking for a Data/ML Engineer to join our team in San Francisco, CA. This role is hybrid (onsite two days per week preferred) and offers a 12-month contract opportunity. If you're passionate about big data processing, cloud technologies, and MLOps, we’d love to hear from you!
🔹 What You’ll Do:
✅ Design and develop ETL/ELT pipelines for structured and unstructured data.
✅ Build scalable data architectures on AWS, GCP, or Azure.
✅ Optimize machine learning pipelines for training, validation, and deployment.
✅ Work with data scientists to productionize ML models (MLflow, TensorFlow, PyTorch, Scikit-learn).
✅ Implement MLOps best practices, including CI/CD pipelines for model deployment.
✅ Develop real-time data streaming solutions (Kafka, Kinesis, Flink).
✅ Automate workflows using Airflow, Prefect, or Dagster.
✅ Ensure data quality, governance, and compliance with industry standards.
✅ Monitor model performance and manage retraining pipelines.
🔹 What We’re Looking For:
✔ 3-7 years of experience in Data Engineering/ML Engineering.
✔ Strong coding skills in Python and SQL (Scala/Java a plus).
✔ Expertise in big data frameworks (Spark, Hadoop, Dask).
✔ Experience with ML frameworks (TensorFlow, PyTorch, Scikit-learn).
✔ Hands-on with containerization (Docker, Kubernetes).
✔ Familiarity with feature engineering, feature stores (e.g., Feast), and vector databases (e.g., Pinecone).
✔ Proficiency in MLOps tools (Kubeflow, MLflow, SageMaker, Vertex AI).
✔ Cloud experience (AWS S3, Lambda, SageMaker | GCP BigQuery, Vertex AI | Azure Synapse, ML Studio).
✔ Experience with monitoring/logging tools (Prometheus, Grafana, ELK Stack).
🔹 Bonus Points If You Have:
⭐ Experience in Retail, Finance, Healthcare, or E-commerce.
⭐ Exposure to A/B testing, recommendation systems, NLP applications.
⭐ Understanding of data privacy regulations (GDPR, CCPA).