

Big Data Developer
Title: Senior Big Data Engineer
Location: San Francisco, CA - Hybrid
Duration: 12 Months
Mandatory Skills: Either Java or Scala, SQL and NoSQL, Apache Spark, Cloud Platforms
Data Engineer Responsibilities:
• Perform system development and maintenance activities for the team to meet service-level agreements, creating solutions that are innovative, cost-effective, high quality, and fast to market.
• Support code versioning and code deployments for data pipelines.
• Ensure unit test coverage and support integration and performance testing.
• Contribute ideas to help ensure that required standards and processes are in place.
• Research and evaluate current and upcoming technologies and frameworks.
• Build data expertise and own data quality for your areas.
Minimum Qualifications:
• Strong Java/Scala development experience.
• Strong SQL and NoSQL experience (handling structured and unstructured data).
• Extensive experience with the Apache Spark processing engine.
• Experience with big data and streaming tools/technologies (Hive, Impala, Oozie, Airflow, NiFi, Kafka).
• Experience with data modeling.
• Experience analyzing data to discover opportunities and address gaps.
• Experience working with a cloud or on-prem big data platform (e.g., Netezza, Teradata, AWS Redshift, Google BigQuery, Azure SQL Data Warehouse, or similar).
• Programming experience in Python (nice to have, not mandatory).