Lead Data Engineer

This role is for a Lead Data Engineer with 8+ years of experience in Data Engineering and Big Data concepts, specifically in the insurance domain. Contract length and pay rate are unspecified. Key skills include Snowflake, SQL, Python, Spark, and AWS.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Jersey City, NJ
🧠 - Skills detailed
#Storage #Data Migration #Data Storage #AWS Glue #Vault #Security #Spark (Apache Spark) #Computer Science #Snowflake #GIT #Python #Data Engineering #Data Vault #Programming #Data Security #SQL (Structured Query Language) #Agile #Leadership #Migration #PySpark #Data Quality #Data Ingestion #Data Modeling #Data Governance #Debugging #AWS (Amazon Web Services) #Data Processing #Aurora RDS #DevOps #Aurora #RDS (Amazon Relational Database Service) #S3 (Amazon Simple Storage Service) #Big Data #Cloud #Data Pipeline #ETL (Extract, Transform, Load) #Compliance #Jenkins
Role description

Key Skills: Snowflake, SQL, Python, Spark, AWS Glue, Big Data concepts

Insurance domain experience is a MUST, preferably with knowledge of claims and loss processes.

Responsibilities:
• Lead the design, development, and implementation of data solutions using AWS and Snowflake.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Develop and maintain data pipelines, ensuring data quality, integrity, and security (see the pipeline sketch after this list).
• Optimize data storage and retrieval processes to support data warehousing and analytics.
• Provide technical leadership and mentorship to junior data engineers.
• Work closely with stakeholders to gather requirements and deliver data-driven insights.
• Ensure compliance with industry standards and best practices in data engineering.
• Utilize knowledge of insurance, particularly claims and loss, to enhance data solutions.
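
For illustration only, here is a minimal sketch of the kind of pipeline these responsibilities describe, assuming PySpark with the Snowflake Spark connector on the classpath. Every bucket, table, and connection value below is a hypothetical placeholder, not a detail from this posting.

```python
# Hypothetical claims pipeline: read raw claims from S3, apply a simple
# data-quality gate, and load the clean rows into Snowflake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-ingest").getOrCreate()

# Ingest raw claims landed in S3 (placeholder path).
claims = spark.read.parquet("s3://example-bucket/raw/claims/")

# Data-quality gate: require a claim id and a non-negative loss amount.
clean = claims.filter(
    F.col("claim_id").isNotNull() & (F.col("loss_amount") >= 0)
)

# Load into Snowflake via the Spark connector. Options are placeholders;
# credentials are omitted, and depending on connector version the format
# may need to be spelled out as "net.snowflake.spark.snowflake".
sf_options = {
    "sfURL": "example.snowflakecomputing.com",
    "sfDatabase": "INSURANCE",
    "sfSchema": "CLAIMS",
    "sfWarehouse": "ETL_WH",
}
(clean.write
    .format("snowflake")
    .options(**sf_options)
    .option("dbtable", "CLAIMS_CLEAN")
    .mode("append")
    .save())
```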

Must have:
• 8+ years of relevant experience in Data Engineering and delivery.
• 8+ years of relevant work experience with Big Data concepts, including work on cloud implementations.
• Strong experience with SQL, Python, and PySpark.
• Good understanding of data ingestion and data processing frameworks.
• Good experience with Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, and overall AWS architecture); see the Glue sketch after this list.
• Strong aptitude, problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
• Able to code, debug, tune performance, and deploy applications to the production environment.
• Experience working in an Agile methodology.
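
As a companion to the AWS Glue item above, here is a hypothetical skeleton of a Glue ETL script in Python. The awsglue modules are only available inside the Glue runtime, and the catalog database, table, and S3 path are placeholders.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue boilerplate: resolve the job name and initialize the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a cataloged source table (placeholder database/table names).
claims = glue_context.create_dynamic_frame.from_catalog(
    database="insurance_raw", table_name="claims"
)

# Write curated output to S3 as Parquet (placeholder path).
glue_context.write_dynamic_frame.from_options(
    frame=claims,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/claims/"},
    format="parquet",
)

job.commit()
```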

Good to have:
• Experience with DevOps tools and practices (Jenkins, Git, etc.), including continuous integration and delivery (CI/CD) pipelines.
• Experience with cloud implementations, data migration, Data Vault 2.0, etc.

Requirements:
• Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
• Proven experience as a Data Engineer, with a focus on AWS and Snowflake.
• Strong understanding of data warehousing concepts and best practices.
• Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
• Experience in the insurance industry, preferably with knowledge of claims and loss processes.
• Proficiency in SQL, Python, and other relevant programming languages.
• Strong problem-solving skills and attention to detail.
• Ability to work independently and as part of a team in a fast-paced environment.

Preferred Qualifications:
• Experience with data modeling and ETL processes.
• Familiarity with data governance and data security practices.
• Certification in AWS or Snowflake is a plus.