

Data Engineer 3
Role: Data Engineer
Duration: 6+ Months
Location: Santa Clara, CA, 95051 (Hybrid)
Only W2 acceptable
No C2C
No 1099
No C2H
Description
We are seeking a skilled Data Application Engineer to design, build, and maintain data-driven applications and pipelines that enable seamless data integration, transformation, and delivery across systems. The ideal candidate will have a strong foundation in software engineering, database technologies, and cloud data platforms, with a focus on building scalable, robust, and efficient data applications.
Key Responsibilities
• Develop Data Applications: Build and maintain data-centric applications, tools, and APIs to enable real-time and batch data processing.
• Data Integration: Design and implement data ingestion pipelines, integrating data from various sources such as databases, APIs, and file systems.
• Data Transformation: Create reusable ETL/ELT pipelines to process and transform raw data into consumable formats using tools like Snowflake, DBT, or Python.
• Collaboration: Work closely with analysts and stakeholders to understand requirements and translate them into scalable solutions.
• Documentation: Maintain comprehensive documentation for data applications, workflows, and processes.
Required Skills and Qualifications
• Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
• Programming: Proficiency in Python, C#, and ASP.NET (Core).
• Databases: Strong understanding of SQL and database design, with experience in relational databases (e.g., Snowflake, SQL Server).
• Data Tools: Hands-on experience with ETL/ELT tools and frameworks such as Apache Airflow (DBT is nice to have).
• Cloud Platforms: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud, and their data services (e.g., S3, AWS Lambda).
• Data Pipelines: Experience with real-time data processing tools (e.g., Kafka, Spark) and batch data processing.
• APIs: Experience designing and integrating RESTful APIs for data access and application communication.
• Version Control: Knowledge of version control systems like Git for code management.
• Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.
Preferred Skills
• Knowledge of containerization tools like Docker and orchestration platforms like Kubernetes.
• Experience with BI tools like Tableau, Power BI, or Looker.
Soft Skills
• Excellent communication and collaboration skills to work effectively in cross-functional teams.
• Ability to prioritize tasks and manage projects in a fast-paced environment.
• Strong attention to detail and commitment to delivering high-quality results.
Job #: 25-24359