

Backend ETL Developer (W2 Only)
Hi,
Hope you are doing great!
Job: Backend ETL Developer (W2 only)
Location: Denver, Colorado (onsite)
Type: Contract
Key Responsibilities:
• Design, develop, and optimize ETL processes for extracting data from various sources, transforming it, and loading it into databases, data warehouses, or data lakes.
• Collaborate with data engineers, analysts, and other stakeholders to understand data requirements and ensure efficient data flows.
• Develop and maintain backend systems, including API integrations, to support data transfer and processing.
• Ensure data integrity, consistency, and quality throughout the ETL pipeline.
• Monitor and troubleshoot ETL processes to identify and resolve issues promptly.
• Write efficient SQL queries for data extraction, transformation, and reporting purposes.
• Optimize database queries and ETL jobs for performance and scalability.
• Collaborate with cloud architects to build scalable cloud-based ETL pipelines.
• Design and implement robust data integration solutions to enable real-time data processing when required.
• Document processes, workflows, and ETL pipeline configurations for future reference and troubleshooting.
• Work closely with the DevOps team to ensure proper deployment, version control, and environment management for ETL pipelines.
Required Skills and Qualifications:
• Proven experience as an ETL Developer, Data Engineer, or Backend Developer, working with ETL processes, data integration, and backend development.
• Strong programming skills in languages such as Python, Java, or Scala.
• In-depth knowledge of ETL tools such as Apache NiFi, Talend, Informatica, or custom-built solutions.
• Experience with relational databases (SQL, PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra).
• Strong experience with SQL and database query optimization.
• Familiarity with data warehousing concepts (e.g., Star Schema, Snowflake Schema) and tools like Amazon Redshift, Snowflake, or Google BigQuery.
• Experience working with cloud platforms (AWS, Azure, GCP) for ETL and data processing.
• Knowledge of data transformation languages like SQL, Python (Pandas), and ETL-specific languages.
• Experience with API integration and working with RESTful APIs.
• Familiarity with version control systems such as Git.
• Strong problem-solving skills and ability to debug and troubleshoot complex issues.
• Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
• Elasticsearch modeling and operations (insert/update/delete), writing queries, and converting standard SQL into Elasticsearch queries.
• Tuning Elasticsearch queries and near-real-time experience is a plus.

Thanks & Warm Regards,
Ashok Kumar
Tanisha Systems Inc.
99 Wood Ave South, Suite # 308, Iselin, NJ 08830
Desk: (732) 746-0367, Ext. 603
Email: Ashok.Kumar@tanishasystems.com