Sr. Cloud Engineer

This role is for a Sr. Cloud Engineer (13+ years of experience) with strong Node.js and TypeScript skills, AWS expertise, and proficiency in Docker/Kubernetes. The contract length is unspecified, and the pay rate is competitive. Remote work is allowed.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
520
🗓️ - Date discovered
January 16, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Georgia, United States
🧠 - Skills detailed
#TypeScript #Aurora #BI (Business Intelligence) #Scripting #Bash #Redshift #Talend #Kubernetes #Jenkins #JSON (JavaScript Object Notation) #Shell Scripting #AWS (Amazon Web Services) #Automated Testing #Data Extraction #Scala #AWS Glue #API (Application Programming Interface) #Lambda (AWS Lambda) #Oracle #Kafka (Apache Kafka) #Azure #Cloud #Python #SQL (Structured Query Language) #Deployment #Migration #Ansible #Java #MySQL #Data Engineering #Debugging #Databases #ETL (Extract, Transform, Load) #Docker #Leadership #Microservices
Role description

Looking for senior candidates with a minimum of 13+ years of experience and strong hands-on expertise.

LinkedIn profile created before 2021.

Two references required.

Candidates must have:

· Strong proficiency in Node.js and TypeScript development.

· Expertise in building secure and scalable Node.js Lambdas in an active-active multi-region AWS environment.

· High-level skills in AWS Cloud Development Kit (CDK) (see the illustrative CDK sketch after this list).

· Proficiency in setting up CloudWatch dashboards and alarms.

· Strong adherence to TDD principles in development.

· Proficiency in cloud technologies (AWS preferred; Azure and Google Cloud also relevant).

· Skilled in Python, Java, shell scripting (Bash, PowerShell), and SQL.

· Experience with streaming data and data extraction from databases (Oracle, DB2, MySQL, etc.).

· Expertise in deploying and managing infrastructure with Docker, Kubernetes, or OpenShift.

· Familiarity with scalable data extraction tools (preferred).

· Knowledge of Kafka, Aurora, AWS Glue, and Redshift (preferred).

· Understanding of data engineering, real-time streaming/event processing, and JSON parsing.

· Experience automating application deployment, continuous delivery, and integration (Jenkins, Ansible, etc.).

· Background in building microservices and API architecture.

· Strong debugging and troubleshooting skills.

· Passion for learning, flexibility, and collaboration.

· Business Intelligence/Analytics experience (preferred).
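
To illustrate the Node.js Lambda, CDK, and CloudWatch requirements above, here is a minimal TypeScript sketch assuming aws-cdk-lib v2; the stack name, handler path, runtime version, and alarm threshold are illustrative assumptions, not details from the posting. Deploying the same stack to multiple regions (with appropriate routing and data replication) would approximate the active-active setup described.

```typescript
// Hypothetical CDK stack sketch — names and paths are assumptions for illustration.
import { Stack, StackProps, Duration } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as cloudwatch from 'aws-cdk-lib/aws-cloudwatch';

export class DataApiStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Node.js Lambda; the same stack would be deployed per region for active-active.
    const fn = new lambda.Function(this, 'DataApiFn', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('dist'),
      timeout: Duration.seconds(10),
    });

    // CloudWatch alarm on Lambda errors, per the dashboards/alarms requirement.
    new cloudwatch.Alarm(this, 'DataApiErrorAlarm', {
      metric: fn.metricErrors({ period: Duration.minutes(1) }),
      threshold: 1,
      evaluationPeriods: 1,
    });
  }
}
```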

Job Description:

· Develop modular designs for data streaming, cloud transformation/migration, and API product development.

· Build data APIs and delivery services supporting operational and analytical applications.

· Create and support data-centric products across various platforms and technologies.

· Analyze technical information to produce high-quality software.

· Collaborate effectively with teammates and offer innovative solutions.

· Utilize automated testing and CI/CD processes (a minimal test-first sketch follows this list).

· Become an expert on developed products.

· Document solutions through written and diagram formats for team communication.

· Ensure code meets design goals and business needs through adherence to coding standards.

· Identify technical issues, articulate impacts, and prioritize solutions.

· Communicate proactively with team members and leadership.

· Collaborate with vendors as needed.
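
As an illustration of the data API and automated-testing items above, here is a minimal test-first sketch assuming a Node.js 18+ runtime and the built-in node:test runner; the file names, event shape, and payload are hypothetical.

```typescript
// handler.ts — hypothetical data API handler; the event shape is an assumption.
export interface ApiEvent {
  body?: string;
}

export async function handler(event: ApiEvent) {
  // Parse the incoming JSON payload and return it in a minimal API response.
  const payload = event.body ? JSON.parse(event.body) : {};
  return { statusCode: 200, body: JSON.stringify({ received: payload }) };
}
```

```typescript
// handler.test.ts — TDD-style unit test using Node's built-in test runner.
import { test } from 'node:test';
import assert from 'node:assert/strict';
import { handler } from './handler';

test('echoes the parsed JSON payload', async () => {
  const res = await handler({ body: '{"id": 42}' });
  assert.equal(res.statusCode, 200);
  assert.deepEqual(JSON.parse(res.body), { received: { id: 42 } });
});
```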

Thanks & Regards,

Alok Kumar | TALENDICA

Sr. Technical Recruiter

44 Saratoga Lane, Monroe Township, NJ 08831