

DevOps/Data Engineer
Swoon is partnering with a leading global airline to find an experienced Data/DevOps Engineer for an exciting initial 8-month contract opportunity. This hybrid role, based in downtown Chicago, offers the chance to work with a high-impact team that’s modernizing enterprise data infrastructure and enabling cutting-edge analytics at scale. With a strong potential for extension or conversion based on performance, this is a great opportunity to join a world-class organization driving innovation in the aviation industry.
As a Data/DevOps Engineer, you’ll play a key role in supporting large-scale data pipelines, building infrastructure with Terraform, and leveraging modern cloud technologies such as AWS, Python, and CI/CD pipelines. You'll work closely with cross-functional teams to develop production-ready solutions that enable operational excellence and actionable insights across the business.
If you’re passionate about cloud data engineering, thrive in fast-paced environments, and bring expertise in AWS, DevOps practices, and data transformation, apply today to join a collaborative and forward-thinking team on the forefront of digital transformation in the airline industry!
Here are the details:
Location: Hybrid downtown Chicago, IL. Team goes into the office 3-4x per week
Duration: Initial 8-month contract with high potential to extend/convert based on performance
Pay Rate: $65-70/hr W2; open to C2C and self-incorporated candidates as well
Job #: 15229, 15122
Top 5 Skillsets:
• DevOps
• AWS Cloud
• Terraform
• Python
• CI/CD pipelines
Nice to have skills or certifications:
• Blue-Green deployments
• Kubernetes
• Ansible Playbooks
Work Location(s):
Chicago Downtown
Interview Process:
Two interviews: the first via video, followed by an in-person interview
Overview/Summary:
The Product Analytics team is on a transformational journey to unlock the full potential of enterprise data, build a dynamic, diverse, and inclusive culture, and develop a modern cloud-based data lake architecture that scales our applications and drives growth through data and machine learning. Our objective is to enable the enterprise to unleash the potential of its data through innovation and agile thinking, and to execute an effective data strategy that transforms business processes, rapidly accelerates time to market, and enables insightful decision making.
Job Overview and Responsibilities:
In this role you will partner with various teams to define and execute data acquisition, storage, transformation, and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. This role requires expertise in the company’s data sources and technology, business intuition, and a working knowledge of data transformation and analytical tools.
• Support large scale data pipelines in a distributed and scalable environment
• Enable and optimize production AWS environment for data infrastructure and frameworks
• Create Terraform modules to automate deployments
• Apply knowledge of Databricks and data lake technologies
• Partner with development teams and other department leaders/stakeholders to provide cutting edge technical solutions that enable business capabilities
• Participate in and lead the design and development of innovative batch and streaming data applications using AWS technologies
• Provide the team with technical direction, define the approach to be taken, and guide them in resolving queries and issues
• AWS Certification
• Knowledge: Python, Bash scripting, PySpark, AWS Services (Airflow, Glue, Lambda, others), Terraform, Databricks
• Skills: Thorough troubleshooter, hands-on AWS technology leader, strong interpersonal skills, and the ability to see undertakings through to completion
• Ability: Solve problems under pressure and in constrained scenarios, demonstrate leadership, and exercise sound judgment
• Must be fluent in English (written and spoken)
• Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners
• Ability to manage multiple short- and long-term deliverables in a fast-paced, demanding environment while staying flexible to shifting needs and priorities
• Manage agile development and delivery by collaborating with project manager, product owner and development leads
Required:
• Bachelor's degree in a quantitative field (statistics, software engineering, business analytics, information systems, aviation management, or a related degree)
• 5+ years of experience in a data engineering or ETL development role
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Strong analytic skills related to working with structured, semi-structured, and unstructured datasets
• Experience with BigQuery, SQL Server, etc.
• Experience with AWS cloud services: Redshift, S3, Athena, etc.
• Experience with SQL and various database interface tools: SSMS, Oracle SQL Developer, etc.
• Passionate about solving problems through data and analytics, and creating data products including data models
• Strong initiative to take ownership of data-focused projects, get involved in the details of validation and testing, as well as provide a business user perspective to their work
• Ability to communicate complex quantitative concepts in a clear, precise, and actionable manner
• Proven proficiency with Microsoft Excel and PowerPoint
• Strong problem-solving skills, using data to tackle problems
• Outstanding writing, communication, and presentation skills
Preferred:
• Master's degree
• Experience with Quantum Metric and Akamai
• Experience with languages: Python, R, etc.
• Strong experience with continuous integration & delivery using Agile methodologies
• Data engineering experience with transportation/airline industry