
Data Engineer

This role is for a Data Engineer in Reston, VA / Plano, TX; the contract length and pay rate are unspecified. It requires 7+ years with AWS, 5+ years with Python/SQL, and experience with Big Data and Agile teams.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 15, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Plano, TX
🧠 - Skills detailed
#Big Data #Python #S3 (Amazon Simple Storage Service) #Spark (Apache Spark) #AWS S3 (Amazon Simple Storage Service) #Data Mining #Debugging #EC2 #Data Engineering #Lambda (AWS Lambda) #RDS (Amazon Relational Database Service) #Athena #Scala #Cloud #Agile #Oracle #PySpark #Programming #Web Services #MySQL #Data Modeling #SNS (Simple Notification Service) #AWS EMR (Amazon Elastic MapReduce) #ML (Machine Learning) #Batch #MongoDB #Kanban #SQL (Structured Query Language) #Redshift #Aurora #Databases #Security #Scrum #AWS (Amazon Web Services) #Informatica #Hadoop
Role description

NOTE: ONSITE INTERVIEW IS REQUIRED

Location: Reston, VA / Plano, TX (in-person interview; 2 days per week hybrid)

Minimum Required Experience:

7+ years of recent experience building and deploying applications in AWS (S3, Hive, Glue, AWS Batch, DynamoDB, Redshift, AWS EMR, CloudWatch, RDS, Lambda, SNS, SWS, etc.)

5+ years of Python, SQL, SparkSQL, PySpark

Excellent problem-solving skills and strong verbal & written communication skills

Ability to work independently as well as part of an agile team (Scrum / Kanban)

Identify customer needs and intended use of requested data in the development of database requirements and support the planning and engineering of enterprise databases.

Maintain comprehensive knowledge of database technologies, programming languages, and computer systems.

Support the integration of data into readily available formats while maintaining existing structures and governing their use according to business requirements.

Analyze new data sources and monitor the performance, scalability, and security of data.

Create an initial analysis and deliver the user interface (UI) to the customer to enable further analysis.

4+ years’ experience with Big Data Hadoop clusters, AWS, and Python

Knowledge of Spark streaming technologies

Experience in working with agile development teams

Familiarity with Hadoop / Spark information architecture, Data Modeling, Machine Learning (ML)

Knowledge of Environmental, Social, and Corporate Governance (ESG)
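The SQL/SparkSQL requirement above is the core of the day-to-day work. As an illustrative sketch only (not part of the posting), the kind of aggregation query involved can be shown with Python's standard-library sqlite3 module; SparkSQL shares most of this syntax, but would run against a Spark DataFrame registered as a temp view rather than a SQLite table:

```python
import sqlite3

# Hypothetical sample data -- table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)

# Aggregate spend per user -- the shape of query typical in data-engineering work.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('a', 15.0), ('b', 7.5)]
```

In PySpark the same statement would be passed to `spark.sql(...)` after `df.createOrReplaceTempView("events")`.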

Skills

Skilled in cloud technologies and cloud computing

Programming skills, including coding, debugging, and using relevant programming languages

Experience analyzing data to identify trends or relationships and to inform conclusions about the data

Skilled in creating and managing databases with the use of relevant software such as MySQL, Hadoop, or MongoDB

Skilled in discovering patterns in large data sets with the use of relevant software such as Oracle Data Mining or Informatica

Experience using software and computer systems' architectural principles to integrate enterprise computer applications such as xMatters, AWS Application Integration, or WebSphere

Working respectfully and cooperatively with people of different functional expertise toward a common goal

Communication skills, including written and verbal communication, copywriting, and planning and distributing communications

Tools

Skilled in AWS Analytics such as Athena, EMR, or Glue

Skilled in AWS Database products such as Neptune, RDS, Redshift, or Aurora

Skilled in SQL

Skilled in AWS Compute such as EC2, Lambda, Beanstalk, or ECS

Skilled in Amazon Web Services (AWS) offerings, development, and networking platforms

Skilled in AWS Management and Governance suite of products such as CloudTrail, CloudWatch, or Systems Manager

Skilled in Python object-oriented programming
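To make the final bullet concrete: in a data-engineering context, "Python object-oriented programming" often means structuring pipeline stages as small, composable classes. A minimal, hypothetical sketch (the `Pipeline` class and its names are invented for illustration, not taken from the posting):

```python
from dataclasses import dataclass, field

@dataclass
class Pipeline:
    """Hypothetical example: chain simple per-record transform steps."""
    steps: list = field(default_factory=list)

    def add(self, fn):
        # Register a transform and return self to allow fluent chaining.
        self.steps.append(fn)
        return self

    def run(self, records):
        # Apply each registered transform to every record, in order.
        for fn in self.steps:
            records = [fn(r) for r in records]
        return records

# Usage: strip whitespace, then uppercase each record.
p = Pipeline().add(str.strip).add(str.upper)
result = p.run(["  raw ", "data"])
print(result)  # ['RAW', 'DATA']
```

The same pattern scales to real pipelines by swapping the list of records for a Spark DataFrame and the transforms for DataFrame operations.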