
Python Developer

This role is for a Python Developer / Data Analytics Software Engineer; the contract length and pay rate are not specified, and the work arrangement is hybrid. Key skills include Python, PySpark, Angular, and Google Cloud technologies. A minimum of 3 years of experience delivering customer-facing products is required, along with a Bachelor's degree in Computer Science.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 18, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dearborn, MI
🧠 - Skills detailed
#Data Transformations #GCP (Google Cloud Platform) #Azure #Big Data #PySpark #Computer Science #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Deployment #Public Cloud #Cloud #Angular #BigQuery #Agile #Python #AWS (Amazon Web Services)
Role description

Data Analytics Software Engineer

Position Overview: We are seeking a Data Analytics Software Engineer with a strong focus on data transformation. This role requires expertise in building complex analytic solutions with Python and PySpark within an Agile framework. You will be part of a compact, multidisciplinary team that collaborates closely with business stakeholders, product managers, and designers, with an emphasis on rapid and frequent deployment.
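For context, the day-to-day work described above typically resembles the minimal PySpark sketch below. It is illustrative only; the bucket paths and column names (order_ts, amount, region, customer_id) are hypothetical and not taken from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start (or reuse) a Spark session; on Dataproc this is provided by the cluster runtime.
spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

# Hypothetical input: raw order events landed as Parquet.
orders = spark.read.parquet("gs://example-bucket/raw/orders/")

# A typical transformation: filter, derive a date column, and aggregate per region and day.
daily_sales = (
    orders
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("region", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
)

# Write the result back for downstream analytics (e.g., a BigQuery load or reporting job).
daily_sales.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/daily_sales/"
)
```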

DO NOT apply if you do not have practical, professional experience in Python and Angular.

Key Responsibilities:
• Develop and implement complex analytic solutions using Python and PySpark.
• Manage intricate data transformations and apply Big Data methodologies and tools.
• Collaborate closely with business stakeholders, product managers, and designers.
• Ensure rapid and frequent deployment of solutions.

Required Skills:
• Advanced Software Engineering Expertise: Profound knowledge and application of Object-Oriented Design principles.
• Python / PySpark Development Proficiency: Demonstrated capability in developing Python / PySpark applications from conceptualization to deployment.
• Familiarity with Google Cloud Technologies: Solid understanding of Google Cloud solutions, including Dataproc, BigQuery, and Cloud Run, along with Astronomer (managed Apache Airflow) for orchestration and Angular for front-end work (see the brief query sketch after this list).
• Big Data & Analytics Acumen: Robust grasp of Big Data and Analytics, including the ability to navigate and address the challenges unique to these fields.
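As a rough illustration of the GCP analytics stack referenced above, the sketch below queries BigQuery from Python using the google-cloud-bigquery client. The project, dataset, and table names are placeholders, not details from this role.

```python
from google.cloud import bigquery

# The client picks up credentials from the environment (e.g., GOOGLE_APPLICATION_CREDENTIALS).
client = bigquery.Client(project="example-project")  # placeholder project ID

# Parameterized query against a hypothetical curated table.
query = """
    SELECT region, SUM(total_amount) AS revenue
    FROM `example-project.analytics.daily_sales`
    WHERE order_date >= @start_date
    GROUP BY region
    ORDER BY revenue DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2025-01-01"),
    ]
)

# Run the query and print one line per region.
for row in client.query(query, job_config=job_config).result():
    print(f"{row.region}: {row.revenue:,.2f}")
```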

Preferred Skills:
• Ability to architect a solution from design to delivery.
• Ability to coach and mentor peers, both in day-to-day interactions and in department-based forums.
• If not GCP, then equivalent experience with AWS or Azure is necessary.

Required Experience:
• Minimum 3 years of work experience in delivering customer-facing products.
• At least 2 years of strong development experience in Python on a Public Cloud Platform.
• Experience working in a large multi-national company with many diverse stakeholders and potentially competing objectives.

Preferred Experience:
• Experience developing software for an Automotive/Manufacturing company.

Education:
• Bachelor’s degree in Computer Science or a similar technical discipline.
• Google Cloud Certification (preferred).

Additional Information:
• Hybrid position - the candidate must be able to work on-site at the Dearborn offices.
• General coding proficiency required.
• A HackerRank test will be required.

If you are passionate about data analytics and eager to make a significant impact on product quality and value, we encourage you to apply!