

Principal Data Engineer (W2)
Job Title - Principal Data Engineer
Duration – 09 Months
Location – Indianapolis IN 46219 Hybrid - 3 days on-site
Job Description:
The Principal Data Engineer is responsible for preparing data for operational and analytical uses and delivering data engineering capabilities for the organization. The role involves building data pipelines that bring together information from different source systems; integrating, consolidating, and cleansing data; and structuring it for use in data science and data-driven applications, such as analytics or ML applications. They will make data easily accessible, strive to optimize the organization's data ecosystem, and maintain the business's identified data quality standards. The ideal candidate must understand different approaches to data architectures and applications that handle various data types (such as structured and unstructured data), as well as a variety of big data and cloud technologies, such as data ingestion and processing frameworks.
Required Skills:
- Bachelor's degree in applied mathematics, computer science, physics, or a related engineering field
- Master's degree in computer science or computer engineering a plus
- Skilled in high-level programming languages such as C#, Java, Python, R, Ruby, Scala, and SQL
- Knowledge of using ETL tools and REST-oriented APIs for creating and managing data integration jobs
- Knowledge of data warehouses and data lakes technologies and practices
- Knowledge of NoSQL databases
- Knowledge of and experience using data modeling techniques and tools
- Experience with creating and communicating technical documentation for data architectures and designs
- Knowledge of and experience in designing and developing data pipeline workflows
- Knowledge of and experience with relational databases such as MySQL, PostgreSQL, Oracle, etc.
- Knowledge of Lambda architecture for unified data pipelines for batch and real-time processing
- Knowledge of and ability to adhere to quality and governance policies
- Experience with monitoring and optimizing data performance
- Knowledge of Data Quality Management practices
- Experience using business intelligence (BI) platforms, such as Microsoft Power BI, and the ability to configure them
- Ability to work with the interactive dashboards these BI platforms provide
- Basic understanding of machine learning and data science
- Knowledge of big data technologies such as Hadoop, Spark, Hive, Redshift, and Snowflake
- Knowledge of and experience utilizing cloud-based data technologies such as Microsoft Data Factory, Synapse Data Engineering, Synapse Data Science, Synapse Data Warehousing, and Microsoft OneLake