Job Title: Data Engineer
Location: Lansing MI
Job Description:
- Lead the design and development of scalable, high-performance solutions using AWS services.
- Experience with Databricks, Elasticsearch, Kibana, and S3.
- Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
- Write clean, maintainable, and efficient code in Python/Scala.
- Experience with AWS cloud-based application development.
- Experience with Electronic Health Records (EHR) and HL7 solutions.
- Implement and manage the Elasticsearch engine for efficient data retrieval and analysis.
- Experience with data warehousing, data visualization tools, and data integrity.
- Execute the full software development life cycle (SDLC), including gathering requirements and writing functional/technical specifications for complex projects.
- Excellent knowledge of designing both logical and physical database models.
- Develop database objects, including stored procedures and functions.
- Extensive knowledge of source control tools such as Git.
- Develop software design documents and work with stakeholders for review and approval.
- Create flowcharts, screen layouts, and documentation to ensure the logical flow of system requirements.
- Experience working on large agile projects.
- Experience with or knowledge of creating CI/CD pipelines using Azure DevOps.
Skill Descriptions:
- 12 years developing complex database systems.
- 8 years with Databricks.
- 8 years using Elasticsearch and Kibana.
- 8 years using Python/Scala.
- 8 years with Oracle.
- 5 years of experience with Extract, Transform, and Load (ETL) processes and developing data pipelines.
- 5 years of experience with AWS.
- Over 5 years of experience with data warehousing, data visualization tools, and data integrity.
- Over 5 years using CMM/CMMI Level 3 methods and practices.
- Over 5 years implementing agile development processes, including test-driven development.
- Over 3 years of experience with or knowledge of creating CI/CD pipelines using Azure.
Best Regards,
Monika G
Phone: 1-
Email: