Job Title: Programmer Analyst 6 - Data Engineer
Job Location: Lansing, MI (Hybrid)
Job Type: Contract
Job Description:
- Design and develop scalable cloud solutions using AWS
- Build and maintain data pipelines and ETL processes
- Develop code using Python and Scala
- Implement and manage ElasticSearch clusters
- Work with Databricks for data engineering workflows
- Design logical and physical database models
- Develop Oracle database objects including procedures and functions
- Participate in the full software development life cycle (SDLC) from requirements to deployment
- Create technical documentation and design artifacts
- Guide and mentor development team members
- Support CI/CD pipelines using Azure DevOps
- Ensure data integrity, security, and compliance
Requirements:
- 12 years of experience developing complex database systems
- 8 years of Databricks
- 8 years of ElasticSearch and Kibana
- 8 years of Python or Scala
- 8 years of Oracle database development
- 5 years of ETL development and data pipelines
- 5 years of AWS cloud
- 5 years of data warehousing and data visualization tools
- 5 years of CMM/CMMI Level 3 development practices
- 5 years of agile development and test-driven development
Interested candidates can send their updated resumes at