Job Summary
This role focuses on designing, building, and optimizing cloud-native and hybrid software solutions that support large-scale scientific data processing workflows and archival systems. The position plays a critical role in enabling researchers worldwide to efficiently access, analyze, and manage complex datasets generated by major space science missions.
The role requires deep expertise in Python development, ETL pipeline design, and cloud infrastructure. You will work closely with scientists and engineers to deliver scalable, secure, and reliable systems that integrate cloud and on-premises environments while supporting mission-critical data operations.
Key Responsibilities
- Design, develop, and maintain cloud-native applications and hybrid architectures for large-scale data processing and archival systems
- Build and optimize ETL data pipelines using Python and workflow orchestration tools
- Integrate applications with relational databases for high-performance data storage and retrieval
- Develop and manage cloud infrastructure using Infrastructure as Code tools
- Build and maintain CI/CD pipelines to support automated testing and deployment
- Ensure system security, reliability, and compliance with organizational standards and best practices
- Collaborate with scientists and engineers to gather requirements and deliver scalable, maintainable solutions
- Troubleshoot and resolve complex issues in development and production environments
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 8 years of experience in cloud software development
- Strong proficiency in Python and SQL for application development and data processing
- Experience with cloud platforms and services, including compute, storage, and messaging components
- Experience working with relational databases such as PostgreSQL or MSSQL
- Experience designing and managing data pipelines and workflow orchestration systems
- Experience with containerization and orchestration technologies
- Hands-on experience with CI/CD tools and automated deployment workflows
- Strong problem-solving skills and the ability to work effectively in a collaborative team environment
Preferred Qualifications
- Experience with additional programming languages such as Java
- Advanced database skills, including performance tuning and optimization
- Familiarity with streaming or messaging technologies
- Experience with Infrastructure as Code tools
- Background working with scientific research or large-scale distributed data systems