- Research new technologies and design complex, secure, scalable, and reliable solutions, with a focus on enhancing ETL processes
- Work with the modern data stack to deliver well-designed technical solutions
- Implement data governance practices
- Collaborate effectively with customer teams
- Take ownership of major solution components and their delivery
- Participate in requirements gathering and propose architecture approaches
- Lead data architecture implementation
- Develop core modules and scalable systems
- Conduct code reviews and write unit/integration tests
- Scale distributed systems and infrastructure
- Build/enhance data platforms leveraging AWS or Azure
Qualifications:
- 5 years of experience with Python and SQL
- Hands-on experience with AWS services (API Gateway, Kinesis, Athena, RDS, Aurora)
- Proven experience building ETL pipelines for analytics/internal operations
- Experience developing and integrating APIs
- Solid understanding of Linux OS
- Familiarity with distributed applications and DevOps tools
- Strong troubleshooting/debugging skills
- English level: Upper-Intermediate
WILL BE A PLUS:
- 2 years with Hadoop, Spark, or Airflow
- Experience with DAGs/orchestration tools
- Experience with Snowflake-based data warehouses
- Experience developing event-driven data pipelines
Additional Information:
PERSONAL PROFILE
- Strong communication skills
- Interest in dynamic research-focused environments
- Passion for innovation and continuous improvement
Remote Work:
No
Employment Type:
Full-time