- Maintain and enhance existing data processes to ensure reliability, performance, and scalability
- Improve and support Python-based services, and develop new ones
- Design, implement, and optimize scalable data architecture
- Drive improvements in system reliability and performance across all data workflows
- Modernize data processes by introducing new tools and technologies
- Manage data flow between local storage, AWS, and Snowflake, with a focus on performance and cost efficiency
- Lead the migration of data pipelines and services to Snowflake
- Operate and improve the data orchestration layer (Airflow) for stable operation and scalability
- Collaborate with cross-functional teams to ensure architectural alignment and adoption of best practices
Qualifications:
- Proficiency in Python for data processing and platform development
- Advanced SQL skills, with the ability to write efficient, complex queries
- Strong hands-on experience with Snowflake
- Proficiency in dbt for data transformation and modeling
- Practical experience with Airflow for orchestrating data workflows
- Excellent communication, collaboration, and problem-solving skills
- Ability to work effectively with both technical and non-technical stakeholders
- Proven ability to operate independently, take ownership, and drive initiatives to completion
- Bachelor's degree in Computer Science, Engineering, or a related field
Remote Work:
Yes
Employment Type:
Full-time