- Design, develop, and maintain Python-based ETL pipelines for infrastructure and capacity data
- Implement robust data access layers using SQLAlchemy and psycopg2
- Integrate with monitoring and metrics systems (Prometheus / VictoriaMetrics) and inventory sources (NetBox)
- Develop aggregation and normalization logic for capacity data across regions and node types
- Persist and manage historical datasets in PostgreSQL
- Collaborate with Data Engineers and stakeholders to define forecasting and reporting requirements
- Improve the reliability, performance, and maintainability of automated pipelines
- Replace manual Excel-based workflows with scalable, auditable, and automated solutions
Qualifications:
- At least 5 years of professional experience in Python development including data processing and automation
- Strong hands-on experience with SQLAlchemy and psycopg2 for building and maintaining database interaction layers
- Solid background in working with SQL and relational databases, preferably PostgreSQL
- Practical experience with monitoring and metrics systems such as Prometheus or VictoriaMetrics
- Proven ability to take ownership of solutions from design through to production delivery
- Excellent communication skills for collaboration with both technical and non-technical stakeholders
- Upper-Intermediate level of English
WILL BE A PLUS:
- Experience with data visualization tools such as Grafana, Kibana, or Power BI
- Familiarity with NetBox or other CMDB/inventory management tools
- Knowledge of storage systems, capacity planning, or infrastructure lifecycle management
- Background in working with time-series data or forecasting-related solutions
- Proven track record in designing or maintaining ETL pipelines in production environments
- Understanding of data aggregation, normalization, and historical tracking processes
- Experience with cloud or hybrid infrastructure environments
Additional Information:
PERSONAL PROFILE:
- Self-driven and proactive
- Strong analytical and problem-solving skills
- Comfortable working in collaborative cross-functional teams
- Able to communicate complex technical ideas clearly
Remote Work: Yes
Employment Type: Full-time