- Design and optimize PostgreSQL data models for historical capacity and lifecycle tracking
- Build and maintain robust ETL pipelines using Python for structured and semi-structured (JSON) data
- Aggregate and structure data by Region, Node Type, and time dimensions
- Support time-series analysis and capacity forecasting use cases
- Develop and enable PowerBI datasets, models, and reports based on clean, reliable data
- Ensure data quality, performance, and scalability across the pipeline
- Translate infrastructure and business requirements into scalable data solutions
- Collaborate closely with software developers and stakeholders on end-to-end data workflows
Qualifications:
- At least 5 years of experience as a Data Engineer or in a similar data-focused role
- Strong proficiency in SQL and relational databases, preferably PostgreSQL
- Solid experience with Python for data transformation and pipeline development
- Hands-on experience working with JSON and semi-structured data formats
- Proven track record of building and optimizing ETL processes
- Practical experience with PowerBI including dataset modeling and report creation
- Experience working with time-series and historical datasets
- Strong understanding of data modeling principles for analytics and forecasting
- Upper-Intermediate level of English
WILL BE A PLUS:
- Experience with Kibana or other BI/visualization tools
- Familiarity with monitoring infrastructure or capacity planning data
- Exposure to forecasting techniques or growth trend analysis
- Experience integrating data from metrics and inventory systems
Additional Information:
PERSONAL PROFILE:
- Analytical mindset, able to reason about data trends and anomalies beyond raw numbers
- Strong stakeholder collaboration skills, translating business and infrastructure needs into actionable data models and reports
- High attention to detail and data quality, with a strong sense of ownership and accountability
Remote Work:
Yes
Employment Type:
Full-time