- Design and implement a scalable data warehouse or data lakehouse to support analytics, reporting, and business KPIs
- Develop and maintain reliable batch and/or streaming data pipelines from internal databases and external systems
- Collaborate with stakeholders to translate business requirements into efficient data models and schemas
- Establish and maintain data modeling standards and best practices
- Implement monitoring, data quality controls, and observability for all data workflows
- Provide well-structured datasets to enable self-service analytics for BI and data teams
- Document the data platform, including lineage, definitions, and contracts, to create a shared source of truth for metrics
Qualifications:
- 6+ years of experience in data engineering
- Strong programming skills in Python
- Proven track record of designing and delivering early-stage data platforms from concept (v1) to production
- Strong expertise with modern data tooling (e.g., Snowflake/BigQuery/Redshift, dbt, Airflow/Dagster/Prefect, Fivetran/Airbyte)
- Solid understanding of data modeling, ETL/ELT, and pipeline optimization
- Strong knowledge of data quality testing and monitoring best practices
- Upper-Intermediate English or higher
WILL BE A PLUS
- Experience with AI/ML data pipelines
- Familiarity with finance/accounting datasets
- Knowledge of compliance frameworks such as SOX or GDPR
- Experience mentoring junior engineers
Additional Information:
PERSONAL PROFILE
- Proactive problem-solver with a hands-on approach
- Adaptable to fast-moving environments
- Strong communication skills for cross-team collaboration
- Ability to take ownership and drive initiatives to completion
Remote Work:
Yes
Employment Type:
Full-time