Project Description: We are seeking a highly skilled Senior Data Engineer with deep expertise in Python, SQL, and scalable data architecture. You will be a crucial part of our data ecosystem, focusing on building robust, high-performance data pipelines and infrastructure to ensure reliable and timely data delivery for analytics and core business processes.
Hard Skills / Need to Have:
Expert proficiency in Python 3.7 (for pipeline development) and SQL (advanced level).
Deep understanding of multi-layered DWH architecture (e.g., Raw, ODS/Staging, Data Marts).
Extensive experience building ETL pipelines with Airflow 2 (a minimal DAG sketch follows this list).
Excellent knowledge of OOP, design patterns, and clean architecture.
Professional experience with a major cloud data platform, specifically with Object Storage services (e.g., Google Cloud Storage, Amazon S3, or Azure Blob Storage).
Experience working with a Cloud Data Warehouse (e.g., Google BigQuery, Snowflake, or Redshift).
Mandatory experience with Git for version control and collaborative development.
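For context, here is a minimal sketch of the kind of Airflow 2 DAG this role involves. It assumes Airflow 2.4+ with the TaskFlow API; the DAG name, schedule, and data are hypothetical placeholders rather than a description of our actual pipelines.

```python
# Minimal Airflow 2 (2.4+) TaskFlow DAG sketch; all names and values are illustrative.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_daily_pipeline():
    @task
    def extract():
        # In a real pipeline this would read from a source system or object storage.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows):
        # Placeholder business transformation: drop non-positive amounts.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows):
        # Placeholder load step into the DWH (e.g. BigQuery via a provider operator).
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))


orders_daily_pipeline()
```

In practice the extract and load steps would use provider hooks and operators (e.g. for GCS and BigQuery) wired into the same task dependencies.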
Hard Skills / Nice to Have (Optional):
Experience designing and working with high-load/low-latency data systems or message brokers (e.g., Kafka, Pub/Sub); see the producer sketch after this list.
Professional proficiency in DevOps practices and CI/CD automation (e.g., GitHub Actions, GitLab CI).
Experience with Infrastructure as Code (IaC), particularly Terraform.
Understanding of and experience with containerization and orchestration (Docker Kubernetes).
Participating in acceptance testing of developed functionality and in the investigation of data quality incidents.
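As an illustration of the message-broker experience mentioned above, here is a minimal Kafka producer sketch using the kafka-python client; the broker address and topic name are hypothetical.

```python
# Minimal Kafka producer sketch (kafka-python); broker and topic are placeholders.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish one illustrative event and block until it is delivered.
producer.send("orders", {"order_id": 1, "amount": 42.0})
producer.flush()
```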
Responsibilities and Tasks:
1. Data Engineering & Infrastructure (Core Focus)
Design, develop, and maintain end-to-end data pipelines (ETL/ELT) from diverse source systems to the Data Warehouse.
Build scalable and cost-efficient infrastructure for storing and processing large datasets within our cloud environment.
Implement and advance systems for monitoring, alerting, and automated validation of data flows and data quality (see the check sketch after this list).
Participate in system design and data architecture evolution, focusing on performance, resilience, and security.
Proactively optimize existing data pipelines and DWH queries for efficiency and cost reduction.
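Below is a simple sketch of the kind of automated data-quality check implied above, assuming a generic DB-API connection to the DWH; the table name, date column, and threshold are illustrative, and alerting is stubbed with an exception.

```python
# Minimal data-quality check sketch; table, column, threshold, and connection are placeholders.

def check_row_count(conn, table="analytics.orders_daily", min_rows=1_000):
    """Fail loudly if today's load looks suspiciously small."""
    with conn.cursor() as cur:  # assumes a DB-API 2.0 connection whose cursor supports `with`
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE load_date = CURRENT_DATE")
        (row_count,) = cur.fetchone()
    if row_count < min_rows:
        # In production this would trigger an alert; here we simply raise.
        raise ValueError(f"{table}: expected at least {min_rows} rows, found {row_count}")
    return row_count
```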
2. Collaboration & Documentation
Collaborate closely with Data Analysts to technically clarify requirements for data delivery and transformation logic.
Conduct root cause analysis of data quality incidents and develop sustainable engineering solutions to prevent recurrence.
Create and maintain technical documentation covering DWH schemas, pipeline architecture, and data lineage.
Ready to Join?
We look forward to receiving your application and welcoming you to our team!
For job seekers, BONAPOLIA offers a gateway to exciting career prospects and the chance to thrive in a fulfilling work environment. We believe that the right job can transform lives, and we are committed to making that happen for you.