We are seeking a Middle Data Engineer with proven expertise in AWS, Snowflake, and dbt to design and build scalable data pipelines and modern data infrastructure. You'll play a key role in shaping the data ecosystem, ensuring data availability, quality, and performance across business units.
Requirements:
- 4 years of experience in Data Engineering roles.
- Experience with the AWS cloud platform.
- Proven experience with Snowflake in production environments.
- Hands-on experience building data pipelines using dbt.
- Python skills for data processing and orchestration.
- Deep understanding of data modeling and ELT best practices.
- Experience with CI/CD and version control systems (e.g., Git).
- Strong communication and collaboration skills.
Must-Have:
- Strong experience with Snowflake (e.g., performance tuning, storage layers, cost management).
- Production-level proficiency with dbt (modular development, testing, deployment).
- Experience developing Python data pipelines.
- Proficiency in SQL (analytical queries, performance optimization).
Nice-to-Have:
- Experience with orchestration tools like Airflow, Prefect, or Dagster.
- Familiarity with additional cloud platforms (e.g., GCP or Azure).
- Knowledge of data governance, lineage, and catalog tools.
- Experience working in Agile teams and with CI/CD deployment pipelines.
- Exposure to BI tools like Tableau or Power BI.
We offer*:
- Flexible working format: remote, office-based, or a mix of both
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team-building activities
- Other location-specific benefits
*not applicable for freelancers