At Derevo, we empower companies and people by unlocking the value of our clients' data and enhancing the talent of those who transform it.
With more than 15 years of experience, we design end-to-end data and AI solutions: from integration into modern architectures to the implementation of intelligent models in key business processes.
If you're passionate about the world of data and innovation, this could be your moment!
We're looking for your talent as a GCP Data Engineer!
What will your mission be?
You will be a key player on the Data Integration team, creating and implementing modern data architectures with high quality and scalability.
You will design, maintain, and optimize parallel processing systems, applying best practices in storage and management across Data Warehouses, Data Lakes, and Lakehouses, and driving Big Data-based analytical solutions within the Google Cloud Platform (GCP) ecosystem.
Your mission will be to turn complex problems into achievable solutions aligned with business objectives, helping our clients make data-driven decisions.
How will you do it?
- Actively participate in data integration and transformation projects, collaborating with multidisciplinary teams and applying data engineering best practices.
- Design and execute ETL/ELT processes using BigQuery, Dataflow, Composer, and other GCP services.
- Develop pipelines in Python (PySpark) and Apache Beam, processing structured and semi-structured data (see the sketch after this list).
- Implement optimized analytical data models using partitioning, clustering, and materialized views.
- Apply security and governance strategies with IAM, Dataplex, and Cloud DLP.
- Analyze and validate data quality to ensure consistency and accuracy.
- Participate in engineering sessions and sprint planning, estimating tasks and proposing technical improvements.
- Collaborate with internal teams and clients, providing a consultative perspective that adds value at every stage of the project.
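For a concrete flavor of this kind of work, here is a minimal, illustrative Apache Beam (Python SDK) sketch: it reads JSON events from Cloud Storage and appends them to a date-partitioned, clustered BigQuery table. This is not part of the role description, and every project, bucket, table, and schema name is a hypothetical placeholder.

```python
# Illustrative sketch only: all GCP resource names below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",            # use "DirectRunner" for local testing
        project="example-project",          # hypothetical project ID
        region="us-east1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Read newline-delimited JSON event files from Cloud Storage.
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            # Parse each line into a dict matching the BigQuery schema below.
            | "Parse" >> beam.Map(json.loads)
            # Append rows to a day-partitioned, clustered BigQuery table.
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                additional_bq_parameters={
                    "timePartitioning": {"type": "DAY", "field": "event_ts"},
                    "clustering": {"fields": ["event_id"]},
                },
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline can run locally on the DirectRunner for testing and on Dataflow in production, while the partitioning and clustering settings keep the resulting BigQuery table efficient to query.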
What do we ask for?
To feel right at home as a GCP Data Engineer at Derevo, this is what we're looking for:
- At least 2 years of experience as a Data Engineer working with Google Cloud Platform (GCP).
- Strong command of BigQuery, Dataflow, Composer, Pub/Sub, Datastream, and Cloud Storage.
- Experience with Spark, PySpark, and data pipeline development.
- Solid knowledge of advanced SQL (T-SQL, Spark SQL).
- Experience designing and maintaining Data Warehouses and Data Lakes.
- Knowledge of governance and security strategies (Row-Level/Column-Level Security, IAM).
- English is mandatory; basic Spanish will be considered a plus.
We also value these skills and ways of working:
- Clear and approachable communication.
- Collaborative work in squads.
- Proactivity and solution-oriented mindset.
- Continuous learning and technical curiosity.
- Responsibility and organization.
- A consultative mindset focused on generating value for the client.
Your benefits with Derevo:
WELLNESS: We foster your overall well-being through personal, professional, and financial balance. Our statutory and additional benefits will help you achieve it.
LET'S RELEASE YOUR POWER: You'll have the opportunity to specialize in different areas and technologies, enabling interdisciplinary growth.
WE CREATE NEW THINGS: We like to think outside the box. You'll have the freedom and training needed to create innovative solutions.
WE GROW TOGETHER: You'll participate in cutting-edge, multinational technology projects and work with international teams.
Where will you do it?
You will work under a hybrid model, coming to our New Jersey offices (ZIP 07302) 2 to 3 days a week.
Become a Derevian & develop your superpower!
Required Experience:
Senior IC