Job Title: Data Cloud Platform Architect
Location: Porto, Portugal
Work Regime: Full-time & Hybrid (2 to 3 office days per week)
Requirements
Mandatory Requirements:
- Bachelor's or Master's degree in Computer Science Engineering, or equivalent professional experience.
- 10 years of experience with AWS data services, with exposure to Azure data services.
- Experience developing Infrastructure as Code templates (e.g., Terraform) for platform deployment.
- Experience leading large-scale data migration and optimization projects.
- Experience with Apache Iceberg tables and efficient management of large datasets.
- Experience with orchestration tools such as Apache Airflow and AWS Step Functions.
- Strong programming skills in Python and SQL, with experience in prototyping and defining coding standards.
- Proficiency in designing scalable, efficient data solutions on AWS, following best practices for cloud architecture and infrastructure.
- Knowledge of ETL tools and experience working with large volumes of data; experience with Kafka is preferred.
- Proven ability to define and enforce data governance and security standards.
- Ability to define data engineering standards by creating Proofs of Concept (PoCs) and production-grade prototypes.
- Experience establishing and optimizing CI/CD pipelines for data workloads.
- Familiarity with Azure Databricks for data engineering and analytics tasks will be considered an advantage.
- Fluency in Portuguese and English (C1 level or higher).
Complementary Requirements:
- Knowledge of Apache Flink, Kafka, and other streaming data technologies.
- Certification in AWS, Azure, or similar technologies.
- Experience with Dataiku.
Benefits
Important:
- Our company does not sponsor work visas or work permits. All applicants must have the legal right to work in the country where the position is based.
- Only candidates who meet the required qualifications and match the profile requested by our clients will be contacted.
#VisionaryFuture - Build the future, join our living ecosystem!