Portuguese company hiring for a remote position
Location: Lisbon, Portugal (2 days onsite per week)
Only candidates already based in Portugal will be considered.
Language Requirements: Fluent English and Portuguese
Seniority: 7+ years
Hiring Model: B2B
Client Sector: Banking & Financial Services
Instructions: Please send your CV in English and make sure to include all skills and experience that match the requirements of the opportunity. This will significantly increase your chances of success.
We are seeking a senior Big Data Architect to lead the design, implementation, and optimization of large-scale data platforms within the banking and financial services sector. This role requires a highly technical, autonomous professional capable of delivering end-to-end, scalable data solutions in distributed environments.
Important: All mandatory requirements listed below must be clearly and explicitly stated in the CV. Applications that do not include all required skills and experience will not be considered.
Responsibilities:
Design and implement scalable, resilient, and secure Big Data architectures
Configure and manage distributed environments (on-premises and/or cloud) for large-scale data processing
Develop and maintain data pipelines using Apache Airflow, including complex DAG design, monitoring, and CI/CD best practices (see the illustrative sketch after this list)
Build and optimize batch and streaming applications using Apache Spark
Lead technical decisions related to data storage, processing, governance, and system integration
Collaborate with engineering, analytics, and business teams to ensure data quality, reliability, and availability
Produce clear and structured technical documentation covering architectures, data flows, and components
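For context on the Airflow responsibility above, a minimal sketch of the kind of pipeline involved is shown below. It assumes an Airflow 2.x deployment; the DAG id, task names, schedule, and job path are illustrative assumptions, not part of the role description.

```python
# Minimal illustrative Airflow 2.x DAG; all names, paths, and the schedule are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_batch(**context):
    # Placeholder data-quality check; a real task would query the target store.
    print(f"Validating batch for {context['ds']}")


default_args = {
    "owner": "data-platform",            # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_ingestion_pipeline",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
    tags=["banking", "ingestion"],
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="spark-submit /opt/jobs/ingest.py --ds {{ ds }}",  # hypothetical job path
    )
    validate = PythonOperator(
        task_id="validate_batch",
        python_callable=validate_batch,
    )
    ingest >> validate
```

Monitoring callbacks and CI/CD for DAG deployment would sit on top of a skeleton like this.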
Requirements:
Minimum 7 years of professional experience in data engineering or architecture roles
Proven experience as a Big Data Architect or in a senior technical leadership role
Strong expertise in Apache Spark (PySpark, Spark SQL, performance tuning); see the illustrative sketch after this list
Solid hands-on experience with Apache Airflow (DAGs, operators, sensors)
Deep understanding of distributed systems, cluster computing, and containerized environments
Experience with containerization and orchestration (Kubernetes is a plus)
Strong background in Linux environments, networking, security, and automation
Advanced knowledge of Big Data technologies such as Hadoop, Hive, Kafka, and Delta Lake
Experience working with cloud platforms (AWS, Azure, or GCP)
Strong analytical and problem-solving skills with a focus on scalable and resilient solution design
Fluent English (written and spoken), mandatory
Experience with microservices-based architectures
Familiarity with CI/CD tools such as GitLab, Jenkins, or Argo
Knowledge of relational and NoSQL databases
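To illustrate the Spark expertise listed above, here is a minimal PySpark batch sketch; the input path, column names, and Delta Lake output location are assumptions for illustration only.

```python
# Minimal illustrative PySpark batch job; paths, columns, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily_transaction_totals")             # hypothetical job name
    .config("spark.sql.shuffle.partitions", "200")   # example of a common tuning knob
    .getOrCreate()
)

# Read one day's raw transactions (Parquet assumed) from a hypothetical path.
transactions = spark.read.parquet("s3a://raw-zone/transactions/date=2024-01-01/")

# Aggregate settled transactions per account using the DataFrame API.
daily_totals = (
    transactions
    .filter(F.col("status") == "SETTLED")            # hypothetical column and value
    .groupBy("account_id")
    .agg(
        F.count("*").alias("tx_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write the curated output; Delta Lake is assumed to be available on the cluster.
daily_totals.write.format("delta").mode("overwrite").save("s3a://curated-zone/daily_totals/")

spark.stop()
```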
Client Sector: Banking & Financial Services
Work Model: Hybrid, 2 days onsite per week
Location: Lisbon
Keywords: Big Data Architect, Apache Spark, PySpark, Spark SQL, Apache Airflow, DAGs, Big Data, Distributed Systems, Hadoop, Hive, Kafka, Delta Lake, Data Pipelines, Batch Processing, Streaming, Cloud Computing, AWS, Azure, GCP, Kubernetes, Linux, CI/CD, Microservices, Banking, Financial Services, Fluent English