Senior Data Engineer Lewis Engineering Agentic AI
Job Summary
Thales is a global technology leader trusted by governments, institutions and enterprises to tackle their most demanding challenges. From quantum applications and artificial intelligence to cybersecurity and 6G innovation, our solutions empower critical decisions rooted in human intelligence. Operating at the forefront of defence and security, aerospace and space, cybersecurity and digital identity, we're driven by a mission to build a future we can all trust.
In Romania, we are advancing innovation through software engineering research and development, delivering solutions in the key markets in which Thales Group operates. Our engineers design, develop and integrate solutions that impact global industries: from fully operational systems and subsystems for naval warfare and maritime security operations to air traffic management systems, satellite-based solutions, tactical indoor simulations, identity and biometric technologies and more.
Senior Data Engineer
Background: Join us in building the future of engineering productivity. We're creating a ground-breaking agent garden that will transform how 40,000 engineers work, enabling seamless connections between AI systems, engineering tools and data while fostering a thriving marketplace of AI agents and services.
Main responsibilities:
- Design, build and maintain scalable and reliable data pipelines across various data sources
- Develop ETL/ELT processes using tools like GCP Data Fusion and GCP Dataflow (Apache Beam)
- Collect and process data in a format suited to the organization's needs; perform and integrate data quality checks to identify and correct errors or discrepancies
- Create and maintain documentation covering data flows, the model transformations applied and validation procedures
- Optimize performance and cost-efficiency of GCP data services
- Ensure security and compliance best practices in data handling
- Maintain clear and close collaboration with both the development team and the project stakeholders/key users
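As an illustration only (this sketch is not part of the posting, and all names in it are hypothetical), the data quality checks mentioned above often amount to routing records into "valid" and "rejected" branches of a pipeline. A minimal standalone Python version of that idea, without any GCP or Beam dependency, might look like:

```python
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    """Outcome of a completeness check over a batch of records."""
    valid: list = field(default_factory=list)
    rejected: list = field(default_factory=list)

def check_records(records, required_fields=("id", "timestamp")):
    """Split records into valid and rejected based on simple completeness rules."""
    report = QualityReport()
    for rec in records:
        # A record is rejected if any required field is absent or empty.
        missing = [f for f in required_fields if not rec.get(f)]
        if missing:
            report.rejected.append({"record": rec, "missing": missing})
        else:
            report.valid.append(rec)
    return report

# Example: the second record lacks a timestamp and is routed to the rejected list.
report = check_records([
    {"id": 1, "timestamp": "2024-01-01T00:00:00Z", "value": 42},
    {"id": 2, "timestamp": None, "value": 7},
])
```

In a real Dataflow/Beam pipeline the same split would typically be expressed as a `ParDo` with tagged outputs, so rejected records can be written to a dead-letter destination for later correction.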
Requirements:
- Bachelor's degree in Computer Science, Computer Engineering or a relevant technical field
- 5 years of experience with cloud data platforms (e.g. AWS, Azure, GCP); GCP experience is highly desirable, or at minimum some hands-on exposure
- Strong experience with Dataflow (GCP) and Apache Beam
- Proficiency in Python (or similar languages) with solid software engineering fundamentals (testing, modularity, version control)
- Hands-on experience with SQL and NoSQL data stores such as PostgreSQL, Redshift, DynamoDB or MongoDB
- Good understanding of data warehousing and modern architectures (e.g. data lakehouse, data mesh)
- Familiarity with DevOps/CI-CD practices, infrastructure-as-code (Terraform, CloudFormation) and containerization (Docker/Kubernetes)
- Understanding of data quality, observability, lineage and metadata management practices
- Good communication and working relationships with stakeholders and team members
- Able to give and receive feedback; able to listen, share and offer constructive feedback
- Fluent English; French would be a plus
- Agile mindset and practices
At Thales, we're committed to fostering a workplace where respect, trust, collaboration and passion drive everything we do. Here you'll feel empowered to bring your best self, thrive in a supportive culture and love the work you do. Join us and be part of a team reimagining technology to create solutions that truly make a difference for a safer, greener and more inclusive world.
Required Experience:
Senior IC
About Company
In all critical environments - air, land, sea, space and cyberspace - decision-makers, operators, crews and members of our armed services and security forces are faced with millions of important decisions every day. It is in supporting these people that Thales in the United States ha ...