Senior/Lead Data Engineer

GFT


Job Location:

Hanoi - Vietnam

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary

About GFT

GFT Technologies is an AI-centric global digital transformation company. We design advanced data and AI transformation solutions, modernize technology architectures, and develop next-generation core systems for industry leaders in Banking, Insurance, Manufacturing and Robotics. Partnering closely with our clients, we push boundaries to unlock their full potential. With deep industry expertise, cutting-edge technology and a strong partner ecosystem, GFT delivers responsible AI-centric solutions that combine engineering excellence, high-performance delivery and cost efficiency. Our team of 12,000 technology experts operates in 20 countries worldwide, offering career opportunities at the forefront of software innovation.

Role Summary

As a Data Engineer at GFT, you will be responsible for managing, designing and enhancing the data systems and workflows that drive key business decisions. The role is focused 75% on data engineering, involving the construction and optimization of data pipelines and architectures, and 25% on supporting data science initiatives through collaboration with data science teams on machine learning workflows and advanced analytics. You will leverage technologies such as Python, Airflow, Kubernetes and AWS to deliver high-quality data solutions.

Key Activities

  • Architect, develop and maintain scalable data infrastructure, including data lakes, pipelines and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders.
  • Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments.
  • Develop and optimize large-scale batch and real-time data processing systems to enhance operational efficiency and meet business objectives.
  • Leverage Python, Apache Airflow and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring.
  • Utilize AWS services such as S3, Glue, EC2 and Lambda to manage data storage and compute resources, ensuring high performance, scalability and cost-efficiency.
  • Implement robust testing and validation procedures to ensure the reliability, accuracy and security of data processing workflows.
  • Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions.

Required Skills

  • Core Expertise: Proficiency in Python for data processing and scripting (pandas, PySpark) and workflow automation (Apache Airflow), plus experience with AWS services (Glue, S3, EC2, Lambda).
  • Containerization & Orchestration: Experience working with Kubernetes and Docker for managing containerized environments in the cloud.
  • Data Engineering Tools: Hands-on experience with columnar and big data databases (Athena, Redshift, Vertica, Hive/Hadoop), along with version control systems like Git.
  • Cloud Services: Strong familiarity with AWS services for cloud-based data processing and management.
  • CI/CD Pipeline: Experience with CI/CD tools such as Jenkins, CircleCI or AWS CodePipeline for continuous integration and deployment.
  • Data Engineering Focus (75%): Expertise in building and managing robust data architectures and pipelines for large-scale data operations.
  • Data Science Support (25%): Ability to support data science workflows, including collaboration on data preparation and feature engineering, and enabling experimentation environments.

Nice-to-have requirements

  • LangChain Experience: Familiarity with LangChain for building data applications involving natural language processing or conversational AI frameworks.
  • Advanced Data Science Tools: Experience with AWS SageMaker or Databricks for enabling machine learning environments.
  • Big Data & Analytics: Familiarity with both RDBMS (MySQL, PostgreSQL) and NoSQL (DynamoDB, Redis) databases.
  • BI Tools: Experience with enterprise BI tools like Tableau, Looker or Power BI.
  • Messaging & Event Streaming: Familiarity with distributed messaging systems like Kafka or RabbitMQ for event streaming.
  • Monitoring & Logging: Experience with monitoring and log management tools such as the ELK stack or Datadog.
  • Data Privacy and Security: Knowledge of best practices for ensuring data privacy and security, particularly in large data infrastructures.

What we offer you

You will be working with some of the brightest people in business and technology on challenging and rewarding projects, in a team of like-minded individuals. GFT prides itself on an international environment that promotes professional and cultural exchange and encourages individual development, with the benefits below:

  • Competitive salary
  • 13th-month salary guarantee
  • Performance bonus
  • Professional English course for employees
  • Premium health insurance
  • Extensive annual leave

(Due to the high volume of applications we receive, we are unable to respond to every candidate individually. If you have not received a response from GFT regarding your application within 10 workdays, please consider that we have decided to proceed with other candidates. We truly appreciate your interest in GFT and thank you for your understanding.)


Required Experience:

Senior IC


About Company


We see opportunity in technology. In domains such as cloud, AI, mainframe modernisation, DLT and IoT, we blend established practice with new thinking to help our clients stay ahead.
