Data Engineer

Job Location

Hanoi - Vietnam

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

About GFT

GFT Technologies is driving the digital transformation of the world's leading financial institutions. Other sectors, such as industry and insurance, also leverage GFT's strong consulting and implementation skills across all aspects of pioneering technologies such as cloud engineering, artificial intelligence, the Internet of Things for Industry 4.0, and blockchain.

With its in-depth technological expertise, strong partnerships, and scalable IT solutions, GFT increases productivity in software development. This provides clients with faster access to new IT applications and innovative business models while also reducing risk.

We've been a pioneer of near-shore delivery since 2001 and now offer an international team spanning 16 countries, with a global workforce of over 9,000 people around the world. GFT is recognised by industry analysts such as Everest Group as a leader amongst global mid-sized Service Integrators, and is ranked in the Top 20 leading global Service Integrators in many exponential technologies, such as Open Banking, Blockchain, Digital Banking, and Apps Services.

Sign-on Bonus: Eligible for candidates who are currently employed elsewhere and able to join GFT within 30 days of offer acceptance.

Role Summary

As a Data Engineer at GFT, you will play a pivotal role in designing, maintaining, and enhancing the analytical and operational services and infrastructure crucial to the organization's functions. You'll collaborate closely with cross-functional teams to ensure the seamless flow of data for critical decision-making processes.

Key Activities

  • Data Infrastructure Design and Maintenance: Architect, maintain, and enhance analytical and operational services and infrastructure, including data lakes, databases, data pipelines, and metadata repositories, to ensure accurate and timely delivery of actionable insights.
  • AWS Glue Development: Develop AWS Glue data pipelines in code (not via drag-and-drop tooling).
  • Collaboration: Work closely with data science teams to design and implement data schemas and models, integrate new data sources with product teams, and collaborate with other data engineers to implement cutting-edge technologies in the data space.
  • Data Processing: Develop and optimize large-scale batch and real-time data processing systems to support the organization's growth and improvement initiatives.
  • Workflow Management: Utilize workflow scheduling and monitoring tools such as Apache Airflow and AWS Batch to ensure efficient data processing and management.
  • Quality Assurance: Implement robust testing strategies to ensure the reliability and usability of data processing systems.
  • Continuous Improvement: Stay abreast of emerging technologies and best practices in data engineering, and propose and implement optimizations to enhance development efficiency.
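To illustrate the kind of batch-processing and quality-assurance work described above, here is a minimal, self-contained Python sketch; the record shape, field names, and validation rules are hypothetical examples, not GFT's actual stack. It shows a small pipeline step that cleans raw rows and validates the batch before it would be passed downstream.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass(frozen=True)
class Record:
    # Hypothetical record shape for illustration only.
    user_id: str
    amount: float

def clean(raw_rows: Iterable[dict]) -> list[Record]:
    """Drop malformed rows and normalise the rest (a toy 'transform' step)."""
    cleaned = []
    for row in raw_rows:
        user_id = str(row.get("user_id", "")).strip()
        try:
            amount = float(row.get("amount"))
        except (TypeError, ValueError):
            continue  # skip rows whose amount cannot be parsed
        if user_id:
            cleaned.append(Record(user_id=user_id, amount=round(amount, 2)))
    return cleaned

def validate(records: list[Record]) -> None:
    """Fail fast if the cleaned batch violates basic quality rules."""
    assert all(r.user_id for r in records), "empty user_id leaked through"
    assert all(r.amount >= 0 for r in records), "negative amount in batch"

raw = [
    {"user_id": " alice ", "amount": "19.99"},
    {"user_id": "bob", "amount": None},   # dropped: unparseable amount
    {"user_id": "", "amount": "5.00"},    # dropped: missing user_id
]
records = clean(raw)
validate(records)
```

In a production pipeline, steps like these would typically run as AWS Glue jobs or Airflow tasks, with the validation failing the run rather than silently passing bad data downstream.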

Required Skills

  • 4-6 years of experience as a Data Engineer.
  • Professional Python developer with knowledge of design patterns.
  • Experience writing production-grade code.
  • Able to develop quickly, work overtime, and perform under pressure.
  • Technical Expertise: Proficient in Unix environments, distributed and cloud computing, Python frameworks (e.g. pandas, PySpark), version control systems (e.g. Git), and workflow scheduling tools (e.g. Apache Airflow).
  • Database Proficiency: Experience with columnar and big data databases such as Athena, Redshift, Vertica, and Hive/Hadoop.
  • Cloud Services: Familiarity with AWS or other cloud services, such as Glue, EMR, EC2, S3, and Lambda.
  • Containerization: Experience with container management and orchestration tools such as Docker, ECS, and Kubernetes.
  • CI/CD: Knowledge of CI/CD tools such as Jenkins, CircleCI, or AWS CodePipeline.

Nice-to-have Requirements

  • Programming Languages: Familiarity with JVM languages such as Java or Scala.
  • Database Technologies: Experience with RDBMS (e.g. MySQL, PostgreSQL) and NoSQL databases (e.g. DynamoDB, Redis).
  • BI Tools: Exposure to enterprise BI tools such as Tableau, Looker, or Power BI.
  • Data Science Environments: Understanding of data science environments such as AWS SageMaker or Databricks.
  • Monitoring and Logging: Knowledge of log ingestion and monitoring tools such as the ELK stack or Datadog.
  • Data Privacy and Security: Understanding of data privacy and security tools and concepts.
  • Messaging Systems: Familiarity with distributed messaging and event streaming systems such as Kafka or RabbitMQ.

Due to the high volume of applications we receive, we are unable to respond to every candidate individually. If you have not received a response from GFT regarding your application within 10 working days, please assume that we have decided to proceed with other candidates. We truly appreciate your interest in GFT and thank you for your understanding.

Employment Type

Full Time
