Data Engineer II

Employer Active

Job Location

Lahore - Pakistan

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

We don't think about job roles in a traditional way. We are anti-silo. Anti-career stagnation. Anti-conventional.

Beyond ONE is a digital services provider radically reshaping the personalised digital ecosystems of consumers in high-growth markets around the world. We're building a digital services aggregator platform with a strong telco foundation and a profitable growth strategy that empowers users to drive their own experience: subscribe once, source from many, and only pay for what you actually use.

Since being founded in 2021, we've acquired Virgin Mobile MEA, Friendi Mobile MEA and Virgin Mobile LATAM (with 6.5 million subscribers) and now have 1,600 dedicated colleagues across Chile, Colombia, KSA, Kuwait, Mexico, Oman, Pakistan and the UAE.

To disrupt for good takes a rebellious spirit, a questioning mind and a warm heart. We really care about how to get things done, not who manages whom. We benefit from our diversity, and together we disrupt the way we and others think about our lives, for good.

Do you want to exchange ideas, learn from each other and leave your mark on our journey? This is the place for you.

Job Summary

We're looking for an in-house Data Engineer who will own our Apache Airflow environment end-to-end: designing, developing and operating scalable ETL/ELT pipelines that power analytics and machine-learning use cases across multiple cloud platforms (GCP today, AWS & Azure tomorrow). You'll join a growing Data & AI team that is modernising legacy Talend workflows into Python/dbt-based transformations and streaming ingestion built on Kafka/NiFi.

Why this role matters:
As Data Engineer (Airflow & Cloud Platforms), you will play a key role in modernising our data infrastructure by building scalable, resilient pipelines and orchestrations across multi-cloud environments. Your contributions will help shape the Data Engineering & Analytics team and, ultimately, the way we deliver analytics, AI and real-time capabilities across our global operations.

What success looks like:
In your first year you will:

  • Migrate and refactor legacy Talend workflows into modular Python/dbt pipelines.

  • Establish a production-grade Apache Airflow environment with monitoring, alerting and CI/CD automation.

  • Deliver streaming ingestion flows using Kafka/NiFi to support next-gen customer-facing use cases.

Why this is for you:
If you're keen on solving the orchestration and scaling challenges of a global, multi-cloud data platform, hit us up. We're looking for someone ready to tackle this challenge head-on and make an impact from day one.

Key Responsibilities

In this role you will:

  • Lead the development and maintenance of Airflow DAGs and deployment pipelines, ensuring robust orchestration and SLA compliance.

  • Collaborate with Data Architects, Analysts and ML Engineers in Agile sprints, driving reliable delivery of data workflows.

  • Manage cloud data engineering across GCP (and eventually AWS/Azure), ensuring scalable and cost-effective pipeline deployments.

  • Drive the refactoring of Talend jobs into Python/dbt codebases with robust testing, monitoring and documentation.

  • Build real-time ingestion flows via Kafka and NiFi, enabling low-latency use cases across regional systems.

  • Embed observability, data quality checks and unit tests into every pipeline.

  • Contribute to peer reviews, technical documentation and team knowledge sharing.

Qualifications & Attributes

Were seeking someone who embodies the following:

Education:
Bachelor's in Computer Science, Data Engineering or equivalent.

Experience:
3-6 years in data engineering, preferably in telecom, tech or cloud-native environments.

Technical Skills:
Must-haves:

  • Advanced Python (pandas, pyarrow, sqlalchemy) and SQL/dbt for data modeling and transformation.

  • Proven experience with Apache Airflow (or Cloud Composer) in production environments.

  • Track record of delivering ETL/ELT pipelines on cloud platforms (GCP preferred).

  • Familiarity with Kafka and/or NiFi for real-time streaming ingestion.

  • Hands-on use of Git CI/CD Docker and Terraform/IaC.

Nice-to-haves:

  • Experience with Talend migrations and open-table formats (Parquet, Delta, Iceberg).

  • Experience working with Databricks and with any of the three cloud providers (Azure, GCP or AWS).

What we offer:

  • Rapid learning opportunities - we enable learning through flexible career paths and exposure to challenging & meaningful work that will help build and strengthen your expertise.
  • Hybrid work environment - flexibility to work from home 2 days a week in the UAE & Pakistan.
  • Healthcare and other local benefits offered in market.

By submitting your application, you acknowledge and consent to the use of Greenhouse & BrightHire during the recruitment process. This may include the storage and processing of your data on servers located outside your country of residence. For further information, please contact us at

Employment Type

Full Time
