Senior Data Engineer (Remote)

360training


Job Location:

Karachi - Pakistan

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy
This job posting is outdated and the position may already be filled.

Job Summary


360training is a rapidly growing leader in online training and certification across a wide range of industries and professions. We provide customers with the regulated training they need to get and keep the jobs they want. Over the years we have continued to grow our expansive library of regulatory-approved training courses with new content suited for today's modern workforce. By offering these courses online, we give all users the convenience and flexibility of earning their certifications in their own time, from anywhere in the world.

At 360training, we promote a culture of excellence centered around our two core values: Deliver Results and Do the Right Thing. That focus fosters the success of our employees while maintaining a team-centric environment that inspires them to do their absolute best. One thing our associates get to experience is the ability to make an impact from day one of working here.
360training offers a compelling compensation package tied to performance and impact. We offer quality health insurance for employees, their dependents, and parents, including OPD, dental, and vision coverage, to meet a variety of needs. We also offer life and disability benefits, time off, leave encashment, EOBI, and gratuity, along with promotion opportunities twice a year.

Shift Timings: 05:00 PM - 02:00 AM PST (Pakistan Standard Time)
Senior Data Engineer
We are seeking an experienced Senior Data Engineer with a strong software development background to design, build, and scale modern data platforms and real-time data streaming solutions. This role is highly engineering-focused, centered on backend development, data architecture, and distributed data systems rather than traditional reporting or visualization tools.
You will play a key role in architecting and developing batch and streaming data pipelines, working with technologies such as C#, Kafka, Azure services, Databricks, and Snowflake, and enabling reliable data movement, transformation, and processing at scale.
This role is ideal for a hands-on engineer who enjoys solving complex data problems through code, architecture, and modern cloud-native data platforms.
Key Responsibilities
  • Design, develop, and maintain scalable batch and real-time data pipelines using engineering-first approaches.
  • Build and optimize streaming data pipelines using Kafka / event-driven architectures.
  • Develop backend data services and integrations using C# and the Microsoft technology stack.
  • Architect and implement data movement, ingestion, and transformation workflows across cloud and on-prem systems.
  • Work extensively with Databricks, Delta Lake, and Snowflake to process and store large-scale datasets.
  • Design robust data architecture supporting high availability, scalability, and performance.
  • Implement data transformation logic using code-driven frameworks rather than GUI-based tools.
  • Optimize data processing and storage for performance, cost, and reliability.
  • Implement CI/CD pipelines and version control using Git and DevOps best practices.
  • Collaborate closely with product, engineering, and platform teams to translate business needs into technical solutions.
  • Ensure data quality, observability, security, and governance across the data ecosystem.
  • Proactively identify architectural improvements and modernize legacy data workflows.
Required Skills & Experience
  • 5 years of experience in software development and data engineering roles.
  • Strong software engineering background, preferably with C# / .NET.
  • Hands-on experience building data streaming pipelines using Kafka or similar technologies.
  • Solid experience with cloud-based data platforms such as Databricks and Snowflake.
  • Strong understanding of data ingestion, transformation, and movement across distributed systems.
  • Experience working with Azure services (e.g., Azure Data Factory, Event Hubs, Azure Functions, or similar).
  • Proficiency in SQL and experience working with relational and semi-structured data.
  • Experience designing scalable, fault-tolerant data architectures.
  • Familiarity with CI/CD, Git-based workflows, and DevOps practices.
  • Experience working in Agile/Scrum environments using tools like JIRA.
  • Strong problem-solving mindset with a focus on performance and reliability.
Nice to Have
  • Experience with microservices architecture.
  • Knowledge of event-driven systems and distributed system design.
  • Exposure to data governance, security, and compliance best practices.
  • Experience migrating or modernizing legacy data systems.
Qualifications
  • Bachelor's or Master's degree in Computer Science, Software Engineering, Information Technology, or a related field.
  • Proven track record of delivering complex, production-grade data and software systems.
  • Strong communication skills and the ability to collaborate across engineering teams.


Required Experience:

Senior IC


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

About Company


The #1 online training provider for over 25 years. Join 11 million learners & advance your career with certification in an accredited course today!
