Senior Data Engineer (ADB + Spark)

Job Location

Sofia - Bulgaria

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

We are looking for a motivated Senior Data Engineer (ADB + Spark) who is willing to dive into a new project with a modern stack. If you're driven by curiosity to learn and a desire to produce meaningful results, please apply!

About Our Customer

You will work with the sixth-largest privately owned organization in the United States. The customer is one of the Big Four accounting organizations and the largest professional services network in the world in terms of revenue and number of professionals. The company provides audit, tax, consulting, enterprise risk, and financial advisory services, with 263,900 professionals globally.

Project Tech Stack

Azure Cloud, Microservices Architecture, .NET 8 Core services, Python, Mongo, Azure SQL, Angular 18, Kendo, GitHub Enterprise with Copilot

Requirements

  • 5 years of hands-on experience in software development
  • Extensive experience working with Apache Spark, including platforms such as Databricks and/or Azure Synapse/Fabric
  • Proficient in Python, with strong skills in data manipulation using Pandas/Polars and similar libraries
  • Solid understanding of columnar data storage formats, particularly Parquet, with practical experience using Delta Tables (see the sketch after this list)
  • Proven expertise in data processing, analysis, and transformation workflows
  • Strong analytical and problem-solving abilities with a detail-oriented mindset
  • Practical and pragmatic approach to balancing standardized processes with the flexibility to meet project goals effectively
  • Excellent organizational skills, with the ability to self-manage, prioritize tasks, structure workload, and meet tight deadlines
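
To give a concrete, purely illustrative sense of the Spark/Parquet/Delta work described above, here is a minimal PySpark sketch that reads a Parquet dataset, aggregates it, and persists the result as a Delta table. The paths and column names are invented for the example, and a Databricks-style runtime with Delta Lake available is assumed.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("parquet-to-delta").getOrCreate()

# Read a raw Parquet dataset (path and column names are hypothetical).
orders = spark.read.parquet("/mnt/raw/orders")

# Derive a date column and aggregate order amounts per day.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Persist the result as a Delta table (assumes Delta Lake is available, as on Databricks).
daily_totals.write.format("delta").mode("overwrite").save("/mnt/curated/daily_totals")
```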

Nice to have

  • Experience working with Azure Cloud services (or other major cloud platforms), including a range of SaaS offerings such as Service Bus, Data Lake, Blob Storage, Redis, and more
  • Familiarity with FastAPI (a minimal example follows this list)
  • Expertise in containerization and orchestration tools such as Docker and Kubernetes
  • Solid understanding of microservices architecture and its implementation in scalable systems
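
For the FastAPI point above, a minimal service could look like the sketch below; the endpoint and model are illustrative assumptions rather than project specifics.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class HealthStatus(BaseModel):
    status: str

@app.get("/health", response_model=HealthStatus)
def health() -> HealthStatus:
    # Simple liveness endpoint, e.g. for a Kubernetes readiness/liveness probe.
    return HealthStatus(status="ok")
```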

English level

Intermediate

Responsibilities

  • Define and enforce best practices and coding standards across the project
  • Conduct thorough code reviews to ensure adherence to established guidelines and maintain high code quality
  • Work both independently and in close collaboration with others in the team
  • Communicate clear instructions to team members and help manage the flow of day-to-day operations
  • Communicate with the client regularly
  • Design, develop, and maintain robust and scalable Spark applications
  • Write clean, maintainable, and efficient code following best practices and coding standards
  • Optimize code for performance and scalability, ensuring efficient data handling (see the sketch after this list)
  • Work closely with cross-functional teams to deliver high-quality software solutions
  • Identify and resolve technical issues, ensuring the reliability and performance of applications
  • Create and maintain comprehensive documentation for code, processes, and workflows
  • Provide guidance and mentorship to junior developers, fostering a collaborative and productive team environment
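
As a rough illustration of the performance-oriented responsibilities above (not the project's actual code), the sketch below broadcasts a small reference table to avoid a shuffle-heavy join and writes date-partitioned Delta output so downstream readers can prune partitions. All paths and column names are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("enrich-events").getOrCreate()

# Paths and column names below are illustrative assumptions.
events = spark.read.format("delta").load("/mnt/curated/events")
countries = spark.read.format("delta").load("/mnt/ref/countries")

# Broadcast the small reference table to avoid a shuffle-heavy join.
enriched = events.join(F.broadcast(countries), on="country_code", how="left")

# Partition the output by date so downstream queries can skip irrelevant files.
(enriched.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/curated/events_enriched"))
```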

Employment Type

Full Time
