Sr Kafka Integration Engineer

Job Location

Markham - Canada

Salary

$135,000 - $150,000

Vacancy

1 Vacancy

Job Description

We are looking for a highly skilled Senior Kafka Integration Engineer to strengthen our technology team. In this role you will be central to the design, deployment, and upkeep of sophisticated data integration infrastructure. Your work will focus on real-time data pipeline solutions, primarily using Apache Kafka, and on cloud-based data transformation leveraging the AWS suite, with an emphasis on services such as AWS Glue. The ideal applicant will have deep-seated knowledge of enterprise-level system integration approaches, reactive event-driven architectures, and cloud-native solutions. You will collaborate with diverse teams spanning engineering, product development, and business operations to enable efficient and reliable data movement throughout our complex digital ecosystem.
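Much of the day-to-day work described above revolves around producing and consuming events on Kafka topics. As a rough, non-authoritative sketch of that kind of pipeline work (the broker address, topic name, and payload below are placeholders, and the confluent-kafka Python client is just one of several client libraries a team might standardize on), a minimal JSON producer could look like this:

    import json
    from confluent_kafka import Producer

    # Broker address is a placeholder for this illustration.
    producer = Producer({"bootstrap.servers": "localhost:9092"})

    def delivery_report(err, msg):
        # Log whether the broker acknowledged each record.
        if err is not None:
            print(f"Delivery failed: {err}")
        else:
            print(f"Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}")

    # Hypothetical order event; real payloads and topic names would come from the project.
    event = {"order_id": "o-1001", "amount": 42.50, "currency": "CAD"}
    producer.produce(
        topic="orders",
        key=event["order_id"],
        value=json.dumps(event).encode("utf-8"),
        callback=delivery_report,
    )
    producer.flush()  # block until outstanding messages are delivered

A consumer on the other side would subscribe to the same topic within a consumer group, which is what allows throughput to scale horizontally across instances.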


Core Contributions:

  • Spearhead the development and operation of scalable, resilient data integration frameworks with a strong emphasis on Apache Kafka, including associated components such as Kafka Connect, Kafka Streams, and Schema Registry.

  • Establish and refine ETL/ELT processes utilizing AWS Glue, managing the flow of data from various sources into designated storage platforms such as data lakes and centralized warehouses (a minimal Glue job sketch follows this list).

  • Ensure data integrity and governance across integrated systems by meticulously managing the AWS Glue Data Catalog.

  • Champion and implement industry-leading practices for microservices, API-driven integrations, and event-driven architectural designs across the organization.

  • Work alongside technical leaders, data experts, and business stakeholders to define connectivity needs and translate them into effective technical blueprints.

  • Maintain and enhance current integration points to boost operational stability and throughput.

  • Continuously monitor and tune Kafka environments against key performance indicators such as throughput and latency to ensure system health.

  • Produce and curate thorough documentation detailing integration designs, operational procedures, and data flow diagrams.

  • Implement comprehensive system monitoring, event logging, and alerting to ensure operational uptime and reliability.

  • Offer mentorship and expert guidance to less experienced engineers within the integration domain.

  • Interconnect and orchestrate a variety of cloud platforms, databases, and external API endpoints using AWS services (Lambda, S3, API Gateway, SNS, SQS, EventBridge) and other relevant integration technologies.

  • Stay abreast of the latest developments and best practices in data integration, Apache Kafka, and Amazon Web Services.

  • Adhere to strict security protocols and ensure all developed solutions comply with established best practices.
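As a rough illustration of the AWS Glue work referenced in this list (the database, table, and S3 path names below are hypothetical placeholders, not actual project resources), a PySpark-based Glue job that reads from the Glue Data Catalog, applies a simple field mapping, and writes Parquet to a data lake might be sketched as:

    import sys
    from pyspark.context import SparkContext
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.transforms import ApplyMapping
    from awsglue.utils import getResolvedOptions

    # Standard Glue job bootstrapping.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a source table registered in the Glue Data Catalog (names are placeholders).
    source = glue_context.create_dynamic_frame.from_catalog(
        database="raw_db", table_name="orders"
    )

    # Apply a simple column mapping as an illustrative transform.
    mapped = ApplyMapping.apply(
        frame=source,
        mappings=[
            ("order_id", "string", "order_id", "string"),
            ("amount", "double", "amount", "double"),
        ],
    )

    # Write curated output to S3 as Parquet (bucket and prefix are placeholders).
    glue_context.write_dynamic_frame.from_options(
        frame=mapped,
        connection_type="s3",
        connection_options={"path": "s3://example-data-lake/curated/orders/"},
        format="parquet",
    )

    job.commit()

In practice the catalog entries themselves would typically be maintained by crawlers or by the job, which is where the Data Catalog governance responsibility above comes in.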



Requirements

Essential Proficiencies:

  • A Bachelor's degree in a technology-focused discipline (e.g., Computer Science or Engineering) or commensurate professional experience.

  • A minimum of five years of experience architecting and deploying integration solutions at enterprise scale.

  • Profound expertise with Apache Kafka, covering the development and oversight of data producers/consumers, topic configuration, connector deployment, and performance optimization.

  • Significant experience with AWS cloud services, particularly AWS Glue for data transformation, catalog management, and task automation.

  • Broad understanding of data modeling, data warehousing concepts, and ETL/ELT processes.

  • Proficiency in handling multiple data formats (such as JSON, Avro, Parquet, and XML) and API interaction models (REST, SOAP, gRPC); a schema-based serialization sketch appears at the end of this requirements section.

  • Competence in at least one programming language pertinent to integration tasks (e.g., Python, Java, Scala).

  • Working knowledge of both SQL-based and NoSQL database systems.

  • Strong familiarity with common integration patterns, event-driven processing, and microservice architectures.

  • Experience with version control software (like Git) and continuous integration/delivery (CI/CD) methodologies.

  • Demonstrable analytical acumen, effective problem-solving skills, and strong interpersonal abilities that foster a cooperative team environment.

  • Familiarity with a wider range of AWS tools, including S3, Lambda, API Gateway, Kinesis, Redshift, RDS, and DynamoDB.

Advantageous Skills:

  • Professional certification related to Apache Kafka (e.g., Confluent).

  • AWS certifications (e.g., Data Analytics, Developer, Solutions Architect).

  • Experience with alternative data pipeline technologies (such as Apache Airflow, Talend, Informatica).

  • Knowledge of containerization tools (Docker, Kubernetes).

  • Experience with Infrastructure as Code (IaC) utilities (Terraform, AWS CloudFormation).

  • A solid understanding of data governance principles and security best practices.
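As a sketch of the schema-based serialization referenced in the essential proficiencies (the Avro schema, topic, broker, and Schema Registry URL below are placeholders, and this assumes the confluent-kafka Python client with its Schema Registry integration rather than any specific in-house tooling), producing an Avro-encoded record might look like:

    from confluent_kafka import Producer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroSerializer
    from confluent_kafka.serialization import SerializationContext, MessageField

    # Hypothetical Avro schema for an order event.
    ORDER_SCHEMA = """
    {
      "type": "record",
      "name": "Order",
      "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"}
      ]
    }
    """

    # Registry URL and broker address are placeholders for this environment.
    registry = SchemaRegistryClient({"url": "http://localhost:8081"})
    serializer = AvroSerializer(registry, ORDER_SCHEMA)
    producer = Producer({"bootstrap.servers": "localhost:9092"})

    # Serialize the record against the registered schema, then produce it.
    payload = serializer(
        {"order_id": "o-1001", "amount": 42.5},
        SerializationContext("orders", MessageField.VALUE),
    )
    producer.produce(topic="orders", key="o-1001", value=payload)
    producer.flush()

Registering schemas this way is one common approach to the data governance and format compatibility concerns listed above; Protobuf or JSON Schema serializers follow the same pattern.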



Kafka

Employment Type

Full Time
