Fullstack Data Engineer – Quinnox
Job Location: Bangalore - India

Monthly Salary: Not Disclosed


Vacancy: 1

Job Description

We need profiles here on an urgent basis. The candidate has to join by 15th June, so the process will move very fast. Please share only immediate joiners.

Kindly comply with the experience criteria.

Experience: 8 years to 13 years

Work Location: Bangalore

Shift: 2 PM to 11 PM

Notice Period: Immediate

Hybrid Mode: 10 days per month work from office

Skill Sets: Minimum 3 years in SQL, Snowflake, and Python

DBT: 1 year

Budget: 25 Lakhs

We are seeking a Full-Stack Data Engineer to design, develop, and manage scalable data pipelines, storage, and transformation solutions. This role requires expertise in cloud-based data platforms, data warehouse/data lakehouse design and development, workflow automation, and data integration to support business intelligence and analytics. The ideal candidate will have a strong background in data engineering, cloud technologies, and software development, with a focus on performance, security (especially data segregation), and automation.

Key Responsibilities

1. Data Platform Design & Implementation

Architect and deploy scalable, secure, and high-performing Snowflake environments in line with data segregation policies.

Automate infrastructure provisioning, testing, and deployment for seamless operations.

2. Data Integration & Pipeline Development

Develop, optimize, and maintain data pipelines (ETL/ELT) to ensure efficient data ingestion, transformation, and migration.

Implement best practices for data consistency, quality, and performance across cloud and on-premises systems.

3. Data Transformation & Modeling

Design and implement data models that enable efficient reporting and analytics.

Develop data transformation processes using Snowflake, DBT, and Python to enhance usability and accessibility.

4. Networking, Security & Compliance

Configure and manage secure network connectivity for data ingestion.

Ensure compliance with GDPR, CISO policies, and industry security standards.

5. Data Quality & Governance

Ensure the Data Segregation Policy is strictly followed for all enabled data sets.

Implement data validation, anomaly detection, and quality assurance frameworks.

Collaborate with the Data Governance team to maintain compliance and integrate quality checks into data pipelines.

6. Real-Time & Batch Data Processing

Build and optimize real-time streaming and batch processing solutions using Kafka, Kinesis, or Apache Airflow.

Ensure high-throughput, low-latency data processing efficiency.

7. Stakeholder Collaboration & Business Alignment

Work closely with business stakeholders, analysts, and data teams to deliver tailored solutions.

Translate complex technical insights for both technical and non-technical audiences.

8. Performance Optimization & Continuous Improvement

Identify and implement automation, cost optimization, and efficiency improvements in data pipelines.

Qualifications & Experience

Education:

Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field (Master's preferred).

Experience:

8 years in data engineering, analytics engineering, or related fields.

Proven experience with Snowflake, DBT, SnapLogic, and modern data technologies.

Expertise in cloud data platforms, real-time & batch processing, and CI/CD automation.

Strong background in data modeling, architecture, and cost-efficient pipeline management.

Experience in deploying data segregation policies, especially logical segregation; ideally, the ability to design these policies or to support teams with the design.

Technical Skills

Data Engineering & Warehousing: Snowflake (must have), DBT (must have), SnapLogic, ETL/ELT, APIs, data warehousing & lakehouse architecture.

Programming & Scripting: Advanced SQL, Python, DBT, Bash/shell scripting.

Cloud & Infrastructure: AWS/Azure/GCP, Terraform, CloudFormation, security (IAM, VPN, encryption).

Data Processing: Kafka, Kinesis, Apache Airflow, Dagster, Prefect.

DevOps & CI/CD: Git, GitHub Actions, Jenkins, Docker, Kubernetes.

Data Governance & Quality: Data validation, metadata management, GDPR/CCPA compliance.

Soft Skills

Strong communication skills for both technical and non-technical audiences.

Aptitude for bridging the gap between technical and business-focused groups.

Comfortable engaging with individuals at all organisational levels.

Problem-solving and analytical mindset; proactive and solution-oriented.

Agile delivery experience (Scrum, Kanban) with the ability to scope, estimate, and deliver tasks independently.

Innate drive to learn and adopt new skills and technologies.

Proficiency in managing projects efficiently and addressing challenges collaboratively and proactively.

Please find below the submission tracker format:

Name

Skill

Notice Period

Preferred Location

Contact Number

Personal E-Mail

Current Company

Total Yrs of Exp

Relevant Yrs of Exp

Expected CTC

Employment Type: Full Time

