Experience: 10 years
Salary: Not Disclosed
1 Vacancy
We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.
We are seeking a highly skilled Senior Data Engineer with 10 years of experience to join our team and support large-scale data initiatives. This role demands deep expertise in Snowflake, Python, DBT, and AWS services, along with proven experience in building robust, scalable ELT pipelines using tools like Apache Airflow. The ideal candidate is a strong communicator, an independent contributor, and 100% hands-on in all relevant technologies.
Design, develop, and maintain ELT data pipelines using Apache Airflow, DBT, and Snowflake in an AWS environment (a minimal DAG sketch follows this list).
Write modular, production-grade Python code for data ingestion, transformation, and automation tasks.
Develop complex DBT models (incremental models, snapshots) with reusable macros and documentation.
Implement data pipeline orchestration and job scheduling using Apache Airflow (MWAA).
Optimize SQL queries and Snowflake performance (partitioning, clustering, caching).
Integrate and process data from multiple sources including REST APIs.
Set up and maintain monitoring and logging using AWS CloudWatch.
Follow best practices for cloud security, including IAM roles, encryption, and access control.
Participate in version control and CI/CD processes using Git.
Collaborate with cross-functional teams in a client-facing capacity.
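To give a concrete flavor of the orchestration work above, here is a minimal sketch of an Airflow DAG that runs and tests a DBT project on a schedule. The DAG id, dbt project path, and schedule are hypothetical placeholders, not a prescribed implementation.

```python
# Minimal Airflow DAG sketch: orchestrate a scheduled dbt run against Snowflake.
# All names (dag_id, project path) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",                  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build the dbt models; Snowflake credentials live in profiles.yml.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",   # hypothetical path
    )

    # Validate the freshly built models before downstream consumers read them.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```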
Strong experience with Snowflake (data modeling, optimization, clustering, caching); a short query-tuning sketch appears after the Airflow items below.
Deep SQL expertise (complex joins, CTEs, window functions, subqueries, performance tuning).
Proficient with DBT (Data Build Tool):
Creating and managing DBT models, macros, tests, and documentation.
Version controlling with Git and setting up CI/CD for DBT pipelines.
Strong hands-on skills in Python (a short ingestion sketch follows this sub-list):
Use of Pandas and NumPy for data manipulation.
Scripting for automation and API integration.
Writing custom Jinja/DBT macros and conditional logic.
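As an illustration of the Python skills listed above, the following is a minimal sketch of REST API ingestion into a Pandas DataFrame. The endpoint, response shape, and column names are hypothetical placeholders.

```python
# Minimal sketch: pull paginated records from a REST API into Pandas.
# Endpoint and field names below are hypothetical.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/transactions"  # hypothetical endpoint

def fetch_page(page: int) -> list[dict]:
    """Fetch one page of records from the REST API."""
    resp = requests.get(API_URL, params={"page": page}, timeout=30)
    resp.raise_for_status()
    return resp.json()["results"]  # assumed response shape

def ingest(pages: int) -> pd.DataFrame:
    """Pull several pages, normalize to a DataFrame, and clean types."""
    records = [row for page in range(1, pages + 1) for row in fetch_page(page)]
    df = pd.DataFrame.from_records(records)
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")      # hypothetical column
    df["created_at"] = pd.to_datetime(df["created_at"], utc=True)    # hypothetical column
    return df.drop_duplicates(subset="id")

if __name__ == "__main__":
    print(ingest(pages=3).head())
```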
AWS S3 (data lake architecture, security best practices).
AWS Lambda (serverless compute for lightweight ETL tasks); a minimal handler sketch follows these AWS items.
AWS Redshift (data warehousing, performance optimization).
AWS Glue (ETL orchestration; nice to have).
Amazon Athena (querying S3 using SQL; nice to have).
IAM (access control), KMS/encryption, and best practices for cloud security.
AWS CloudWatch (monitoring and alerting for Airflow and data pipeline failures).
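For the AWS services above, here is a minimal sketch of a lightweight ETL Lambda handler using boto3, with its log output landing in CloudWatch. The bucket layout and filter column are hypothetical, and the function's IAM role is assumed to grant only the S3 permissions it needs.

```python
# Minimal sketch of a lightweight ETL Lambda triggered by an S3 put event.
# Bucket layout and the "status" filter column are hypothetical.
import csv
import io
import logging

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)  # log lines flow to CloudWatch automatically

s3 = boto3.client("s3")

def handler(event, context):
    """Read the uploaded CSV, keep active rows, write to a curated prefix."""
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]
    logger.info("Processing s3://%s/%s", bucket, key)

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = [r for r in csv.DictReader(io.StringIO(body))
            if r.get("status") == "active"]  # hypothetical filter

    out = io.StringIO()
    if rows:
        writer = csv.DictWriter(out, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

    # The Lambda's IAM role only needs s3:GetObject and s3:PutObject here.
    s3.put_object(Bucket=bucket, Key=f"curated/{key}",
                  Body=out.getvalue().encode("utf-8"))
    return {"rows_written": len(rows)}
```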
Expertise in Apache Airflow (preferably MWAA):
DAG creation and monitoring.
Integration with DBT, S3, Redshift, and external APIs.
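Tying together the Snowflake and SQL expectations above, here is a minimal Python sketch using the Snowflake connector to apply a clustering key and run a window-function query. The account, credentials, and table names are hypothetical placeholders; real credentials should come from a secrets manager.

```python
# Minimal Snowflake sketch: clustering for micro-partition pruning plus a
# window-function query. Account, credentials, and table are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",   # hypothetical account locator
    user="ELT_USER",
    password="***",                # placeholder; use a secrets manager in practice
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

cur = conn.cursor()
try:
    # Cluster a large fact table on its most common filter column so
    # Snowflake can prune micro-partitions on date-range scans.
    cur.execute("ALTER TABLE fct_orders CLUSTER BY (order_date)")  # hypothetical table

    # Latest order per customer over the last 30 days; filtering on the
    # clustering key keeps the scan narrow, QUALIFY trims the window output.
    cur.execute("""
        SELECT customer_id, order_id, order_date
        FROM fct_orders
        WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY customer_id ORDER BY order_date DESC
        ) = 1
    """)
    for row in cur.fetchmany(10):
        print(row)
finally:
    cur.close()
    conn.close()
```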
10 years of overall IT experience, with at least 5 years in Data Engineering.
Client-facing experience with excellent verbal and written communication.
Ability to work independently with minimal supervision (Individual Contributor).
Proven track record of delivering large-scale ELT solutions in cloud environments.
Experience with MWAA (Managed Workflows for Apache Airflow).
Familiarity with AWS Glue for ETL development.
Understanding of data governance and quality frameworks.
Familiarity with financial-domain data structures, compliance, and reporting.
Problem-solving mindset and attention to detail.
Strong documentation and organizational skills.
Comfortable collaborating in agile/scrum teams.
Education
B.E/
Full Time