
Senior Data Engineer

Employer Active

1 Vacancy

Jobs by Experience

10 years

Job Location

Orlando, FL - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Do you love a career where you Experience, Grow & Contribute at the same time while earning at least 10% above the market? If so, we are excited to have found you.


We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.


Role: Senior Data Engineer

Location: Orlando, FL
Experience: 10 Years


Requirements

We are seeking a highly skilled Senior Data Engineer with 10 years of experience to join our team and support large-scale data initiatives. This role demands deep expertise in Snowflake, Python, DBT, and AWS services, along with proven experience building robust, scalable ELT pipelines using tools like Apache Airflow. The ideal candidate is a strong communicator, an independent contributor, and 100% hands-on in all relevant technologies.


Key Responsibilities:

  • Design, develop, and maintain ELT data pipelines using Apache Airflow, DBT, and Snowflake in an AWS environment.

  • Write modular, production-grade Python code for data ingestion, transformation, and automation tasks.

  • Develop complex DBT models (incremental models, snapshots) with reusable macros and documentation.

  • Implement data pipeline orchestration and job scheduling using Apache Airflow (MWAA).

  • Optimize SQL queries and Snowflake performance (partitioning, clustering, caching).

  • Integrate and process data from multiple sources, including REST APIs.

  • Set up and maintain monitoring and logging using AWS CloudWatch.

  • Follow best practices for cloud security, including IAM roles, encryption, and access control.

  • Participate in version control and CI/CD processes using Git.

  • Collaborate with cross-functional teams in a client-facing capacity.


Required Technical Skills:

Data Engineering & ELT:

  • Strong experience with Snowflake (data modeling, optimization, clustering, caching).

  • Deep SQL expertise (complex joins, CTEs, window functions, subqueries, performance tuning).

  • Proficient with DBT (Data Build Tool):

    • Creating and managing DBT models, macros, tests, and documentation.

    • Version control with Git and setting up CI/CD for DBT pipelines.

Programming:

  • Strong hands-on skills in Python:

    • Use of Pandas and NumPy for data manipulation.

    • Scripting for automation and API integration.

    • Writing custom Jinja/DBT macros and conditional logic.
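As a rough illustration of the Pandas/NumPy skills listed above, the sketch below flattens an invented API-style payload and aggregates it before loading; the schema, field names, and thresholds are assumptions for the example, not details from this posting.

```python
import numpy as np
import pandas as pd

def transform_orders(payload: list) -> pd.DataFrame:
    """Flatten raw order records and derive summary columns (illustrative schema)."""
    df = pd.DataFrame(payload)
    df["order_date"] = pd.to_datetime(df["order_date"])
    # NumPy vectorized conditional: flag large orders without a Python loop.
    df["is_large"] = np.where(df["amount"] >= 1000, True, False)
    # Aggregate per customer, a common step before loading to a warehouse.
    summary = (
        df.groupby("customer_id", as_index=False)
          .agg(total_amount=("amount", "sum"), orders=("order_id", "count"))
    )
    return summary

# Hypothetical payload, shaped like a REST API response body.
sample = [
    {"order_id": 1, "customer_id": "A", "order_date": "2024-01-05", "amount": 1200.0},
    {"order_id": 2, "customer_id": "A", "order_date": "2024-01-09", "amount": 300.0},
    {"order_id": 3, "customer_id": "B", "order_date": "2024-01-07", "amount": 150.0},
]
result = transform_orders(sample)
```

In a real pipeline the output would typically be written to S3 or staged into Snowflake rather than held in memory.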

AWS Cloud Platform:

  • AWS S3 (data lake architecture, security best practices).

  • AWS Lambda (serverless compute for lightweight ETL tasks).

  • AWS Redshift (data warehousing, performance optimization).

  • AWS Glue (ETL orchestration; nice to have).

  • Amazon Athena (querying S3 using SQL; nice to have).

  • IAM (access control), KMS/encryption, and best practices for cloud security.

  • AWS CloudWatch (monitoring and alerting for Airflow and data pipeline failures).
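The Lambda bullet above can be sketched as a minimal handler for a lightweight ETL step. A real deployment would read from and write to S3 with boto3; here the records are passed inline and the event shape and field names are invented, so the example stays self-contained.

```python
import json

def handler(event, context=None):
    """Minimal AWS Lambda-style handler sketch (illustrative event shape)."""
    records = event.get("records", [])
    # Light transformation: drop rows with no amount, normalize field names.
    cleaned = [
        {"id": r["id"], "amount_usd": round(float(r["amount"]), 2)}
        for r in records
        if r.get("amount") is not None
    ]
    # A production handler would write `cleaned` back to S3 via boto3 here.
    return {
        "statusCode": 200,
        "body": json.dumps({"count": len(cleaned), "rows": cleaned}),
    }
```

The `statusCode`/`body` return shape mirrors the convention Lambda uses behind an API Gateway proxy integration.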

Workflow Orchestration:

  • Expertise in Apache Airflow (preferably MWAA):

    • DAG creation and monitoring.

    • Integration with DBT, S3, Redshift, and external APIs.
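A minimal sketch of the orchestration pattern described above: an Airflow DAG (MWAA-compatible) that runs a DBT build after an ingestion step. The script paths, project directory, and schedule are assumptions for illustration, not details from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # on Airflow < 2.4 this parameter is schedule_interval
    catchup=False,
) as dag:
    # Step 1: land raw files from the source API into S3 (hypothetical script).
    ingest = BashOperator(
        task_id="ingest_to_s3",
        bash_command="python /opt/pipelines/ingest.py",
    )
    # Step 2: run DBT models against Snowflake once ingestion succeeds.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt_project",
    )
    ingest >> dbt_run
```

The `>>` operator declares the dependency, so `dbt_run` is scheduled only after `ingest_to_s3` completes successfully.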


Must-Have Qualifications:

  • 10 years of overall IT experience with at least 5 years in Data Engineering.

  • Client-facing experience with excellent verbal and written communication.

  • Ability to work independently with minimal supervision (Individual Contributor).

  • Proven track record of delivering large-scale ELT solutions in cloud environments.


Preferred/Optional Experience:

  • Experience with MWAA (Managed Workflows for Apache Airflow).

  • Familiarity with AWS Glue for ETL development.

  • Understanding of data governance and quality frameworks.

  • Familiarity with financial-domain data structures, compliance, and reporting.


Soft Skills:

  • Problem-solving mindset and attention to detail.

  • Strong documentation and organizational skills.

  • Comfortable collaborating in agile/scrum teams.



Skills

Snowflake, Python, SQL, DBT, Airflow, AWS, S3, Lambda, Redshift, CloudWatch, Git, CI/CD, Pandas, NumPy, REST, Jinja, IAM, Encryption, Monitoring, Orchestration, Automation, Data Modeling, Performance Tuning, Troubleshooting, Communication

Education

B.E/

Employment Type

Full Time

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact us page.