Senior Data Engineer
HIGHLIGHTS
Location: Scottsdale AZ 85251
Position Type: Direct Hire
Hourly / Salary: Based on Experience
Residency Status: US Citizen or Green Card Holder ONLY
Overview:
Our client is seeking a Senior Data Engineer with the technical expertise, architectural insight, and problem-solving skills to design, build, and maintain their modern data platform. The ideal candidate will be a hands-on engineer who can deliver at scale, ensure data quality, and work across teams to support both operational and analytical workloads.
This role is critical to the client's data engineering practice. They are looking for a hands-on leader who can operate in both structured and fast-changing environments while delivering high-quality, scalable solutions.
Responsibilities:
Design, build, and maintain scalable data pipelines (ETL/ELT) for both batch and streaming use cases.
Implement data validation and integrity frameworks, ensuring accuracy, completeness, and reconciliation across systems.
Administer and optimize cloud-based data services (AWS, Snowflake, Databricks, etc.).
Ensure compliance with data governance and regulations (PCI-DSS, GDPR, CPRA, SOX, etc.).
Deliver proofs of concept and lightweight prototypes to validate architectural improvements.
Collaborate with business units, audit teams, and stakeholders to align data architecture with organizational needs.
Support reporting and analytics teams by enabling Tableau (or future-state tools) through well-structured data models and governance.
Seek out new work proactively and mentor other team members.
Own projects end-to-end and deliver measurable results.
Diagnose and resolve complex data issues quickly.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or equivalent experience.
10 years of experience in data engineering.
Mastery of complex SQL for loading data, including complex joins, subqueries, window functions, and common table expressions.
Advanced proficiency in Python, developing and supporting robust pipelines, frameworks, and automation solutions.
Proven history of designing and managing data warehousing solutions, including proper loading techniques, star schema modeling, slowly changing dimensions, and aggregate strategies.
Strong understanding of OLTP modeling (3NF) and how to denormalize for performance or downstream use.
Production experience with unstructured and semi-structured data (e.g., JSON, MongoDB, APIs) and integrating it into enterprise data ecosystems.
Practical expertise implementing and optimizing pipeline orchestration with tools such as Airflow or Prefect.
Experience implementing and optimizing third-party transformation tools such as dbt, Talend, or Fivetran.
Demonstrated success in delivering scalable solutions using cloud-based data platforms like Snowflake or Databricks.
Applied experience leveraging AWS data services (S3, Glue, Redshift, Lambda, Secrets Manager, etc.) in secure and cost-effective ways.
Demonstrated ability to ensure compliance with regulatory requirements including PCI-DSS, GDPR, CPRA, CCPA, and SOX.
Effective communication skills for engaging both technical and non-technical audiences and collaborating with business and audit stakeholders.
Proven capability mentoring peers, promoting best practices, and contributing to team growth.
High adaptability to changing requirements, emerging technologies, and competing priorities.
Nice to have: Hands-on experience with Data Lake and Lakehouse technologies (e.g., Delta Lake, Iceberg, Hudi) for managing large-scale data assets.
We are GTN, The Go To Network.