Position: Sr. Data Engineer
Location: Reston, VA #Hybrid
Duration: 6 months #C2H
Rate: $60/hr
Job Description:
- We are seeking an experienced Senior Data Engineer (Contractor) to join our Federal Employee Program technology platform team.
- This role is a critical contributor in designing, developing, and optimizing cloud-based data solutions using Snowflake.
- You'll leverage advanced Snowflake capabilities, build modern data pipelines, and enable scalable analytics and reporting for enterprise healthcare operations.
- The ideal candidate will demonstrate deep Snowflake and SQL expertise, hands-on experience with Snowpark (Python), and a strong foundation in data architecture, governance, and automation.
Key Responsibilities:
- Design, develop, and optimize data pipelines and transformations within Snowflake using SQL and Snowpark (Python).
- Build and maintain Streams, Tasks, Materialized Views, and Dashboards to enable real-time and scheduled data operations.
- Develop and automate CI/CD pipelines for Snowflake deployments (Jenkins).
- Collaborate with data architects, analysts, and cloud engineers to design scalable and efficient data models.
- Implement data quality, lineage, and governance frameworks aligned with enterprise standards and compliance requirements (e.g., HIPAA, PHI/PII).
- Monitor data pipelines for performance, reliability, and cost efficiency; proactively optimize workloads and resource utilization.
- Integrate Snowflake with dbt and Kafka for end-to-end orchestration and streaming workflows.
- Conduct root-cause analysis and troubleshooting for complex data and performance issues in production.
- Collaborate across technology and business teams to translate complex data needs into elegant, maintainable solutions.
Required Skills & Experience:
- 5 years of experience in data engineering or an equivalent field.
- 3 years of hands-on experience with the Snowflake Data Cloud, including:
o Streams, Tasks, Dashboards, and Materialized Views
o Performance tuning, resource monitors, and warehouse optimization
- Strong proficiency in SQL (complex queries, stored procedures, optimization).
- Proficiency in Python, with demonstrated experience using Snowpark for data transformations.
- Experience building CI/CD pipelines for Snowflake using modern DevOps tooling.
- Solid understanding of data modeling methodologies (Kimball, Data Vault, or 3NF).
- Experience with data governance, lineage, and metadata tools (Collibra, Alation, or Azure Purview).
- Strong troubleshooting, analytical, and communication skills, with the ability to engage both technical and business audiences.
Preferred Qualifications:
- Experience with dbt or Kafka for orchestration and streaming.
- Exposure to data quality frameworks such as Great Expectations or Monte Carlo.
- Understanding of real-time and batch data ingestion architectures.
- Snowflake Certification (SnowPro Core or Advanced).
- Prior experience in health insurance or other regulated data environments.
Soft Skills & Professional Attributes:
- Excellent problem-solving and root-cause analysis capabilities.
- Strong communication and documentation skills across technical and non-technical audiences.
- Proven ability to work collaboratively in Agile or cross-functional DevOps teams.
- A growth mindset with a commitment to continuous learning and process improvement.
- Ability to thrive in a fast-paced, mission-driven environment supporting critical healthcare data operations.
Thanks & Regards
--
LAXMAN
Team Lead - Talent Acquisition
KMM Technologies Inc.
CMMI Level 2, ISO 9001, ISO 20000, ISO 27000 Certified
WOSB, SBA 8(A), MDOT MBE & NMSDC MBE
Contract Vehicles: 8(a) STARS III & Schedule 70
Tel: Email: