Our client, a global healthcare company, is seeking an Enterprise Data Platform Lead Consultant highly proficient in AWS, Python, and Snowflake to spearhead the design and implementation of its Enterprise Data Platform (EDP) solutions for cross-domain reporting and analytics. You will drive cloud-based data integration, storage, and curation using AWS, Python, and Snowflake, ensuring alignment with the strategic initiatives of client programs, including but not limited to the Spectra to Quest Lab transition. This role demands technical leadership in scenarios where data gravity necessitates EDP-based reporting outside SAP Datasphere.
This is a remote contract role with minimal travel. The contract runs through the end of 2025, with likely renewal well into 2026.
Job Responsibilities:
- Lead the delivery of Enterprise Data Platform reporting capabilities for cross-domain reporting and analytical needs.
- Architect and deliver cloud-based analytical solutions leveraging AWS, Python, and Snowflake.
- Design and implement end-to-end data integration, storage, and curation pipelines for high-performance analytical use cases.
- Function as the technical leader in implementing EDP solutions that support data-intensive initiatives within the client's program, especially where reporting must occur outside of SAP Datasphere due to data gravity considerations.
- Collaborate with data engineers, analysts, and business units to capture requirements and translate them into effective data models and pipelines.
- Ensure scalability, governance, and security are core to the EDP solution design.
- Support and guide project teams, enforcing data platform architecture best practices and performance optimization strategies.
Requirements:
- 5+ years designing Enterprise Data Platforms, with expertise in AWS (certifications preferred), Python (Pandas, PySpark), and Snowflake.
- Proficiency in data integration tools (e.g., Apache Airflow, dbt, Fivetran) and SQL/NoSQL databases.
- Hands-on experience with data lakehouses, real-time analytics, and cloud security frameworks.
- Experience leading large-scale migrations (e.g., legacy to cloud) and multi-domain data curation.
Preferred Qualifications:
- AWS Certified Solutions Architect, Snowflake SnowPro Core/Advanced, or Python certifications.
- Familiarity with Databricks, Tableau, or Power BI is a plus.
- Fluent in English; ability to collaborate with global teams across EU time zones.
- Strong problem-solving skills and stakeholder management for technical and non-technical audiences.