Important Information:
Years of Experience: 7 years in Product Management, Data Engineering, or Technical Program Management
Job Mode: Full-time
Work Mode: Remote
The Product Manager, Enterprise Data Lake will lead the delivery, execution, and rollout of a new enterprise data platform built on Snowflake or Databricks. This role is responsible for translating business and data requirements into technical execution plans, driving the development of ingestion pipelines, data models, and cloud infrastructure to enable scalable analytics and data science operations. The ideal candidate combines strong technical understanding with exceptional project management and stakeholder coordination skills.
Responsibilities:
Own the delivery and execution of an enterprise data lake on Snowflake or Databricks, from initial setup through production rollout.
Translate business and data requirements into clear, actionable technical stories and sprint plans for engineering teams.
Define and manage data ingestion and integration pipelines from core SaaS platforms and enterprise systems.
Partner with data engineers and architects to design efficient data models, storage layers, and optimization strategies.
Oversee the full development lifecycle, ensuring milestones, dependencies, and deliverables are met on time.
Prioritize and manage the product backlog, balancing business impact, technical complexity, and cross-functional dependencies.
Build and maintain a phased delivery roadmap including MVP and subsequent enhancement releases.
Collaborate with infrastructure and DevOps teams to establish reliable, scalable, and cost-effective cloud environments.
Define and monitor operational KPIs such as data pipeline success rates, processing latency, and system uptime.
Ensure readiness for production release, including documentation, monitoring, and a smooth handoff to analytics and data science teams.
Requirements:
Proven experience leading data platform or data lake initiatives in cloud environments.
Deep understanding of agile methodologies, sprint planning, and backlog management.
Strong technical literacy in data architecture, pipelines, and storage optimization.
Excellent communication and stakeholder management skills across technical and business domains.
Ability to balance short-term delivery with long-term platform scalability and maintainability.
Hands-on experience with data lake solutions on Snowflake or Databricks.
Familiarity with data integration tools and ETL/ELT pipelines (e.g., Airflow, dbt, Fivetran, Azure Data Factory).
Knowledge of cloud infrastructure (AWS, Azure, or GCP) and DevOps processes for data environments.
Understanding of enterprise data governance, security, and compliance frameworks.
Ability to define KPIs for operational efficiency and data reliability.
Tech Stack:
Snowflake / Databricks
Python, SQL, dbt, Airflow
AWS / Azure / GCP
ETL/ELT tools (Fivetran, Data Factory, Glue, etc.)
Agile tools (Jira, Confluence)
Key Skills:
Data Platform Product Management
Agile Execution and Roadmapping
Cross-Functional Collaboration
Cloud Data Architecture Understanding
Backlog Prioritization
KPI Tracking and Reporting
Encora is the preferred digital engineering and modernization partner of some of the world's leading enterprises and digital-native companies. With over 9,000 experts in 47 offices and innovation labs worldwide, Encora's technology practices include Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering.
At Encora, we hire professionals based solely on their skills and qualifications and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.
Required Experience:
IC (Individual Contributor)
As Encora Inc. expands its footprint in Latin America, its acquisition of Nearsoft provides our clients with a unique opportunity to nearshore on a global scale.