Summary of Key Responsibilities
Responsibilities and essential job functions include, but are not limited to, the following:
Lead large-scale, complex, cross-functional projects to build the technical roadmap for the WFM Data Services platform.
Review and approve design artifacts to ensure alignment with architectural standards.
Build and own automation and monitoring frameworks that provide reliable, accurate, and easy-to-understand metrics and operational KPIs for data pipeline quality.
Execute proof-of-concept (POC) evaluations of new technologies and tools to select the best solutions.
Support business objectives by collaborating with business partners to identify opportunities and drive resolution.
Communicate project status and issues to senior leadership and stakeholders.
Direct project teams and cross-functional teams on all technical aspects of projects.
Partner with engineering teams to build and support real-time, highly available data pipelines and technology capabilities.
Translate strategic requirements into actionable business requirements to ensure solutions meet business needs.
Define and implement data retention policies and procedures.
Define and implement data governance policies and procedures.
Identify, design, and implement internal process improvements, including automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
Enable teams to pursue insights and applied breakthroughs while driving solutions to enterprise scale.
Build infrastructure for optimal extraction, transformation, and loading (ETL) of data from a wide variety of structured and unstructured sources using big data technologies.
Develop analytics tools that utilize data pipelines to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
Perform root cause analysis to identify permanent resolutions for software or business process issues.
Basic Qualifications
Azure Data Engineer with 5 years of experience designing and implementing scalable data pipelines using Azure and Databricks to support large-scale data processing and advanced analytics.
Extensive experience building and optimizing ETL workflows using Databricks and PySpark to process and analyze massive datasets, ensuring high performance and data accuracy.
5 years of experience with object-oriented or object-functional scripting languages such as Python, Java, etc.
3 years of experience leading the development of large-scale cloud-based services on platforms such as AWS, GCP, or Azure, and developing and operating cloud-based distributed systems.
Experience building and optimizing data pipelines, architectures, and data sets.
Ability to build processes supporting data transformation, data structures, and metadata management.