Data Full Stack Engineer (Workday ERP & Databricks)
Job Summary
We are seeking a Data Full Stack Engineer with strong expertise in Workday ERP (Finance & HCM) and modern data engineering platforms (Databricks).
The ideal candidate will own the end-to-end data lifecycle, including extracting Workday data, building scalable pipelines, and delivering analytics-ready datasets for Finance and HR use cases.
This role bridges ERP domain knowledge and data engineering, enabling reliable, governed data solutions in a Lakehouse architecture.
Key Responsibilities
Workday Integration & Data Extraction
- Design and develop integrations to extract data from:
- Workday Financial modules (GL, AP, AR, Invoicing, Supplier/Customer data)
- Workday HCM modules (Workers, Compensation, Absence, Recruiting)
- Build and maintain:
- RaaS (Reports-as-a-Service)
- WQL (Workday Query Language) reports
- Develop integrations using:
- REST / SOAP APIs
- Manage:
- Integration System Users (ISUs)
- Security roles and access controls
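For illustration, a RaaS extraction of the kind this role covers might look like the sketch below. The tenant, report owner, and report name are hypothetical placeholders; the URL pattern follows Workday's standard `customreport2` RaaS endpoint.

```python
# Minimal sketch: pulling a Workday custom report via RaaS (Reports-as-a-Service).
# Tenant, owner, and report names below are illustrative placeholders.
from urllib.parse import urlencode

def raas_report_url(host: str, tenant: str, owner: str, report: str,
                    fmt: str = "json", **params) -> str:
    """Build the RaaS endpoint URL for a Workday custom report."""
    base = f"https://{host}/ccx/service/customreport2/{tenant}/{owner}/{report}"
    query = urlencode({**params, "format": fmt})
    return f"{base}?{query}"

# Example: extract GL journal lines for a period (names are hypothetical)
url = raas_report_url("wd2-impl-services1.workday.com", "acme_tenant",
                      "ISU_DataPipeline", "CR_GL_Journal_Lines",
                      period="2024-Q1")
# An ISU (Integration System User) would authenticate the actual request, e.g.:
# resp = requests.get(url, auth=(ISU_USER, ISU_PASSWORD), timeout=60)
```

In practice the ISU's credentials and security-group membership determine which report fields the extraction can see.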
Data Engineering (Databricks)
- Build scalable ETL/ELT pipelines using:
- PySpark
- Spark SQL
- Delta Lake
- Design and implement:
- Lakehouse architecture (Bronze / Silver / Gold layers)
- Optimize pipelines for:
- Performance
- Reliability
- Cost efficiency
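A minimal sketch of a Bronze-to-Silver promotion in this Lakehouse layout, assuming Databricks with Delta Lake; the path convention, table names, and column names are illustrative, and the Spark calls are shown as comments since they require a cluster.

```python
# Minimal sketch of the medallion (Bronze / Silver / Gold) path convention
# and a Bronze -> Silver Delta Lake step. All names are illustrative.

def delta_path(layer: str, table: str, root: str = "/mnt/lakehouse") -> str:
    """Medallion-layer path convention: <root>/<layer>/<table>."""
    if layer not in {"bronze", "silver", "gold"}:
        raise ValueError(f"unknown layer: {layer}")
    return f"{root}/{layer}/{table}"

# On a Databricks cluster, the promotion itself might look like (PySpark):
#   raw = spark.read.format("delta").load(delta_path("bronze", "workday_gl"))
#   clean = (raw.dropDuplicates(["journal_id", "line_number"])
#               .filter("status = 'Posted'"))
#   clean.write.format("delta").mode("overwrite").save(delta_path("silver", "workday_gl"))
```

Keeping the layer/path convention in one function makes pipelines easier to audit and cheaper to refactor when storage layouts change.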
Data Modeling & Transformation
- Develop enterprise-scale data models
- Transform Workday data into:
- Analytics-ready datasets
- Ensure data quality and consistency
Data Governance & Security
- Implement:
- Data quality checks
- Validation frameworks
- Monitoring and alerting
- Manage:
- Access control
- Data lineage
- Compliance requirements
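A row-level data quality check of the kind listed above could be sketched as follows; the field names (`employee_id`, `compensation`, hire/termination dates) are hypothetical examples of a Workday HCM feed, not a fixed schema.

```python
# Minimal sketch of a row-level data quality rule set for a Workday HCM feed.
# Field names are illustrative placeholders.
def validate_worker(row: dict) -> list[str]:
    """Return a list of rule violations for one worker record."""
    errors = []
    if not row.get("employee_id"):
        errors.append("missing employee_id")
    if row.get("compensation", 0) < 0:
        errors.append("negative compensation")
    hire, term = row.get("hire_date"), row.get("termination_date")
    if hire and term and term < hire:
        errors.append("termination before hire")
    return errors
```

In a pipeline, records failing these rules would typically be quarantined to an error table and surfaced through the monitoring and alerting layer rather than silently dropped.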
Business & Analytics Enablement
- Collaborate with:
- Finance, HR, and Analytics teams
- Deliver:
- Reporting datasets
- Dashboards (Power BI / Tableau)
- Support:
- Business insights and decision-making
Collaboration & Agile Delivery
- Work in Agile/Scrum environments
- Partner with:
- Workday consultants
- Data engineers
- Business stakeholders
Required Skills
Workday Expertise (MANDATORY)
- Workday Financial Management (GL, AP, AR, etc.)
- Workday Reporting (RaaS, WQL)
- Workday integrations (REST/SOAP APIs)
Data Engineering
- Databricks (Azure preferred)
- PySpark, Spark SQL, Delta Lake
- ETL/ELT pipeline development
Programming & Databases
- Python
- SQL (Advanced)
- Data modeling
Cloud
- Azure (preferred) / AWS / GCP
Additional Skills
- Data governance & security
- CI/CD pipelines
- API integration
Experience Required
- 5-10 years of experience
- Minimum:
- 3 years in Workday
- 3 years in Databricks / data engineering
Good to Have
- Power BI / Tableau
- Unity Catalog (Databricks governance)
- Experience with Finance & HR analytics
- Knowledge of AI/ML data pipelines