Role: Senior Data Engineer
Location: Remote
Duration: 6 Months
Scope of Work
- Manage all new data ingestions for UKS operational needs
- Build new data integrations for all new data sources, including third-party data providers
- Manage new data egress patterns to send/share data outputs from Data 360 to multiple destinations
- Ingest new data via existing ingestion pipelines
- Monitor, maintain, and troubleshoot data pipelines
- Track and report statistics, including data volume, number of data streams, and other ingestion KPIs
Required Skills & Qualifications
- Strong experience in data engineering, ETL (preferably Informatica), and data architecture
- Experience working with Salesforce platforms and data models
- Hands-on experience with Airflow and MuleSoft
- Strong SQL skills with Snowflake and other relational databases
- Experience writing complex Python modules for custom ETL workflows
- Expertise in building batch and real-time data pipelines
- Experience with dbt for data transformation and modeling
- Familiarity with AWS/GCP and distributed/event-driven architectures
- Strong understanding of data modeling, data warehousing, and big data processing
- Experience with data quality monitoring and pipeline orchestration frameworks
- Ability to work closely with Product, Data Science, and Analytics teams
- Experience working on Salesforce Flows and Apex is a plus