Principal Data Engineer
Job Location

Irvine - USA

Salary

$175,000 - $195,000

Vacancy

1 Vacancy

Job Description

Job Details

XPO Headquarters - Irvine, CA
Full Time
$175,000.00 - $195,000.00 Salary

Description

Who We Are:

Xponential Fitness is the curator of leading brands across every vertical in the boutique fitness industry. Xponential Fitness's portfolio of brands includes Club Pilates, the nation's largest Pilates brand; CycleBar, the nation's largest indoor cycling brand; StretchLab, a concept offering one-on-one and group stretching services; YogaSix, the largest franchised yoga brand; Pure Barre, a total-body workout that uses the ballet barre to perform small isometric movements; Rumble, a boxing-inspired full-body workout; BFT, a functional training and strength-based fitness program; and Lindora, a medically supervised weight loss clinic.

Job Overview:

The Principal Data Engineer will lead the design and evolution of Xponential Fitness's enterprise data architecture, enabling scalable, secure, and high-performing data infrastructure that powers real-time analytics, AI/ML capabilities, and strategic decision-making across the organization. This role sits at the intersection of data strategy and engineering execution, and is responsible for building an integrated data ecosystem that supports Xponential's diverse portfolio of brands and digital platforms. The Principal Data Engineer will partner cross-functionally with leaders in AI, application development, business intelligence, and cybersecurity to ensure the right data is available, trustworthy, and actionable across the enterprise. This leader will play a critical role in advancing the company's data modernization agenda, establishing best-in-class data practices, and unlocking value through insights, automation, and innovation.

Key Responsibilities:

Enterprise Data Architecture & Engineering

  • Design and implement resilient, cloud-native data architectures supporting both batch and real-time pipelines.
  • Lead the ingestion, transformation, and orchestration of data via Fivetran, Apache Airflow, and Python-based ETL/ELT.
  • Standardize pipelines from member management and point-of-sale systems, digital platforms, and MarTech tools into a centralized lakehouse and warehouse.
  • Partner with software engineering teams to ensure pipelines are CI/CD-enabled using GitHub Actions and CodePipeline.

Cloud Infrastructure & Platform Integration

  • Optimize compute, storage, and processing layers to ensure scalable, secure, and cost-effective data operations.
  • Integrate modern container orchestration, caching, and task automation approaches to support data enrichment, transformation, and delivery at scale.
  • Leverage infrastructure-as-code and CI/CD pipelines to standardize deployments and reduce operational overhead.
  • Align data platform architecture with application and DevOps workflows to support consistent, governed, and observable services across brands and environments.

AI/ML Data Enablement

  • Collaborate with AI engineers to enable end-to-end MLOps, feature engineering pipelines, and training data provisioning.
  • Ensure pipelines support model retraining, scoring, and inference workloads across ECS and Lambda environments.
  • Prepare time-series, transactional, and behavioral datasets for model consumption.

Governance, Security & Compliance

  • Define and enforce data governance policies, including lineage, metadata management, and data quality rules.
  • Implement encryption, RBAC, and masking strategies to protect personal and sensitive business data.
  • Ensure infrastructure and data flows meet regulatory and contractual obligations (e.g., SOX, PCI, GDPR).

Monitoring, Observability & Cost Optimization

  • Instrument data workflows with CloudWatch, Kinesis Firehose, Sumo Logic, Sentry, and New Relic for real-time visibility.
  • Tune Snowflake performance, control costs, and monitor data freshness across the platform.
  • Automate validation and anomaly detection to ensure continuous pipeline reliability.

Collaboration & Technical Leadership

  • Mentor data engineers, promoting best practices in scalable design, modular pipeline development, and IaC.
  • Lead architecture reviews and cross-functional design sessions across data, application, and security teams.
  • Translate technical decisions into business impact narratives for leadership and stakeholders.

Pay Range: $175,000 - $195,000

Benefits:

  • Medical, Dental, and Vision benefits
  • This role is eligible for a monthly cell phone allowance
  • Empower is our 401k provider. We offer Traditional and Roth 401k plans. The employer match is 4% and begins at the start of year 2. Your 401k would be fully vested at the start of year 3
  • Complimentary corporate memberships to XPLUS and XPASS
  • Discounts on retail brand merchandise - up to 30% off wholesale price
  • On-site gym
  • On-Campus Amenities: Reborn Coffee Shop, Hangar 24, Mini Putting Green, Basketball Court, Bird Sanctuary, Car Washing Services (M/W), Dry Cleaning Services

Xponential Fitness LLC provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

Qualifications:

  • Experience & Leadership: 10 years of experience in data engineering, cloud architecture, or big data infrastructure; 5 years in a senior or leadership capacity; a track record of building scalable data platforms; and proven ability to lead complex, cross-functional initiatives and influence architectural decisions across technology and business teams.
  • Technical Expertise:
      • Data Architecture & Pipelines: Expertise in ELT/ETL design, real-time streaming, data modeling, and orchestration frameworks.
      • Cloud Services & Infrastructure: Hands-on experience with scalable compute (e.g., container-based workloads), relational and non-relational storage, caching systems, and infrastructure automation tools.
      • Modern Data Stack: Proficient in tools like Snowflake, dbt, Apache Airflow, and Fivetran, with orchestration via GitHub Actions or CodePipeline.
      • Programming & Automation: Strong skills in SQL and Python; experienced with CI/CD workflows and infrastructure-as-code.
      • Graph Databases: Familiarity with graph-based data modeling and platforms like Neo4j and Amazon Neptune for relationship-driven use cases.
      • Monitoring & Observability: Implementation of log aggregation, container monitoring, and data pipeline observability using tools such as CloudWatch, Sumo Logic, Sentry, or New Relic.
      • AI/ML & Analytics Enablement: Experience partnering with AI/ML teams to design pipelines that support model development, training, and deployment. Exposure to MLOps principles and feature engineering workflows.
      • Governance & Compliance: Familiarity with regulatory requirements (SOX, PCI, GDPR) and best practices for data security, access control, and metadata management.
  • Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.

Required Experience:

Staff IC

Employment Type

Full-Time
