Randstad is seeking a Senior Data & AI Platform Engineer for an onsite role in Washington, DC, supporting a premier national transportation leader. This pivotal role is designed for a technical visionary who will architect and scale a unified self-service data ecosystem on a modern Databricks lakehouse architecture. You will bridge the gap between complex data engineering and advanced AI, leading the design of automated pipelines, MLOps frameworks, and API-first integrations while mentoring a talented team of engineers. This is a unique opportunity to modernize critical infrastructure and drive data-driven innovation for an organization that connects communities across America.
Key Responsibilities
- Architectural Leadership: Lead the evolution of a scalable Data & AI platform, integrating Databricks, SAP (Datasphere/S/4), and Denodo virtualization into a governed self-service ecosystem.
- Solution Delivery: Act as a hands-on lead developer for complex data pipelines, feature stores, and API-driven integrations that power enterprise-wide analytics and digital experiences.
- MLOps & Automation: Design and implement production-grade MLOps pipelines, including versioning, CI/CD, and monitoring, to accelerate the deployment of intelligent models.
- Engineering Excellence: Establish and enforce standards for ingestion, transformation, and governance-as-code controls embedded directly into technical workflows.
- Mentorship: Foster a culture of excellence by providing technical guidance, code reviews, and professional development for junior and mid-level engineers.
- Strategic Collaboration: Partner with architects, product owners, and governance leads to align technical solutions with the broader enterprise data roadmap.
Qualifications & Skills
- Education: Bachelor's degree in Computer Science, Data Engineering, or a related technical field (equivalent experience considered).
- Experience: 4-6 years of deep technical experience in data engineering, software development, or data architecture.
- Technical Proficiency: Advanced expertise in Python and SQL, with significant experience in Apache Spark and modern lakehouse architectures (Databricks preferred).
- AI/ML Expertise: Proven experience building MLOps pipelines and internal platform services such as feature stores or semantic layers.
- Modern Infrastructure: Strong understanding of API-first and event-driven architectures, secure service-to-service communication, and RBAC security.
- Agile Leadership: Demonstrated ability to lead cross-functional teams through technical challenges within a scaled Agile environment.
- Soft Skills: Exceptional communication and problem-solving abilities, with the capacity to explain complex technical concepts to diverse stakeholders.
Background Check: No
Drug Screen: No