
Technology and Operations - Data Architect IV

Job Location: Burbank - USA

Monthly Salary: Not Disclosed

Vacancies: 1

Job Description

Business Unit/Group: WBD Studio Economics
Requisition Number: 21605-1
Intended Start Date: 8/26/2025
Contract Duration: 1 year
Possibility for Extension / Conversion: Yes
Max Hourly Pay Rate: BR/hr
OT Required / Expected: No
WB Games Resource(s): No
CNN Resource(s): No

What We Do/Project

As part of the Studio Economics transformation, we are establishing a modern, governed, and reusable data foundation to power financial forecasting, title economics, sales planning, and AI-driven insights across the Studios.

The Senior Data Architect plays a critical role in designing this foundation, ensuring that data models, pipelines, and integration frameworks are scalable, performant, and aligned with enterprise data governance and platform goals.

Embedded within the Platform Pod, the Senior Data Architect partners with the Platform Owner, Cloud Architect, data engineering teams, and product-aligned pods to ensure the architecture supports both immediate product needs and long-term platform evolution. This role directly enables the delivery of reliable, real-time, and reusable data products across multiple Studio Economics workstreams.

Job Responsibilities / Typical Day in the Role
Design Scalable and Consistent Data Architecture
Define and maintain canonical data models, entity relationships, and semantic layer specifications that ensure consistent use of data across products and domains.
Develop and evolve logical and physical data models that support real-time analytics, forecasting, and scenario planning.
Collaborate with product-aligned pods to design domain-aligned data products that are modular, governed, and discoverable.
Build Reusable, Performant Data Pipelines
Architect data pipelines that support both batch and near real-time processing using AWS-native services (e.g. Glue, Kinesis, Lambda, Step Functions).
Guide ingestion, transformation, and enrichment strategies that optimize for resilience, scalability, and lineage traceability.
Work closely with the Cloud Architect to ensure that infrastructure and orchestration layers meet pipeline and data SLAs.
Embed Governance and Stewardship by Design
Partner with enterprise data governance teams to implement standardized metadata, lineage, and access controls using tools such as Lake Formation, Informatica, or Snowflake.
Define data quality rules, validation checkpoints, and anomaly detection processes to support trusted analytics and ML pipelines.
Contribute to the enterprise data catalog and enable self-service access through secure, well-documented APIs and schemas.
Collaborate Across Platform and Product Pods
Work with the Platform Owner to define and deliver shared data services and reusable semantic models that support multi-pod alignment.
Support data scientists and analysts by enabling ML/AI-ready data pipelines and structuring data to accelerate model development and deployment.
Participate in cross-pod architecture planning to coordinate integration strategies, resolve semantic conflicts, and align on domain boundaries.

Must Have Skills / Requirements
1) Experience in data architecture and engineering, with a focus on cloud-native data platforms and modern analytics workflows.
a. 7 years of experience designing and delivering data architecture for cloud-based platforms, with strong knowledge of AWS (e.g. Glue, Lambda, Step Functions, Lake Formation) and modern tooling (e.g. Snowflake, Databricks, Informatica).
2) Hands-On Pipeline Design and Orchestration
a. 7 years of experience architecting and optimizing complex batch and streaming data pipelines that are performant, resilient, and traceable, using orchestration frameworks that support real-time and ML/AI-ready processing.
3) Expertise in Canonical Modeling and Semantic Design
a. 7 years of experience designing scalable, reusable canonical and semantic data models and translating them into physical implementations aligned with business domains and analytic use cases.

Nice to Have Skills / Preferred Requirements
1) None

Functional Knowledge / Skills in the following areas:
1) You'll thrive in this role if you naturally:
a. Think in Domains and Products
b. You understand that good data architecture starts with clear business semantics, and you design models that reflect the real-world entities behind Studio workflows.
c. Bridge the Gap Between Models and Platforms
d. You work fluidly across logical design, physical deployment, and infrastructure orchestration, partnering with Cloud Architects and Engineers to bring your models to life.
e. Govern Through Enablement
f. You ensure compliance, lineage, and quality by embedding governance directly into design, making the right path the easy path for product and engineering teams.
g. Build for Reuse and Interoperability
h. You optimize for consistency and extensibility, designing assets and APIs that can be adopted across products, use cases, and future data science workflows.
i. Promote Transparency and Stewardship
j. You advocate for shared ownership of data definitions and practices, and help business and technical stakeholders understand the value of consistency and quality.
2) You're likely a fit for this role if you bring:
a. Cross-Functional Collaboration
b. Strong ability to work across disciplines, including engineering, analytics, product, and compliance, and to communicate design decisions in a way that drives alignment and adoption.
c. Bias for Structure and Clarity
d. You drive resolution of semantic conflicts, minimize redundancy, and create architectural clarity that simplifies downstream implementation.
e. Excellent collaboration and communication skills, with the ability to facilitate cross-functional alignment and translate architectural decisions across technical and business audiences.

Technology Requirements:
1) You're likely a fit for this role if you bring:
a. Deep Data Architecture Experience
b. Experience designing and delivering data architecture for cloud-based platforms, with strong knowledge of AWS (e.g. Glue, Lambda, Step Functions, Lake Formation) and modern tooling (e.g. Snowflake, Databricks, Informatica).
c. Expertise in Canonical Modeling and Semantic Design
d. Proven ability to design scalable, reusable canonical and semantic data models and translate them into physical implementations aligned with business domains and analytic use cases.
e. Hands-On Pipeline Design and Orchestration
f. Hands-on experience architecting and optimizing complex batch and streaming pipelines that are performant, resilient, and traceable, using orchestration frameworks that support real-time and ML/AI-ready processing.
g. Governance and Metadata Awareness
h. Familiarity with data governance frameworks and practices, including stewardship, data quality rules, lineage tracking, access controls, cataloging, and enterprise metadata management.
2) Other Qualifications:
a. Ability to partner with platform and cloud engineering teams to ensure infrastructure and orchestration layers support data reliability, scalability, and SLAs.
b. Strong understanding of data product design principles and the ability to develop modular, reusable data services that support multiple products and delivery pods.
c. Experience contributing to or maintaining an enterprise data catalog and enabling self-service access through secure, well-documented APIs.

Education / Certifications
1) None

Interview Process / Next Steps
1) 1st round with the hiring manager (HM) and a WBD Data Architect
2) 2nd round with Senior Manager

Additional Notes
Sourcing in Burbank, CA.
Hybrid schedule (Tues-Thurs) required.

Employment Type

Full-time
