Senior DataOps Architect

NEXUS CORPORATION


Job Location:

Tokyo - Japan

Salary: Not Disclosed
Experience Required: 4-5 years
Posted on: 2 days ago
Vacancies: 1 Vacancy

Job Summary

Business Overview:
The Technology Platforms Division (TPD) drives the growth of our ecosystem by delivering innovative, high-quality technology platforms characterized by integrated control and strategic partnerships.

Within TPD, the Cloud Platform Supervisory Department (CPSD) develops and manages our state-of-the-art cloud platform, empowering global scalability and accelerating innovation across its diverse business units.

Department Overview:
The Data Platform Department (DPD) at our Group develops and maintains a comprehensive data platform empowering over 70 services with solutions for data ingestion, discovery, governance, analytics, and querying. We support data-driven decision-making across one of Japan's largest data ecosystems, providing the tools and infrastructure to support key domains such as Data Lakes, Data Warehouses, and Business Intelligence.

Position Details:
As a Senior Architect, you will play a critical role in driving the architectural vision and technical direction of our software platforms. Your expertise and leadership will guide our software engineering efforts, ensuring the delivery of scalable, robust, and innovative solutions.

Responsibilities

- Design and implement scalable data platform solutions across on-premises and GCP hybrid environments, ensuring seamless integration and optimal performance

- Architect DataOps pipelines and infrastructure automation using Infrastructure as Code (Terraform, Ansible) for multi-environment deployments

- Lead the design and maintenance of large-scale data systems spanning on-prem data centers and cloud platforms (GCP primary, Azure secondary)

- Develop and enforce DataOps best practices, including CI/CD for data pipelines, data quality frameworks, and automated testing strategies

- Define cloud migration strategies from on-premises to GCP, including hybrid architecture patterns and data residency considerations

- Build and optimize data infrastructure using Kubernetes, containerization, and orchestration tools for both on-prem and cloud environments

- Create and maintain comprehensive technical and architectural documentation for data platforms and DataOps workflows

- Drive technological innovation by evaluating and introducing modern DataOps tools, platforms, and methodologies aligned with business objectives

- Mentor engineering teams on DataOps principles, cloud-native architectures, and infrastructure automation practices

- Collaborate with data engineering, platform engineering, and security teams to establish governance, compliance, and operational excellence

- Monitor industry trends in DataOps, cloud technologies, and data infrastructure to proactively recommend improvements
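To give candidates a concrete sense of the "data quality frameworks and automated testing" responsibility above, here is a minimal, hypothetical sketch of the kind of check that might gate a data pipeline in CI. All names and thresholds (`check_batch`, `max_null_rate`) are illustrative assumptions, not part of this role's actual stack.

```python
# Hypothetical data-quality gate for a batch of records, as might run in a
# CI/CD stage before a pipeline promotes data downstream. Illustrative only.

def check_batch(rows, required_fields, max_null_rate=0.05):
    """Return a list of error messages; an empty list means the batch passes.

    Every required field must be present on every record, and the share of
    null values per field must stay at or below max_null_rate.
    """
    if not rows:
        return ["batch is empty"]
    errors = []
    for field in required_fields:
        # Records that are missing the field entirely.
        missing = sum(1 for r in rows if field not in r)
        if missing:
            errors.append(f"{missing} record(s) lack field '{field}'")
            continue
        # Records where the field is present but null.
        null_rate = sum(1 for r in rows if r[field] is None) / len(rows)
        if null_rate > max_null_rate:
            errors.append(
                f"field '{field}' null rate {null_rate:.0%} "
                f"exceeds {max_null_rate:.0%}"
            )
    return errors
```

A CI stage could fail the build whenever `check_batch` returns a non-empty list, which is one common way the "automated testing strategies" bullet is realized in practice.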



Requirements

Mandatory Qualifications:

- Bachelor's degree in Computer Science, Data Engineering, or a related field

- 10 years of experience in infrastructure/DevOps/DataOps

- Proven track record architecting and managing on-premises data infrastructure and hybrid cloud environments

- Deep expertise with GCP services (BigQuery, Dataflow, Composer, GKE, Cloud Storage, IAM) for data platforms (5 years)

- Hands-on experience with on-premises infrastructure, including bare-metal virtualization (VMware, KVM), storage systems, and networking (5 years)

- Expert-level proficiency with Kubernetes, Helm, and service mesh technologies (Istio, Linkerd) in production environments (5 years)

- Experience with data pipeline orchestration tools (Apache Airflow, Prefect, Dagster)

- Knowledge of monitoring and observability stacks (Prometheus, Grafana, ELK, Datadog)

- Experience with GitOps workflows and CI/CD platforms (GitLab CI, GitHub Actions, Jenkins)

- Background in data security, compliance frameworks, and disaster recovery planning

- Excellent communication skills, with the ability to translate technical concepts to non-technical stakeholders

- Strong experience with Infrastructure as Code tools: Terraform, Ansible, CloudFormation

- Proficiency with Azure cloud services and hybrid connectivity (ExpressRoute, VPN) is a plus
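The orchestration tools listed above (Apache Airflow, Prefect, Dagster) all share one core idea: tasks form a directed acyclic graph and run only after their upstreams complete. The sketch below illustrates that concept with the standard library's `graphlib`; it is not Airflow's actual API, and the task names are invented for illustration.

```python
# Conceptual sketch of dependency-ordered task execution, the model behind
# DAG orchestrators like Airflow/Prefect/Dagster. Not any tool's real API.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """tasks: name -> zero-argument callable; deps: name -> set of upstreams.

    Executes each task exactly once, after all of its upstream tasks have
    completed, and returns a dict of each task's return value.
    """
    # static_order() yields names so that every upstream precedes its
    # downstream tasks; it raises CycleError if the graph has a cycle.
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name]()
    return results
```

For example, with `deps = {"transform": {"extract"}, "load": {"transform"}}`, the runner always executes extract, then transform, then load, which is exactly the ordering guarantee a real orchestrator provides (plus retries, scheduling, and parallelism that this sketch omits).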





Required Education:

JLPT N1 (Japanese-Language Proficiency Test)
