At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
Your role will be Technology Lead or Senior Technology Lead in the Cloud Engineering team. You will be part of the delivery of IT projects for our customers across the globe.
Your key responsibilities
- Architect, implement and manage CI/CD pipelines using Azure DevOps and AWS tools for data and analytics workloads.
- Build and maintain CI/CD pipelines across on-premises and multi-cloud platforms (Azure, AWS, GCP), ensuring consistent delivery practices.
- Orchestrate end-to-end DevOps workflows using tools like ArgoCD, Azure DevOps, Harness, GitHub Actions or GitLab CI.
- Maintain and optimize source control systems (Git, SVN), enforcing branching strategies and code quality standards.
- Integrate DevSecOps practices, including automated testing, code quality and vulnerability scanning (SonarQube, Checkmarx, Veracode, Fortify, etc.), within CI/CD pipelines.
- Build and maintain data pipelines leveraging Azure Data Lake, Databricks and AWS data services (e.g. S3, Glue, Redshift).
- Support migration and modernization of legacy data platforms to Azure and AWS.
- Collaborate with data engineers, analysts and business stakeholders to deliver end-to-end data solutions.
- Automate infrastructure provisioning using Terraform, ARM Templates, CloudFormation or Bicep.
- Automate Kubernetes-based deployments (AKS/EKS/GKE) using Helm charts, and manage service mesh and traffic routing with Istio for enhanced observability and resilience.
- Develop automation scripts using Python, PowerShell, Shell/Bash and Groovy, and leverage cloud-native CLI tools (Azure CLI, AWS CLI) for operational tasks (a minimal illustration follows this list).
- Manage configuration and orchestration using Ansible, Chef or Puppet.
- Innovate in building independent automation solutions.
- Implement data governance, security and compliance best practices across Azure and AWS environments.
- Monitor, troubleshoot and optimize data pipelines and platform performance using Azure Monitor, AWS CloudWatch, Log Analytics and related tools.
- Demonstrate working knowledge of cloud-native services across Azure, AWS and GCP, including PaaS, SaaS and IaaS offerings.
- Ensure cloud security best practices are followed, with a strong understanding of identity and access management and network security in cloud environments.
- Demonstrate a comprehensive understanding of how IT operations are managed.
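As an illustration of the operational automation described above, the following is a minimal sketch of a Python script that checks the latest run of an AWS Glue job with boto3; the job name and region are placeholders introduced here for illustration, not details of this role.

```python
# Minimal sketch: polling the latest AWS Glue job run with boto3.
# "example-etl-job" and the region are placeholders.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

def latest_run_state(job_name: str) -> str:
    """Return the state of the most recent run of a Glue job."""
    runs = glue.get_job_runs(JobName=job_name, MaxResults=1)
    if not runs["JobRuns"]:
        return "NO_RUNS"
    return runs["JobRuns"][0]["JobRunState"]  # e.g. STARTING, RUNNING, SUCCEEDED, FAILED

if __name__ == "__main__":
    print(latest_run_state("example-etl-job"))
```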
Skills and attributes for success
- Strong hands-on experience with Azure DevOps, GitHub Actions, Jenkins and AWS native tools for CI/CD automation, release management and environment provisioning.
- Deep expertise in Azure Data Lake, Databricks and AWS data services (e.g. S3, Glue, Redshift).
- Proficiency in scripting languages: Python, PowerShell, Shell.
- Experience with infrastructure automation tools: Terraform, ARM Templates, CloudFormation, Bicep.
- Knowledge of data governance, security and compliance in cloud environments.
- Familiarity with monitoring and observability tools: Azure Monitor, AWS CloudWatch, Log Analytics.
- Ability to design scalable, secure and cost-effective data architectures.
- Strong understanding of cloud-native services across Azure, AWS and GCP, including PaaS, SaaS and IaaS offerings.
- Demonstrated experience in designing, building and implementing DevOps solutions for projects of varying complexity, with emphasis on Kubernetes (AKS/EKS/GKE) and automation (see the sketch after this list).
- Capability to identify, communicate and mitigate project risks.
- Ability to create sustainable systems and services through automation and continuous improvement.
- Experience implementing DevSecOps practices, including automated testing, code quality and vulnerability scanning (e.g. SonarQube, Checkmarx, Veracode, Fortify).
- Strong understanding of agile methodologies.
- Ability to deliver best practices around provisioning, operations and management of multi-cloud environments.
- Excellent communication, analytical and problem-solving skills.
- Ability to work collaboratively in cross-functional teams, manage communication and deliverables from offshore teams, and mentor others.
- Capability to assist the team in debugging and troubleshooting imperative and declarative scripts.
- Experience identifying software packages and solutions to meet client requirements, developing RFPs and assisting in proposal evaluation (business and technology fit, pricing and support).
- Experience designing and developing AI-infused DevOps frameworks is a plus.
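By way of illustration for the Kubernetes and automation emphasis above, here is a minimal, hypothetical sketch using the official Kubernetes Python client to check whether a Deployment rollout on AKS/EKS/GKE has completed; the deployment name and namespace are placeholders.

```python
# Hypothetical sketch: verifying a Deployment rollout with the Kubernetes Python client.
# Targets whatever cluster the active kubeconfig context points at (AKS, EKS or GKE).
from kubernetes import client, config

def rollout_ready(name: str, namespace: str = "default") -> bool:
    """Return True when all desired replicas of the Deployment are available."""
    config.load_kube_config()          # reads the current kubectl context
    apps = client.AppsV1Api()
    dep = apps.read_namespaced_deployment(name=name, namespace=namespace)
    desired = dep.spec.replicas or 0
    available = dep.status.available_replicas or 0
    return available >= desired

if __name__ == "__main__":
    print(rollout_ready("example-api"))  # placeholder deployment name
```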
To qualify for the role, you must have
- BE/ with sound industry experience of 6 to 8 years
- Strong knowledge of cloud computing in multi-cloud environments, with Azure as the primary platform and exposure to AWS.
- DevOps: experience setting up CI/CD pipelines using Azure DevOps, GitHub Actions or GitLab CI
- Hands-on experience with Azure Data Lake, Databricks (including Spark and Delta Lake), and familiarity with AWS data services (a Databricks sketch follows this list).
- Practical experience with Docker and Kubernetes (AKS/EKS/GKE).
- Proficiency in PowerShell, Python, Groovy and Shell scripting, and cloud-native CLI tools.
- Proficiency with IaC tools such as Terraform, ARM Templates or Bicep.
- Understanding of data governance, security and compliance in cloud environments.
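As a small illustration of the Databricks and Delta Lake experience listed above, the following sketch shows a notebook cell that lands raw JSON from a data lake path into a Delta table. It assumes a Databricks session where `spark` is predefined; the storage path and table name are placeholders.

```python
# Minimal sketch (Databricks notebook cell): raw JSON -> managed Delta table.
# The ADLS path and table name below are placeholders.
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"

df = (spark.read                      # `spark` is predefined in Databricks notebooks
          .format("json")
          .load(raw_path))

(df.write
   .format("delta")                   # Delta Lake storage format
   .mode("append")
   .saveAsTable("bronze.orders"))     # managed Delta table in the metastore
```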
Preferred Skills
- Microsoft Certified: Azure DevOps Engineer Expert (AZ-400)
- Microsoft Certified: Azure Data Engineer Associate (DP-203)
- Microsoft Certified: Azure Solutions Architect Expert (AZ-305)
- AWS Certified DevOps Engineer Professional
- AWS Certified Solutions Architect Associate
- Databricks Certified Data Engineer Associate/Professional
- Experience with GitHub Actions, GitLab CI or other modern CI/CD tools
- Experience with configuration management tools such as Ansible, Chef or Puppet
- Experience with container orchestration and management (Kubernetes, AKS, EKS)
- Familiarity with monitoring and observability tools (Azure Monitor, AWS CloudWatch, Prometheus, Grafana); a brief sketch follows this list
- Exposure to GenAI technologies and integration with data platforms
- Experience working in agile and cross-functional teams
- Strong documentation, presentation and stakeholder communication skills
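For the monitoring and observability familiarity noted above, here is a small, hypothetical sketch of publishing a custom data-pipeline metric to AWS CloudWatch with boto3; the namespace, metric name and dimension values are illustrative placeholders.

```python
# Hypothetical sketch: pushing a custom pipeline metric to AWS CloudWatch.
# Namespace, metric name and dimension values are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def publish_rows_processed(count: int, pipeline: str) -> None:
    """Record how many rows a data pipeline run processed."""
    cloudwatch.put_metric_data(
        Namespace="ExampleDataPlatform",
        MetricData=[{
            "MetricName": "RowsProcessed",
            "Dimensions": [{"Name": "Pipeline", "Value": pipeline}],
            "Value": float(count),
            "Unit": "Count",
        }],
    )

if __name__ == "__main__":
    publish_rows_processed(12345, "orders-ingest")
```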
Your people responsibilities
Foster teamwork and lead by example
Participate in organization-wide people initiatives
Ability to travel in accordance with client and other job requirements
Excellent written and oral communication skills; writing, publishing and conference-level presentation skills are a plus
Technologies and Tools
- Cloud Platforms: Azure/AWS
- SDLC Methodologies: Agile/Scrum
- Version Control Tools: GitHub/Bitbucket/GitLab
- CI/CD Automation Tools: Azure DevOps/GitHub Actions/AWS CodePipeline/GitLab CI/Jenkins/Harness/ArgoCD
- Data Platforms: Azure Data Lake, Databricks, Microsoft Fabric, AWS S3, Glue, Redshift
- Container Management Tools: Docker/Kubernetes/Docker Swarm
- Application Performance Management Tools: Prometheus/Dynatrace/AppDynamics
- Monitoring Tools: Splunk/Datadog/Grafana
- IaC Tools: Terraform/ARM Templates/Bicep
- Artifact Management Tools: JFrog Artifactory/Nexus/CloudRepo/Azure Artifacts
- Scripting: Python/Groovy/PowerShell/Shell scripting
- SAST/DAST: SonarQube/Veracode/Fortify
- GitOps Tools: Argo CD/Flux CD
- GenAI Technologies: ChatGPT/OpenAI
What we look for
- Demonstrated experience in building and automating data platforms using Azure, AWS and Databricks.
- Proven track record of implementing CI/CD for data workloads across multiple technologies, with strong use of DevOps tools and containerization.
- Strong understanding of Microsoft Fabric, AWS data services and modern data architectures.
- Experience with infrastructure automation (IaC tools such as Terraform, ARM Templates, CloudFormation, Bicep) and application automation using Azure DevOps and AWS DevOps tools.
- Working experience in Azure and AWS, with a solid grasp of cloud architecture, strategy and cloud-related concepts.
- Good exposure to cloud and container monitoring, logging and troubleshooting (Azure Monitor, AWS CloudWatch, etc.).
- Ability to design and conduct experiments with new technologies and approaches.
- Ability to work collaboratively in cross-functional teams and mentor others.
- Excellent communication, analytical and problem-solving skills.
What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society, and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.