Senior Python Azure Developer with PySpark (AWS to Azure Migration Expert)

Job Location

Hyderabad - India

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Job Title: Senior Python Azure Developer (AWS to Azure Migration Expert)

Experience: 7-10 Yrs.

Primary Skills:

Python and PySpark (the Python API for Apache Spark)

Hands-on experience with Azure Serverless (Azure Functions)

AWS to Azure Cloud Migration (Preferred)

Nice to have: Experience in Databricks, ADF Mapping Data Flows, and Synapse Spark



Scope of Work:

Hands-on experience in migrating Python applications from an AWS to an Azure environment

Experience in PySpark (the Python API for Apache Spark); a brief sketch follows this list

Experience in Databricks, ADF Mapping Data Flows, and Synapse Spark

Analyse the source architecture, source code, and AWS service dependencies to identify code remediation scenarios.

Perform the code remediation/refactoring and configuration changes required to deploy the application on Azure, including remediation of Azure service dependencies and other application dependencies at the source-code level.

8 years of experience in application development with Python

Experience in unit testing, application testing support, and troubleshooting on Azure.

Experience in application deployment scripts/pipelines: App Service, APIM, AKS/microservices/containerized apps, Kubernetes, Helm charts.

Hands-on experience in developing apps for AWS and Azure (Must Have)

Hands-on experience with Azure services for application development (AKS, Azure Functions) and deployments.

Understanding of the Azure infrastructure services required for hosting applications on Azure PaaS or serverless.
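
To make the PySpark and Azure storage expectations above concrete, here is a minimal, hypothetical sketch of a Spark job after migration: data that previously sat in S3 is read from Azure Data Lake / Blob Storage instead, using the Spark Python API as it would run on Databricks or Synapse Spark. The storage account, container, path, and column names are placeholders, not details from this posting.

    # Hypothetical PySpark job after migration: the input path changed from
    # something like "s3a://example-bucket/events/" to an abfss:// path on Azure.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("aws-to-azure-migration-sketch").getOrCreate()

    # Read the migrated dataset from Azure storage (placeholder account/container).
    events = spark.read.parquet(
        "abfss://events@examplestorageacct.dfs.core.windows.net/raw/"
    )

    # Simple transformation: daily event counts (column names are placeholders).
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("event_date")
        .count()
    )

    # Write the curated output back to Azure storage.
    daily_counts.write.mode("overwrite").parquet(
        "abfss://events@examplestorageacct.dfs.core.windows.net/curated/daily_counts/"
    )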

Tech stack details:

Confluent Kafka AWS S3 Sink Connector

Azure Blob Storage

AWS Lambda to Azure Functions (serverless), Python - see the conversion sketch after this list



S3 to Azure Blob Storage

AWS to Azure SDK Conversion (Must Have)
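
As an illustration of the "AWS Lambda to Azure Functions" and "AWS to Azure SDK Conversion" items above, the following sketch shows the shape of such a change: a boto3-based Lambda handler and a hypothetical Azure Functions (Python v2 programming model) equivalent that reads the same data from Blob Storage with the Azure SDK. Function, bucket, container, and blob names are invented for illustration only.

    # --- Before (AWS, hypothetical handler.py): Lambda reading an object from S3 ---
    import json
    import boto3

    def lambda_handler(event, context):
        s3 = boto3.client("s3")
        obj = s3.get_object(Bucket="example-bucket", Key="data/input.json")
        payload = json.loads(obj["Body"].read())
        return {"statusCode": 200, "body": json.dumps({"records": len(payload)})}

    # --- After (Azure, hypothetical function_app.py): HTTP-triggered Azure Function ---
    import json
    import os

    import azure.functions as func
    from azure.storage.blob import BlobServiceClient

    app = func.FunctionApp()

    @app.route(route="process", auth_level=func.AuthLevel.FUNCTION)
    def process(req: func.HttpRequest) -> func.HttpResponse:
        # The boto3 S3 call is replaced with the Azure Blob Storage SDK.
        service = BlobServiceClient.from_connection_string(os.environ["AzureWebJobsStorage"])
        blob = service.get_blob_client(container="example-container", blob="data/input.json")
        payload = json.loads(blob.download_blob().readall())
        return func.HttpResponse(
            json.dumps({"records": len(payload)}),
            mimetype="application/json",
        )

The actual migration may keep a different trigger (for example, a Blob or Event Hub trigger instead of HTTP); the sketch only illustrates the handler shape and SDK conversion.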



Qualifications
Educational qualification:

B.E/

Experience:

7-10 years

Mandatory/Required Skills:

Hands-on experience in migrating Python applications from an AWS to an Azure environment

8 years of experience in application development with Python

Experience in PySpark (the Python API for Apache Spark)

Experience in Databricks, ADF Mapping Data Flows, and Synapse Spark

Experience in unit testing, application testing support, and troubleshooting on Azure.

Experience in application deployment scripts/pipelines: App Service, APIM, AKS/microservices/containerized apps, Kubernetes, Helm charts.

Hands-on experience in developing apps for AWS and Azure (Must Have)

Hands-on experience with Azure services for application development (AKS, Azure Functions) and deployments.

Understanding of the Azure infrastructure services required for hosting applications on Azure PaaS or serverless.

Analyse the source architecture, source code, and AWS service dependencies to identify code remediation scenarios.

Perform the code remediation/refactoring and configuration changes required to deploy the application on Azure, including remediation of Azure service dependencies and other application dependencies at the source-code level.

Tech stack details:

Confluent Kafka AWS S3 Sink Connector

Azure Blob Storage

AWS Lambda to Azure Functions (serverless), Python



S3 to Azure Blob Storage

AWS to Azure SDK Conversion (Must Have)

Remote Work:

No

Employment Type

Full Time

