Roles & Responsibilities
Participating in requirement-gathering sessions.
Meeting with the client's business analysts to understand requirements.
Collaborating with solution architects and design teams to implement data pipeline migrations.
Deploying data pipelines on Google Cloud Platform (GCP) using custom Java scripts and BigQuery for big data workloads.
Configuring and executing project test scripts using the TestNG framework (a minimal sketch follows this list).
Using DevSecOps tools available in Jenkins and on Google Cloud Platform.
Participating in Scrum meetings and sprint grooming sessions.
Identifying areas for process improvement.
Performing proofs of concept to evaluate frameworks and technologies for technical suitability.
Participating in the ongoing migration roadmap: reviewing, verifying, and validating the software code developed in the project, troubleshooting issues, and fixing code bugs.
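A minimal TestNG sketch of the kind of validation test referenced in the list above, assuming a simple row-count check on a migrated table; the class name and the hard-coded counts are hypothetical placeholders (in practice the counts would be fetched from the source Hive table and the target BigQuery table):

import org.testng.Assert;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class RowCountValidationTest {

    private long sourceRowCount;
    private long targetRowCount;

    @BeforeClass
    public void setUp() {
        // Hypothetical values; a real suite would query Hive and BigQuery here.
        sourceRowCount = 1_000L;
        targetRowCount = 1_000L;
    }

    @Test(description = "Source and target row counts must match after migration")
    public void rowCountsMatch() {
        Assert.assertEquals(targetRowCount, sourceRowCount,
                "Row count mismatch between source and migrated target table");
    }
}

Tests like this are typically wired into the Jenkins or GitHub Actions pipeline mentioned above so they run on every build.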
Experience Required
Minimum 4 years of experience implementing data migration programs from on-prem Hadoop with Java/Spark to GCP BigQuery and Dataproc (a minimal Java Spark sketch follows at the end of this section).
Minimum 4 years of experience integrating GitHub Actions plugins into a CI/CD platform to ensure software quality and security.
4 years of experience with Google Cloud Platform (GCP) tools, mainly BigQuery, Dataproc, Cloud Composer, Cloud Storage, etc.
Minimum 4 years of experience in cloud deployments.
Configuring scripts in the Cloud Console / Jenkins for automated execution.
Prior work experience in the BFSI and data warehousing functional domains is an added advantage.
Hands-on experience creating pipelines with Scala scripts from scratch and troubleshooting cloud configuration issues.
Should have worked with Git and continuous integration environments such as GitHub Actions and Jenkins.
Should have experience with SQL and NoSQL databases.
Should have experience working in teams following Agile or XP methodologies.
Should have BFSI domain experience to understand the business requirements.
Minimum 2 years of senior-level programming experience involving some architecture and high-level design.
Understanding of distributed systems and related concepts is required.
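As referenced in the first requirement above, here is a minimal sketch of one migration step written as a Java Spark job for Dataproc, assuming the cluster has the spark-bigquery connector available; the Hive table, BigQuery table, and staging bucket names are hypothetical placeholders:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveToBigQueryJob {
    public static void main(String[] args) {
        // Spark session with Hive support so the existing Hive metastore is visible.
        SparkSession spark = SparkSession.builder()
                .appName("hive-to-bigquery-migration")
                .enableHiveSupport()
                .getOrCreate();

        // Read the source Hive table (hypothetical name).
        Dataset<Row> source = spark.sql("SELECT * FROM sales_db.transactions");

        // Write to BigQuery through the spark-bigquery connector; the staging
        // bucket backs the connector's indirect write path.
        source.write()
                .format("bigquery")
                .option("temporaryGcsBucket", "example-staging-bucket")
                .mode("overwrite")
                .save("example_project.analytics.transactions");

        spark.stop();
    }
}

A job like this would typically be submitted with gcloud dataproc jobs submit spark and orchestrated through Cloud Composer.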
Technical/Functional Skills
MUST HAVE: Java, Spark, BigQuery, Dataproc
NICE TO HAVE: Event Engine, Cloud Composer, shell scripts, Hadoop, Hive, Sqoop, PySpark, Scheduler
Interested candidates, please WhatsApp me your updated CV () or email me at .