Primary responsibilities of the role
Understand business requirements from the assigned PBI (Product Backlog Item) and obtain clarifications where needed
Break the work down into smaller, simpler code units
Develop code using AWS services such as EC2, IAM, KMS keys, Lambda, Batch, Terraform/CloudFormation (CFT), EventBridge, Managed Kafka (MSK), Kinesis, Glue, and PySpark
Ensure the software systems are scalable, reliable, and efficient.
Protect data and applications by implementing robust security measures.
Create and execute unit test cases.
Promote code to higher environments via CI/CD releases; support QA and UAT testing and fix reported bugs.
Production deployment and post-deployment support.
Production issue support: troubleshooting and optimizing for performance and efficiency.
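The development responsibilities above center on building ETL pipelines with Glue and PySpark. As a minimal illustration of the extract-transform-load shape of that work (plain Python standing in for PySpark/Glue, with hypothetical field names such as `order_id`):

```python
import csv
import io

def extract(csv_text):
    """Extract: parse raw CSV rows into dicts (stand-in for reading from S3/Kinesis)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: drop invalid records and normalize types."""
    out = []
    for row in rows:
        if not row.get("order_id"):
            continue  # basic data-quality rule: skip records missing the key
        out.append({
            "order_id": row["order_id"],
            "amount": round(float(row["amount"]), 2),
        })
    return out

def load(rows, sink):
    """Load: append transformed rows to a sink (stand-in for a Glue/Redshift target)."""
    sink.extend(rows)
    return len(rows)

raw = "order_id,amount\nA1,10.5\n,3.0\nA2,7.25\n"
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded)   # count of rows that passed validation and were loaded
```

In a real Glue job the same three stages would map to reading a DynamicFrame from a source, applying transforms, and writing to a target, with unit tests (as the responsibilities list requires) asserting on the transform step in isolation.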
Minimum Requirements:
3-5 years of AWS ETL development experience.
Must have experience with AWS services: EC2, IAM, KMS keys, Lambda, Batch, Terraform/CloudFormation (CFT), EventBridge, Managed Kafka (MSK), Kinesis, Glue, and PySpark.
Understanding of data modelling concepts.
Knowledge of Python and other programming languages.
Knowledge of different SQL/NoSQL data storage techniques.
SQL competence (query performance tuning, index management, etc.) and a grasp of database structure are required.
Experience working in Agile methodology and its ceremonies.
Passionate about sophisticated data structures and problem solving.
Capability to analyze and troubleshoot complicated data sets.
Design, automate, and support sophisticated data extraction, transformation, and loading (ETL) applications.
Analytical abilities
Good oral and written communication skills.
Experience with data warehousing and business intelligence (BI)
Knowledge of industry technology trends.
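The SQL competence requirement above calls out query performance tuning and index management. A small self-contained sketch (using Python's bundled SQLite as a stand-in for the production SQL store, with a hypothetical `orders` table) shows how an index changes the query plan from a full table scan to an index search:

```python
import sqlite3

# In-memory database as a stand-in for the production SQL store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("alice", 10.0), ("bob", 20.0), ("alice", 5.0)],
)

# Without an index on `customer`, the planner must scan the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'alice'"
).fetchone()[3]

# After adding an index, the planner can search it instead of scanning.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'alice'"
).fetchone()[3]

print(plan_before)  # plan text mentions a table SCAN
print(plan_after)   # plan text mentions idx_orders_customer
```

The same discipline applies on larger engines (e.g. `EXPLAIN` in PostgreSQL or Redshift), where verifying the plan before and after adding an index is a routine tuning step.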
Full Time