Job Summary:
- Design and build data platforms and supporting tools using AWS technologies (Lambda, S3, Step Functions).
- Migrate legacy data ingestion pipelines to a shared data management system (BDMS).
- Load datasets into BDMS and create tools for data analysts to modify datasets.
- Collaborate with analysts to validate, cleanse, and enrich data, delivering it to AWS S3 and Delta tables using Databricks, Lambda, Step Functions, and Jupyter Notebooks.
- Work closely with product owners and engineers to develop solutions in an Agile environment.
- Partner with data analysts/scientists to identify data collection efficiencies and automation opportunities.
- Develop infrastructure as code (IaC) and apply DevOps practices.
- Implement microservices using AWS serverless technologies.
- Utilize programming skills in Python or Java (minimum 4 years' experience).
- Apply knowledge of database systems and SQL (Oracle, PostgreSQL).
- Leverage AWS experience (S3, Lambda, CloudWatch) for cloud-based solutions.
- Preferably have experience with data migrations, Spark, Databricks, and IaC tools (Terraform/CloudFormation).
- Hold a degree in Computer Science, Engineering, or Mathematics, or have equivalent work experience.