Job Overview:
We are looking for an experienced Senior AWS Data Architect to join our team. In this role you will design, implement, and optimize scalable data architectures on AWS, ensuring robust data pipelines and efficient data storage and retrieval. You will collaborate with cross-functional teams to translate business requirements into technical solutions and contribute to a data-driven decision-making environment.
Key Responsibilities:
- Data Architecture Design: Develop, maintain, and optimize complex data architectures using AWS services such as Redshift, RDS, S3, Glue, and Athena, as well as Snowflake, ensuring high availability, scalability, and performance.
- Data Integration & ETL Pipelines: Design and build ETL workflows using AWS Glue, Lambda, and other tools to ingest data from various sources into a unified data environment.
- Data Governance & Security: Implement data governance best practices, ensuring data quality, security, compliance, and proper access controls within AWS and Snowflake environments.
- Database Management: Oversee the setup, configuration, and maintenance of data stores, including relational databases, NoSQL databases such as DynamoDB, and data warehouses such as Redshift and Snowflake.
- Performance Optimization: Monitor and improve data performance and storage costs, implementing best practices for data partitioning, compression, and caching on AWS and Snowflake.
- Collaboration & Stakeholder Engagement: Work closely with data engineers, analysts, and business stakeholders to understand data needs and provide reliable solutions.
- Documentation & Standards: Document data architectures, data flows, and standards to ensure consistent and repeatable processes across teams.
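To illustrate the ETL pattern named in the responsibilities above (ingest from a source, transform, land the data in a partitioned layout as Glue jobs typically do on S3), here is a minimal sketch in plain Python. The record schema, business rule, and partition scheme are hypothetical; a real pipeline would write Parquet to S3 rather than JSON Lines to local disk:

```python
import json
import tempfile
from pathlib import Path

def extract(raw_rows):
    """Parse raw CSV-like rows into dicts (hypothetical schema: id,date,amount)."""
    for row in raw_rows:
        rec_id, date, amount = row.split(",")
        yield {"id": rec_id, "date": date, "amount": float(amount)}

def transform(records):
    """Apply a simple, hypothetical business rule: drop non-positive amounts."""
    return (r for r in records if r["amount"] > 0)

def load(records, base_dir):
    """Write JSON Lines into Hive-style dt=YYYY-MM-DD partition directories,
    mirroring the S3 partition layout that Glue and Athena expect."""
    written = []
    for rec in records:
        part_dir = Path(base_dir) / f"dt={rec['date']}"
        part_dir.mkdir(parents=True, exist_ok=True)
        out_file = part_dir / "part-0000.jsonl"
        with out_file.open("a") as fh:
            fh.write(json.dumps(rec) + "\n")
        written.append(str(out_file))
    return written

raw = ["1,2024-01-01,10.5", "2,2024-01-01,-3.0", "3,2024-01-02,7.25"]
with tempfile.TemporaryDirectory() as tmp:
    files = load(transform(extract(raw)), tmp)
    print(len(files))  # two records survive the filter
```

Partitioning the output by date in this way is what enables the partition pruning and cost controls mentioned under Performance Optimization.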
Professional Experience:
- Over 10 years of experience designing, planning, implementing, and maintaining large data platforms and applications, both on-premises and in the cloud.
- Experienced as an AWS Cloud Solutions Architect, Big Data Architect, and Senior Software Engineer.
- Skilled in the Agile/Scrum SDLC, delivering specifications, development, testing, operations, and maintenance.
- Strong experience with ETL technologies such as AWS Glue, PySpark, and Informatica.
- Expert in product support, business intelligence (BI) and analytics insights, data mining, modeling, algorithm development, and simulation.
Technical Skills:
- Cloud & Orchestration: AWS (EC2, EBS, S3, IAM, VPC, Route 53, ELB, CloudWatch), ECS, Docker, CloudFormation, Elastic Beanstalk.
- Databases: PostgreSQL, MySQL, SQL Server, DynamoDB; data warehouses: Redshift, Snowflake.
- Languages: Python, PySpark, SQL, R.
- ETL & Big Data: Informatica PowerCenter, Kafka.
- BI Tools: Tableau, Power BI, Splunk.
- Project Tools: JIRA, HP ALM.