As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Full understanding of AWS resilience patterns and AWS Backup, and how to manage incident response using break-glass approaches.
- Detailed knowledge of Lambda, including runtimes, packaging, storage, concurrency, retries, and dead-letter queues.
- Understanding of VPCs: route tables, internet gateways (IGW), NAT, network ACLs, VPC endpoints, Transit Gateway, and DNS.
- Hands-on experience with the AWS CLI.
- Working knowledge and troubleshooting experience with Iceberg, Glue, Athena, Redshift, DynamoDB, and RDS (at least 3 or 4 of these services).
- Ability to perform deep dives using CloudWatch and X-Ray for troubleshooting (covering logs, metrics, alerts, traces, filters, agents, etc.).
- Thorough understanding of batch and streaming data pipelines involving Kafka, Flink, and EMR.
- Understanding of how identity and authentication work on AWS, using IAM, SSO, STS/trust policies, resource policies, etc.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in AWS Architecture.
- Strong understanding of data modeling and database design principles.
- Experience with data warehousing solutions and big data technologies.
- Familiarity with data integration tools and ETL processes.
- Knowledge of cloud computing concepts and services.