REQUIREMENTS:
- Total experience: 5 years.
- Hands-on experience in Data Engineering, Data Lakes, Data Mesh, or Data Warehousing/ETL environments.
- Strong working knowledge of Python, SQL, Airflow, and PySpark.
- Hands-on experience implementing projects applying SDLC practices.
- Hands-on experience building data pipelines and data frameworks for unit testing, data lineage tracking, and automation.
- Experience building and maintaining cloud systems.
- Familiarity with databases like DB2 and Teradata.
- Strong working knowledge of Apache Spark, Apache Kafka, Hadoop, and MapReduce.
- Strong troubleshooting skills and ability to design for scalability and flexibility.
- Expertise in Spanner for high-availability, scalable database solutions.
- Knowledge of data governance and security practices in cloud-based environments.
- Problem-solving mindset with the ability to tackle complex data engineering challenges.
- Familiar with containerization technologies (Docker/Kubernetes).
- Excellent communication and collaboration skills.
RESPONSIBILITIES:
- Writing and reviewing high-quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions to requirements and translating them for developers.
- Identifying different solutions and narrowing down to the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR (non-functional requirement) considerations during project implementation.
- Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding technology integration scenarios and applying these lessons in projects.
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
- Carrying out POCs to verify that the suggested design/technologies meet the requirements.
Qualifications:
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
Remote Work:
No
Employment Type:
Full-time