REQUIREMENTS:
- Total experience of 5 years.
- Excellent knowledge of and experience in big data engineering.
- Strong experience with Apache Spark and Python for large-scale data processing.
- Solid knowledge of Hadoop, MapReduce, Hive, and the broader big data ecosystem.
- Hands-on experience with GCP services, especially Pub/Sub.
- Proficient in working with relational databases (PostgreSQL) and NoSQL systems (MongoDB, Kafka).
- Experience writing and optimizing SQL-like queries (SQL, MQL, HQL).
- Expertise in building scalable efficient data pipelines.
- Experience with version control systems (especially Git) and CI/CD tools.
- Solid understanding of distributed computing, parallel processing, and big data best practices.
- Strong problem-solving and debugging skills.
- Experience working in Agile/Scrum environments.
- Familiarity with data modeling, data warehousing, and building distributed systems.
- Expertise in Spanner for highly available, scalable database solutions.
- Knowledge of data governance and security practices in cloud-based environments.
- Problem-solving mindset with the ability to tackle complex data engineering challenges.
- Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
RESPONSIBILITIES:
- Writing and reviewing high-quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
- Mapping decisions to requirements and translating them clearly for developers.
- Identifying alternative solutions and narrowing them down to the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for developers.
- Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks needed to realize it.
- Understanding technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
- Carrying out POCs to ensure that the suggested design and technologies meet the requirements.
QUALIFICATIONS:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
REMOTE WORK:
No
EMPLOYMENT TYPE:
Full-time