Company Overview:
Niyati Tech is a leading player in the Information Technology & Services industry, specializing in innovative solutions for enterprises.
Role and Responsibilities:
The Data Platform Engineer at Niyati Tech will play a crucial role in driving insights from data, accelerating machine learning at scale, and building innovative AI workflows. The responsibilities include building highly scalable and secure data infrastructure, developing transformation systems for various data stores, and building tools and applications to streamline data management and access. The role also involves reviewing and influencing design and architecture with stability, maintainability, and scale in mind; identifying patterns and providing solutions to a class of problems; and handling dependencies with minimal oversight.
Candidate Qualifications:
The ideal candidate should have a good understanding of distributed systems, scalability, and availability, along with at least 2 years of experience in Spark/Scala/Java, 1 year of experience in Kafka and Flink/Spark Streaming, 2 years of experience in Airflow, and experience in building large-scale ETL pipelines. Strong database and storage fundamentals, experience with cloud deployments, and basic working knowledge of Kubernetes are also required.
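The ETL-pipeline experience above refers to the extract-transform-load pattern. A minimal, framework-free Python sketch of that pattern follows; in practice a candidate would implement it with Spark and orchestrate it with Airflow, and the sample records, field names, and SQLite target here are purely illustrative, not part of the posting:

```python
import json
import sqlite3

# Illustrative raw input; a production pipeline would extract records
# from Kafka, object storage, or an upstream database instead.
RAW_EVENTS = [
    '{"user": "a1", "amount": "19.99", "country": "IN"}',
    '{"user": "b2", "amount": "5.00", "country": "US"}',
    '{"user": "a1", "amount": "not-a-number", "country": "IN"}',  # malformed
]

def extract(lines):
    """Parse raw JSON lines, skipping records that fail to parse."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue

def transform(events):
    """Cast fields to typed values, dropping records that fail validation."""
    for e in events:
        try:
            yield (e["user"], float(e["amount"]), e["country"])
        except (KeyError, ValueError):
            continue

def load(rows, conn):
    """Write cleaned rows into the target store (in-memory SQLite here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS purchases (user TEXT, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO purchases VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_EVENTS)), conn)
count = conn.execute("SELECT COUNT(*) FROM purchases").fetchone()[0]
print(count)  # the malformed record is dropped during transform
```

The same three-stage shape scales up directly: swap the generators for Spark DataFrame transformations and schedule each stage as an Airflow task.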
Required Skills:
- Airflow
- PySpark
- Spark/Scala/Java
Keywords: Kubernetes, Redis, GCP, data streaming, data analysis, Java, Amazon DynamoDB, PySpark, machine learning, Airflow, NoSQL, generative AI, LLMs, Kafka, AWS, Flink, ETL, Spark SQL, Scala, distributed systems