REQUIREMENTS:
- 7 years of total experience.
- Strong working experience in data engineering.
- Strong experience in building scalable ETL/ELT data pipelines.
- Hands-on experience with Big Data technologies such as Hadoop, MapReduce, and Spark (including tuning & optimization).
- Hands-on with Python, PySpark, Azure (ADF, ADLS Gen2, Synapse Analytics), and Kafka for streaming data.
- Strong knowledge of cloud computing (preferably AWS): EC2, S3, RDS, Redshift, Glue, EMR.
- Strong experience with Databricks, Delta Lake, and Databricks SQL.
- Hands-on experience in data modeling (Star Schema, Snowflake Schema, Normalization/Denormalization) and Data Governance & Security (Lineage, Encryption, Access Control).
- Experience with SQL and MySQL
- Experience with Data Quality frameworks, and proficiency in building and maintaining ETL/ELT pipelines using tools such as Apache Airflow, dbt, or similar.
- Familiarity with data visualization tools and strategies (e.g., Power BI, Tableau) is a plus.
- Knowledge of data governance, security practices, and compliance standards such as GDPR and CCPA.
- Excellent problem-solving, communication, and collaboration skills.
RESPONSIBILITIES:
- Writing and reviewing high-quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
- Mapping decisions to requirements and translating them for developers.
- Identifying alternative solutions and narrowing them down to the option that best meets the client's requirements.
- Defining guidelines and benchmarks for NFR (non-functional requirement) considerations during project implementation.
- Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Designing and developing the overall solution for the defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and justifying the decisions taken.
- Carrying out POCs to verify that the suggested designs/technologies meet the requirements.
QUALIFICATIONS:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Remote Work:
Yes
Employment Type:
Full-time