Roles & Responsibilities
- Analyze data mapping documents and business requirements to design comprehensive test plans and cases.
- Perform source-to-target data reconciliation, check data loading, and ensure transformation rules are applied correctly.
- Write complex SQL scripts for validation (record counts, data completeness, data consistency, data truncation).
- Identify, log, and track data defects using tools like JIRA, HP ALM, or Octane.
- Automate test scripts and validate data volume, performance, and scalability.
- Expert-level knowledge of SQL for data analysis.
- Experience with tools such as Informatica and IDMC.
- Understanding of data warehouse concepts and architectures (e.g. star/snowflake schema).
- Familiarity with Hadoop or Spark is often preferred.
- Validate HiveQL queries, HDFS file structures, and data processing within the Hadoop cluster.
- Strong analytical and troubleshooting skills.
- Excellent communication for collaborating with developers and stakeholders.
- Knowledge of metadata-driven ETL processes and batch/job frameworks.
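The source-to-target reconciliation and SQL validation responsibilities above can be sketched as a minimal example. The table names, columns, and sample rows (`src_customers`, `tgt_customers`) are hypothetical, and SQLite stands in here for the actual warehouse database (Oracle, Hive, etc.):

```python
import sqlite3

# Hypothetical source and target tables; in practice these would live in
# separate systems and the checks would come from the mapping document.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customers (id INTEGER, name TEXT, balance REAL);
    CREATE TABLE tgt_customers (id INTEGER, name TEXT, balance REAL);
    INSERT INTO src_customers VALUES
        (1, 'Alice', 100.0), (2, 'Bob', 250.5), (3, 'Carol', 75.0);
    INSERT INTO tgt_customers VALUES
        (1, 'Alice', 100.0), (2, 'Bob', 250.5);
""")

# Count reconciliation: source row count vs. target row count.
src_count = cur.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customers").fetchone()[0]
print(f"count difference (source - target): {src_count - tgt_count}")

# Completeness check: rows present in source but missing from target,
# expressed as a SQL set difference (EXCEPT / MINUS in Oracle).
missing = cur.execute("""
    SELECT id, name, balance FROM src_customers
    EXCEPT
    SELECT id, name, balance FROM tgt_customers
""").fetchall()
print(f"rows missing from target: {missing}")
```

In a real engagement the same EXCEPT/MINUS pattern, plus per-column checksums, is typically parameterized per mapping rather than hand-written per table.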
Tools & Skills:
- IDMC, Informatica PowerCenter, Hive, Kafka, SQL, Oracle PL/SQL.
- Domain: Banking and Payments knowledge preferred.
- Concepts: Data Warehousing, Data Transformation, ETL/ELT, Data Quality.