Requirement:
- 7.5 years of total experience in data operations, ETL development, and technical solutioning.
- Proven experience in leading large-scale data initiatives and RFP responses.
- Strong hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS, DataStage) and data pipeline orchestration tools (e.g., Apache Airflow, Azure Data Factory).
- Exposure to multiple technology stacks, including cloud platforms (AWS, Azure, GCP), databases (SQL, NoSQL), and big data ecosystems (Hadoop, Spark).
- Experience with monitoring tools (e.g., Splunk, Dynatrace, Prometheus, Grafana) and ITSM platforms (e.g., ServiceNow, JIRA).
- Experience with CI/CD, DevOps practices, and monitoring tools for data environments.
- Proven track record of process automation and performance tuning in complex data landscapes.
- Excellent communication and stakeholder management skills.
Responsibility:
- Lead the technical and business solutioning of RFPs and client proposals related to data engineering, data operations, and platform modernization.
- Collaborate with architecture, governance, and business teams to align on technical strategy and data management standards.
- Continuously evaluate emerging technologies and frameworks to modernize legacy systems and improve efficiency.
- Be ready to lead delivery whenever required.
- Mentor and guide a team of data engineers and operations analysts, fostering a culture of technical excellence and continuous improvement.
- Act as a primary liaison with business, support, and governance teams for operational matters.
- Contribute to maintaining compliance with security, audit, and regulatory standards in operations.
- Drive automation and self-healing mechanisms to reduce manual interventions and improve system resilience.
- Drive process automation across data operations using scripts, tools, and workflow orchestration.
- Implement and enforce best practices, including metadata management, lineage tracking, data quality monitoring, and master data management.
Qualifications:
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
Remote Work:
No
Employment Type:
Full-time