RESPONSIBILITIES:
- Design, build, and maintain scalable data pipelines and ETL processes.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements.
- Optimize data systems for performance, reliability, and scalability.
- Ensure data quality, governance, and security practices are applied.
- Implement data integration solutions across diverse sources and formats.
- Work with cloud platforms such as AWS, Azure, or GCP (if applicable).
QUALIFICATIONS:
- Experience Required: 5 years
- Proficiency in SQL, Python, or Scala.
- Experience with ETL tools (e.g., Apache NiFi, Talend, Informatica).
- Knowledge of data warehouse solutions (e.g., Snowflake, Redshift, BigQuery).
- Familiarity with big data frameworks (e.g., Spark, Hadoop).
Vertical: Technology