As a Data Engineer, you will play a key role in shaping data strategies, drawing on extensive experience with languages such as SQL, Python, and Java. You will work with diverse databases, implement big data technologies, manage ETL processes, and contribute to the evolution of data warehousing concepts on cloud platforms such as AWS, Azure, or Google Cloud.
Responsibilities:
- Develop and implement a comprehensive data quality framework, including standards, processes, and procedures.
- Design and execute data cleansing processes to rectify data quality issues and improve overall data accuracy.
- Implement robust monitoring mechanisms to continuously assess and report on data quality metrics.
- Collaborate with cross-functional teams, including data engineers, analysts, and business stakeholders, to understand data requirements and quality expectations.
- Define and track key data quality metrics, providing regular reports and insights to relevant stakeholders.
- Investigate and resolve data quality issues, performing root cause analysis to address underlying problems.
- Maintain comprehensive documentation of data quality processes, standards, and remediation strategies.
- Develop and implement automated data quality checks and validation processes.
- Conduct training sessions to educate team members on data quality best practices and standards.
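To give a flavor of the automated data quality checks described above, here is a minimal Python sketch. The column names, rules, and sample records are illustrative assumptions, not part of the role description; in practice such checks would typically run inside an ETL pipeline (e.g. as an Airflow task) against real tables.

```python
# Illustrative data quality checks: not-null and uniqueness validation.
# Column names ("id", "email") and the sample rows are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or empty."""
    return [i for i, row in enumerate(rows) if row.get(column) in (None, "")]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

def run_quality_report(rows):
    """Run each check and collect its violations under a rule name."""
    return {
        "email_not_null": check_not_null(rows, "email"),
        "id_unique": check_unique(rows, "id"),
    }

if __name__ == "__main__":
    sample = [
        {"id": "1", "email": "a@example.com"},
        {"id": "2", "email": ""},               # violates not-null rule
        {"id": "2", "email": "c@example.com"},  # violates uniqueness rule
    ]
    print(run_quality_report(sample))
```

A report like this can feed the monitoring and metrics responsibilities above: each rule's violation list becomes a trackable data quality metric.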
Qualifications:
- Bachelor's or higher degree in Computer Science, Information Technology, or a related field.
- Minimum of 2-4 years of hands-on experience in data engineering roles.
- Proficient in languages such as SQL, Python, and/or Java.
- Expertise in database systems such as PostgreSQL, MySQL, and/or NoSQL databases.
- Experience with big data technologies like Hadoop, Spark, or similar frameworks.
- Hands-on experience with ETL tools such as Apache Airflow, NiFi, Talend, or Informatica.
- Familiarity with cloud platforms, especially AWS, Azure, or Google Cloud.
- Strong understanding of data warehousing concepts and experience with platforms, primarily PostgreSQL with Citus, as well as Snowflake, Redshift, or BigQuery.
- Proficient in using version control systems like Git for code and configuration management.
- Strong analytical and problem-solving skills, with an ability to think critically and adapt to evolving data challenges.