Job Title: Data Engineer
Location: Jersey City NJ 07311 - Hybrid
Duration: 06 Months
Summary: - Build a comprehensive QA framework in Azure Databricks that gathers, stores, and processes large volumes of data for QA analysis and for detecting bugs in ETL code.
Responsibilities: - Develop QA ETL pipelines, ensure data quality, and optimize data storage solutions. Skills: Python, SQL, and experience with cloud data tools (e.g., AWS Glue, Azure Data Factory, Azure Databricks). Work with the QA analysts to test code and data as part of the SDLC process.
- The Security Data Operations team is looking for a QA Data Engineer to design and develop a QA framework and pipeline for the CyberDW on the *** Data Lakehouse.
- This role is a necessary part of the overall SDLC effort to centralize Information Security data and to test the accuracy of data storage and transformation as the ETL process moves through the medallion architecture.
- To achieve success, this individual will be required to analyze and develop the ingestion pipeline used to test the data feeds from the raw layer, through transformation, to the target layer (a sketch of such a check appears after this list).
- This individual will function as a development resource in the Information Security Data Operations team and be a technical liaison between IT and the CDAO Data Engineering teams.
- They will be responsible for creating a mechanism that opens JIRA tickets for developers to remediate and fix their code when bugs arise (see the second sketch after this list).
- These QA processes will be built to be flexible, adaptable, and scalable.
- They will be responsible for timely follow-ups and routine updates to the CyberDW sprint schedule, as well as adherence to the internal IT agile SDLC processes.
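To illustrate the kind of QA check such a framework might include, below is a minimal PySpark sketch, runnable on Azure Databricks, that reconciles row counts and flags NULLs between a raw (bronze) feed and its transformed (silver) table. The table and column names are hypothetical placeholders, not the actual CyberDW schema.

```python
# Minimal sketch of medallion-layer QA checks; table/column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def reconcile_counts(bronze_table: str, silver_table: str) -> dict:
    """Compare row counts between the raw feed and its transformed table
    to flag records dropped or duplicated by the ETL code."""
    bronze = spark.table(bronze_table).count()
    silver = spark.table(silver_table).count()
    return {"bronze": bronze, "silver": silver, "passed": bronze == silver}

def null_check(table: str, required_columns: list) -> dict:
    """Return the NULL count for each required column that contains any NULLs."""
    row = (
        spark.table(table)
        .select([F.sum(F.col(c).isNull().cast("int")).alias(c) for c in required_columns])
        .first()
    )
    return {c: row[c] for c in required_columns if row[c]}

# Hypothetical usage against placeholder CyberDW tables.
result = reconcile_counts("cyberdw.bronze_events", "cyberdw.silver_events")
nulls = null_check("cyberdw.silver_events", ["event_id", "event_time", "source"])
```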
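And a sketch of the JIRA ticketing mechanism, assuming the standard JIRA REST API for issue creation; the instance URL, project key, and credentials below are placeholders, not the team's actual configuration.

```python
# Minimal sketch: open a JIRA Bug when a QA check fails (placeholders throughout).
import requests

JIRA_URL = "https://jira.example.com"   # placeholder instance URL
AUTH = ("qa-bot", "api-token")          # placeholder credentials

def open_bug_ticket(summary: str, description: str) -> str:
    """File a Bug in JIRA describing a failed QA check; returns the issue key."""
    payload = {
        "fields": {
            "project": {"key": "CYBERDW"},   # hypothetical project key
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue",
                         json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]

# Hypothetical usage: ticket a reconciliation failure for the developers to fix.
key = open_bug_ticket(
    "QA: row-count mismatch in silver_events",
    "Bronze=1000, Silver=998; two records dropped during transformation.",
)
```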