Overview
Opening available for Data Engineer at Cotiviti, South Jordan, UT:
Responsibilities
- Design and maintain scalable data pipelines using an in-house application framework to perform data profiling, ingestion, transformation, and data load into Hadoop systems for end users.
- Process large datasets efficiently using Spark, Scala, Python, and SQL.
- Collaborate with product owners, stakeholders, and subject matter experts to gather data requirements and translate them into GDF and CDF formats based on product-based mapping and delivery expectations.
- Generate reports, perform data analytics, analyze data trends, and incorporate findings to produce an accurate data flow for multiple downstream products.
- Conduct rigorous quality checks by performing data analysis and validation to maintain data integrity.
- Utilize big data technologies such as Hadoop, Spark, Hive, and Impala to manage and process high-volume datasets.
- Develop ETL workflows to ingest, transform, and load structured data using relational database management platforms such as Oracle, MS SQL Server, and Access.
- Ensure compliance with HIPAA rules while handling sensitive healthcare information.
- Provide mainframe-based solutions by diagnosing and resolving complex data issues using SQL, COBOL, JCL, and CICS.
- Work with data integration teams to automate manual processes using Control-M (CTM) and enhance productivity through GitHub, ServiceNow, Confluence, Microsoft Project, and Microsoft Visio.
- Use Software Development Life Cycle (SDLC) methodologies, including Waterfall and Agile-Scrum, to manage development cycles, document system flows, and maintain up-to-date knowledge repositories for effective team sharing and collaboration.
Qualifications
Requires a bachelor's degree (or higher) or foreign equivalent degree in Computer Science, Information Technology, or a related field, plus 5 years of experience designing and maintaining scalable data pipelines. Also requires 5 years of experience with: data aggregation, standardization, linking, quality check mechanisms, and reporting; SQL development; gathering requirements and communicating timelines with internal business partners; relational database management systems (Oracle and MS SQL Server); big data technologies including Hadoop, Spark, Hive, and Impala; developing ETL workflows to ingest, transform, and load structured data; processing large datasets; debugging data issues, identifying root causes, and fixing them in a fast-paced environment; scripting languages including COBOL and Python; working in an Agile environment; and healthcare data in a data operations role. Telecommuting is available anywhere in the U.S. Company headquarters are located at 10701 South River Front Pkwy, Suite 200, South Jordan, UT 84095.
Base compensation starts at $121,405. Specific offers are determined by various factors such as experience, education, skills, certifications, and other business needs. Cotiviti offers team members a competitive benefits package to address a wide range of personal and family needs, including medical, dental, vision, disability, and life insurance coverage; 401(k) savings plans; paid family leave; 9 paid holidays per year; and 17-27 days of Paid Time Off (PTO) per year, depending on specific level and length of service with Cotiviti. #LI-DNI #immigration
Required Experience:
IC