Software Engineering III

EQH


Job Location: Charlotte, NC - USA

Monthly Salary: Not Disclosed
Posted on: Yesterday
Vacancies: 1 Vacancy

Job Summary

Requires a Master's or foreign equivalent degree in Computer Science, Electrical Engineering, or a related technical field, plus at least 5 years of experience as a Software Engineer or in a related occupation involving the development of Big Data solutions, including performing design, data ingestion, and pipeline development in a Hadoop DataLake.

Experience must include: sourcing and ETL development support to build multiple data products for analytical and actuarial purposes; Hadoop technologies (Sqoop, Python, Databricks); Azure data platform handling; building predictive models using sentiment scores to forecast market trends and assess correlations among market movements; HDFS, Hive, Impala, Sqoop, Spark, Python, and Azure; and ETL, Vertica, MapReduce, Spark, Kafka, Hive, Impala, Flume, Storm, ZooKeeper, Java, PL/SQL, Oracle, Teradata, Scala, MySQL, and Eclipse.

40 hours/week. Salary: $154,669 - $165,000. Direct applicants only. A telecommuting/hybrid work schedule may be permitted within a commutable distance from the worksite. Applicants should send a resume (Ref. job code DP1725) or search the job title. EOE M/F/D/V.

Equitable Financial Life Insurance Company seeks a Software Engineering III for its Charlotte, NC location.

Duties:

Provide Big Data solutions for DataLake design, data ingestion, and processing on a Hadoop cluster using Spark, MapReduce, Hive/HBase, Flume, Kafka, and Syncsort, along with programming languages such as Python and Scala.

Work with the Customer Data Product Owner to gather and translate business requirements, studying their priority and criticality to advise on an order of deliverables that fits the customer while keeping the focus on key deliverables intact.

Work on the Databricks platform to execute and optimize data pipelines end to end.

Responsible for data gathering and analysis; systems design and implementation; logical design; detailed design; ensuring data security in the design; and system evaluation, integration, vetting, modification, troubleshooting, and optimization.

Serve as subject matter expert (SME) for DataLake infrastructure and services.

Maintain current DataLake applications and develop procedures where necessary to improve the environment. Comply with all security and audit standards.

Provide technical expertise for the development and implementation of DataLake solutions.

Liaise with business unit customers and vendors, depending on assignment, and interact with IT senior executives.

Responsible for design specifications of one or more large or critical applications or systems.

Provide technical, functional, and systems design for all work related to a system development project.

Lead the process of compiling, analyzing, designing, testing, and prioritizing system design components and implementation.

Assist with technical testing, ensuring that system and unit tests are performed, and review the test results.

Provide production support for new/existing systems of high complexity and scope.

Use Linux, Hadoop, Sqoop, Hive, Impala, Tableau, Python, and Databricks to carry out job duties.


Required Experience:

IC


Key Skills

  • React Native
  • AI
  • Enterprise Software
  • React
  • Node.js
  • Redis
  • AWS
  • Software Development
  • iOS
  • Team Management
  • Product Development
  • Mobile Applications

About Company


A financial services company with a legacy of helping people look forward with courage, strength, and wisdom.
