Remote Sr. Hadoop Developer

247Hire


Job Location: Jacksonville, FL - USA

Monthly Salary: Not Disclosed
Posted on: Yesterday
Vacancies: 1 Vacancy

Job Summary

  • Application Deadline: Feb. 28, 2026
  • Jacksonville
  • On-site
  • Hourly salary: $65

Job Description

Software Guidance & Assistance, Inc. (SGA) is searching for a Remote Sr. Hadoop Developer for a CONTRACT assignment with one of our premier Healthcare Services clients.

Top Skills Needed:

  • Hadoop
  • Spark
  • Recent HBase experience
  • Experience with NoSQL databases (e.g., MongoDB, PostgreSQL)
  • Spark/Kafka streaming
  • Scala
  • DB2

Experience with Snowflake features such as data clustering, caching, and query optimization is highly desirable.

Responsibilities:
A Hadoop developer is responsible for the design, development, and operation of systems that store and manage large amounts of data. Most Hadoop developers have a computer software background and hold a degree in information systems, software engineering, computer science, or mathematics.

IT Developers are responsible for the development, programming, and coding of Information Technology solutions. They are responsible for documenting detailed system specifications, participating in unit testing and maintenance of planned and unplanned internally developed applications, and evaluating and performance-testing purchased products. IT Developers are responsible for including IT Controls to protect the confidentiality, integrity, and availability of the application and the data it processes or outputs. IT Developers are assigned to moderately complex development projects.

  • Write code for moderately complex system designs. Write programs that span platforms. Code and/or create Application Programming Interfaces (APIs).
  • Write code for enhancing existing programs or developing new programs.
  • Review code developed by other IT Developers.
  • Provide input to and drive programming standards.
  • Write detailed technical specifications for subsystems. Identify integration points.
  • Report missing elements found in system and functional requirements and explain impacts on subsystem to team members.
  • Consult with other IT Developers, Business Analysts, Systems Analysts, Project Managers, and vendors.
  • Scope the time, resources, etc. required to complete programming projects. Seek review of estimates from other IT Developers, Business Analysts, Systems Analysts, or Project Managers.
  • Perform unit testing and debugging. Set test conditions based upon code specifications. May need assistance from other IT Developers and team members to debug more complex errors.
  • Support the transition of the application throughout the Product Development life cycle. Document what has to be migrated. May require more coordination points for subsystems.
  • Research vendor products and alternatives. Conduct vendor product gap analysis and comparison.
  • Accountable for including IT Controls and following standard corporate practices to protect the confidentiality, integrity, and availability of the application and the data it processes or outputs.
  • The essential functions listed represent the major duties of this role; additional duties may be assigned.
  • High Critical Collections supporting Customer Connect (Customer Collection, Episodes Collection, Rewards Collection)
  • All Collections supporting Provider Vista and Provider Link (Provider Attribution, Caregaps, Codinggaps, Medical Management, Practice Summary Dashboard collections)
  • Mandates/Regulatory (Continuity of Care, State Transparency)


Required Skills:

  • Related Bachelor's degree or related work experience
  • 5+ years of related work experience; professional experience with technical design and coding in the IT industry
  • Great verbal and written communication
  • Experience as a Lead using Cloudera Data Platform (CDP)
  • Good experience working on cloud-based platforms. Well-versed in cloud architecture, deployment, and management of big data applications, with a strong understanding of cloud security, scalability, and cost optimization. Experience with cloud-based storage services such as Amazon S3, Azure Blob Storage, etc.
  • Working knowledge of Snowflake architecture, data modeling, and data warehousing best practices. Hands-on experience working with Snowflake.
  • Ability to analyze existing Spark code, identify areas of improvement, and refactor the code to leverage Snowflake's data warehousing capabilities.
  • Experience with data transformation data quality and data validation.
  • Experience with and understanding of unit testing, release procedures, coding design and documentation protocols, and change management procedures
  • Proficiency using version-control tools
  • Thorough knowledge of Information Technology fields and computer systems
  • Demonstrated organizational, analytical, and interpersonal skills
  • Flexible team player
  • Ability to manage tasks independently and take ownership of responsibilities
  • Ability to learn from mistakes and apply constructive feedback to improve performance
  • Must demonstrate initiative and effective independent decision-making skills
  • Ability to communicate technical information clearly and articulately
  • Ability to adapt to a rapidly changing environment
  • In-depth understanding of the systems development life cycle
  • Proficiency programming in more than one object-oriented programming language
  • Proficiency using standard desktop applications such as MS Suite and flowcharting tools such as Visio
  • Proficiency using debugging tools
  • High critical thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy
  • Hadoop
  • Spark
  • Recent HBase experience
  • Experience with NoSQL databases (e.g., MongoDB)
  • Spark/Kafka streaming
  • Scala
  • DB2


Preferred Skills:

  • Experience with Snowflake features such as data clustering, caching, and query optimization is highly desirable.
  • Experience with migrating existing Spark code to write data to Snowflake
  • Familiarity with Snowflake's APIs and SDKs for seamless integration with Spark.
  • Ability to design and implement ETL workflows that leverage Snowflake's compute capabilities, including but not limited to Snowflake's SQL, Python, and Scala APIs
  • Prior firm experience preferred
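For candidates unfamiliar with the Spark-to-Snowflake write path named in the preferred skills, a minimal sketch of the kind of code involved is below. It assumes the public Snowflake Connector for Spark (`net.snowflake.spark.snowflake`) and PySpark; all connection values, table names, and the `snowflake_options` helper itself are illustrative placeholders, not part of this posting.

```python
# Minimal sketch: building the option map the Snowflake Connector for Spark
# expects, then (in comments) the DataFrame write that would use it.
# All account/user/warehouse values are hypothetical placeholders.

def snowflake_options(account: str, user: str, warehouse: str,
                      database: str, schema: str) -> dict:
    """Assemble standard Snowflake Spark connector options (sfURL, sfUser, ...)."""
    return {
        "sfURL": f"{account}.snowflakecomputing.com",
        "sfUser": user,
        "sfWarehouse": warehouse,
        "sfDatabase": database,
        "sfSchema": schema,
    }

opts = snowflake_options("my_account", "etl_user", "ETL_WH", "ANALYTICS", "PUBLIC")

# In a real Spark job (pyspark and the connector on the classpath, plus
# credentials), the write itself would look roughly like:
#   (df.write
#      .format("net.snowflake.spark.snowflake")
#      .options(**opts)
#      .option("dbtable", "TARGET_TABLE")
#      .mode("append")
#      .save())
print(opts["sfURL"])  # prints my_account.snowflakecomputing.com
```

Migrating existing Spark code, as the posting describes, largely means replacing HDFS/Parquet sinks with a `format(...).options(...)` write like the commented one above.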



Required Experience:

Senior IC


