Hadoop Developer

My3Tech


Job Location

Charlotte - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID: 2643559

Role: Hadoop Developer/Data Lake Developer

Location: Charlotte, NC (Hybrid; onsite 3 days/week)

Length: 18-month contract

Top requirements:

a. 5 years of Hadoop developer experience

b. Ezflow experience

c. Strong Python and Spark skills

Pluses:

d. SQL Server

e. Testing

As a Cloudera Developer, you will be responsible for developing and maintaining data solutions using the Cloudera platform. You will work closely with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand data requirements and deliver robust, scalable data solutions. Your primary focus will be on designing, developing, and implementing data processing pipelines and data ingestion frameworks. The ideal candidate is adept at using PySpark, Ezflow, and other methods for moving data within a Hadoop environment.

Key Responsibilities:

  • Utilize multiple architectural components in the design and development of client requirements.
  • Maintain, improve, clean, and manipulate data for the operational and/or analytics data systems.
  • Constantly look for better ways of solving technical problems and design solutions without being afraid to challenge the status quo.
  • Document and communicate the information required for deployment, maintenance, support, and business functionality.
  • Adhere to the team's delivery/release process and cadence pertaining to code deployment and release.
  • Demonstrate a proven ability to drive business results with data-based insights.
  • Bring a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes.
  • Design, develop, and maintain data processing pipelines using Cloudera technologies such as Apache Hadoop, Apache Spark, Apache Hive, and Python.
  • Collaborate with data engineers and data scientists to understand data requirements and translate them into technical specifications.
  • Develop and maintain data ingestion frameworks for efficiently extracting, transforming, and loading data from various sources into the Cloudera platform.
  • Optimize and tune data processing jobs to ensure high performance and scalability.
  • Implement data governance and security policies to ensure data integrity and compliance.
  • Monitor and troubleshoot data processing jobs to identify and resolve issues in a timely manner.
  • Perform unit testing and debugging of data solutions to ensure high quality and reliability.
  • Document technical specifications, data flows, and data architecture diagrams.
  • Stay updated on the latest advancements and best practices in Cloudera technologies and big data analytics.

Qualifications:

  • Strong problem-solving skills with an emphasis on product development.
  • Experience working with and creating data architectures.
  • Excellent written and verbal communication skills for coordinating across teams.
  • A drive to learn and master new technologies and techniques.
  • Experience working with AutoSys.
  • 5-7 years of experience with distributed data/computing tools: Hadoop, Hive, MySQL, etc.

Employment Type

Full Time
