Job Description:
General Skills: Must have experience deploying and working with big data technologies such as Hadoop, Spark, and Sqoop
Experience with streaming frameworks like Kafka.
Experience designing and building ETL pipelines using NiFi
Highly proficient in OO programming (Python, PySpark, Java, and Scala)
Experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Spark, Hive, Impala)
Proficiency with the Linux/Unix command line, Unix shell scripting, SQL, and at least one scripting language
Experience designing and implementing large, scalable distributed systems
Ability to debug production issues using standard command line tools
Create design documentation and maintain process documents
Ability to debug Hadoop/Hive job failures
Ability to administer Hadoop using Cloudera
Must-have skills: PySpark, Python, Hadoop, Kafka, Linux/Unix, SQL.
Nice-to-have skills: Cloud technologies such as Databricks, AWS, Azure, and GCP.
Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as fake websites or unsolicited emails claiming to be from the company. These emails may ask recipients to provide personal information or make payments as part of an illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks applicants for money or payments at any point in the recruitment process, nor does it ask a job seeker to purchase IT or other equipment on its behalf. More information on employment scams is available here.
Full-Time