Data Engineer BigQuery DataProc DataFlow Hybrid - Dallas TX

Vision It US

Job Location

USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID: 2700097

Additional Job Details:

  • The ideal resource would be local to the Dallas, TX area so they can be in the office 1-2 days per week.
  • They would have extensive experience with the BigQuery, DataProc, and DataFlow platforms on Google Cloud Platform.
  • Experience with Azure Databricks is an added advantage (not mandatory).
  • Programming experience with Python, shell scripting, PySpark, and other data programming languages.
  • Programming experience with the Apache Beam Java SDK for building effective, heavy data pipelines and deploying them in GCP DataFlow.
  • CI/CD processes to deploy these pipelines in GCP.

Description:

  • Advanced working SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
  • Extensive experience with the BigQuery, DataProc, and DataFlow platforms on Google Cloud Platform. Experience with Azure Databricks is an added advantage (not mandatory).
  • Experience with cluster capacity configuration and cloud optimization to meet application demand.
  • Programming experience with Python, shell scripting, PySpark, and other data programming languages (a PySpark sketch follows this list).
  • Programming experience with the Apache Beam Java SDK for building effective, heavy data pipelines and deploying them in GCP DataFlow, plus CI/CD processes to deploy these pipelines in GCP.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytical skills related to working with data visualization, dashboards, and metrics.
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable big data stores.
  • Familiarity with deployment tools such as Docker and with building CI/CD pipelines.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • 8 years of experience in software development and data engineering.
  • Bachelor's degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field; a postgraduate/master's degree is preferred.
  • Experience in Machine Learning and Data Modeling is a plus.
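
As a concrete illustration of the PySpark and BigQuery items above, here is a minimal sketch of a Dataproc-style PySpark job that reads a BigQuery table, aggregates it, and writes the result back. It assumes the spark-bigquery connector is available on the cluster; the project, dataset, table, and bucket names are hypothetical placeholders, not values from this posting.

```python
# Minimal PySpark-on-Dataproc sketch: read a BigQuery table, aggregate, write back.
# Assumes the spark-bigquery connector is available on the cluster; all resource
# names (project, dataset, tables, bucket) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

# Read the source table through the BigQuery connector.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# Simple transformation: daily revenue per region.
daily = (
    orders.groupBy("region", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

# Write the result back to BigQuery; the connector stages data through GCS.
(
    daily.write.format("bigquery")
    .option("table", "my-project.analytics.daily_revenue")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("overwrite")
    .save()
)
```

A job like this would typically be submitted to a Dataproc cluster with `gcloud dataproc jobs submit pyspark`.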

What are the top 3 skills needed/required?

  • Extensive experience with the BigQuery, DataProc, and DataFlow platforms on Google Cloud Platform. Experience with Azure Databricks is an added advantage (not mandatory).
  • Programming experience with Python, shell scripting, PySpark, and other data programming languages.
  • Programming experience with the Apache Beam Java SDK for building effective, heavy data pipelines and deploying them in GCP DataFlow, plus CI/CD processes to deploy these pipelines in GCP (a minimal pipeline sketch follows this list).
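
The posting names the Apache Beam Java SDK; purely to illustrate the pipeline shape and Dataflow deployment, the following minimal sketch uses the Beam Python SDK (the same structure applies in Java). The GCS path, BigQuery table, project, region, and bucket values are hypothetical placeholders.

```python
# Minimal Apache Beam sketch (Python SDK shown; the role names the Java SDK, where the
# same pipeline shape applies): read CSV lines from GCS, parse them, write to BigQuery.
# The runner/project/region/temp_location values and all resource names are
# hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line):
    """Turn one CSV line 'order_id,region,amount' into a BigQuery row dict."""
    order_id, region, amount = line.split(",")
    return {"order_id": order_id, "region": region, "amount": float(amount)}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",          # use "DirectRunner" for local testing
        project="my-project",
        region="us-central1",
        temp_location="gs://my-temp-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv")
            | "ParseLines" >> beam.Map(parse_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:sales.orders",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

In a CI/CD setup, a pipeline like this is typically packaged and launched against the DataflowRunner as part of the deployment step.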

What makes a resource profile stand out to you?

  • Previous experience and tenure with prior clients.
  • Good hands-on experience on prior assignments.

What will this person's day-to-day responsibilities be?

  • Handling business requirements and delivering them following Agile methodology.

How will they contribute to the project?

  • Handling business requirements and delivering them following Agile methodology.

If this is a hybrid or in-office role, how many days a week will the resource need to come into the office?

  • 1 or 2 days per week.
  • Please note that resources who will be working in Bentonville, AR; Reston, VA; or some Texas locations must have a VendorSAFE background check completed.

Does this contract have the opportunity to extend or convert to an FTE?

  • Yes.
  • Along with the required skills, it would be great to see profiles of candidates who are currently working or have recently finished their assignment.
  • We are also looking for candidates with at least 7-8 years of experience in Big Data.

Required Skills: Data Analysis, Data Warehouse
Additional Skills: Data Engineer. This is a high-priority, proactive requisition.

Employment Type

Full Time

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala
