GCP Data Engineer (W2 Position)

Megan Soft Inc


Job Location:

Dearborn, MI - USA

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary

We have a W2 job opportunity for a Data Engineer role; the job description is given below. Please forward your updated profile.

Role: Data Engineer (W2 Position)

Location : Dearborn MI (Hybrid)

Duration: 12 Months

Experience: 8 Years

Note: Please don't share CPT, OPT, or OPT EAD resumes.

JD:

  • Bachelor's degree in a related field (e.g., Business, Engineering, Information Systems, Computer Science, Mathematics).
  • 7 years of progressive responsibility in Information Technology (IT).
  • 5 years of experience solving business problems by delivering IT solutions in the design, engineering, and manufacturing domains.
  • Experience with cloud platforms such as Google Cloud Platform (GCP).
  • Experience as a Product Manager/Product Owner in a Product-Driven Organization (PDO) would be beneficial.
  • Excellent verbal and written communication skills with the ability to communicate effectively with all levels of management.
  • Strong interpersonal skills, demonstrating professionalism in all actions.
  • Good analytical skills, a process-driven work style, and the ability to think strategically.
  • Ability to develop detailed customer, IT, and data-driven process flows.
  • Strong team player with proven ability to work cross-functionally.

Skills Required:

  • IT Solutions, GCP, Cloud Computing, Data Warehousing, Data Management

Experience Required:

  • Engineer 3: 7 years of Data Engineering work experience

Thanks & Regards

Praveen

Megan Soft Inc.

Direct No: 1


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala