GCP Data Engineer - Big Data, Hadoop, Power BI


Job Location: Mississauga, Canada
Monthly Salary: Not Disclosed
Experience Required: 5 years
Posted on: 4 hours ago
Vacancies: 1 Vacancy

Job Summary

Job Title: Data Engineer - GCP Big Data & Hadoop


Experience: 8-10 Years
Skills: Big Data & Hadoop Ecosystems, Google Cloud Data Engineering



Key Responsibilities

  • Collect, process, and analyze data from multiple sources, including databases, spreadsheets, APIs, and streaming systems.

  • Design, develop, and maintain dashboards and reports using Power BI, Tableau, Looker, or Excel.

  • Work closely with business and technical stakeholders to gather requirements and translate them into scalable data solutions.

  • Design and implement high-level and detailed technical architectures on Google Cloud Platform (GCP).

  • Develop and maintain robust data ingestion frameworks for structured and unstructured data sources.

  • Lead and execute the migration of on-premises Hadoop clusters to GCP.

  • Build, optimize, and manage data pipelines using GCP services such as BigQuery, Dataflow, Dataproc, Cloud Composer, Cloud Functions, and Cloud Data Fusion (see the pipeline sketch after this list).

  • Perform root cause analysis on data issues and present actionable insights to cross-functional teams.

  • Ensure data quality, integrity, security, and governance across all data platforms.

  • Monitor and optimize the performance, cost, and reliability of GCP data solutions.

  • Recommend optimal GCP architecture and best practices to customers and internal teams.
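
To illustrate the kind of pipeline work described above, here is a minimal sketch of a Dataflow job written with the Apache Beam Python SDK. The project ID, bucket, table name, and schema (example-project, example-bucket, analytics.events) are hypothetical placeholders, not details from this posting.

# Minimal Beam pipeline sketch: read CSV events from Cloud Storage and
# append them to a BigQuery table. All resource names are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv(line):
    """Split one CSV row into a dict matching the BigQuery schema below."""
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "ts": ts}


options = PipelineOptions(
    runner="DataflowRunner",                  # execute on GCP Dataflow
    project="example-project",                # hypothetical project ID
    region="us-central1",
    temp_location="gs://example-bucket/tmp",  # hypothetical staging bucket
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
        | "Parse" >> beam.Map(parse_csv)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )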


Required Qualifications

  • Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.

  • 8-10 years of experience in data engineering, data analysis, or big data solutions.

  • Minimum 3 years of hands-on experience with Google Cloud data services.

  • Strong proficiency in SQL and programming languages such as Java and/or Python.

  • Solid understanding of data modeling, ETL/ELT frameworks, and statistical techniques.

  • Experience working with Hadoop ecosystem tools and big data platforms.

  • Google Cloud Professional Data Engineer certification (mandatory).

  • Excellent communication and documentation skills.


Essential Skills

  • Google Cloud Platform: BigQuery, Dataflow, Dataproc, Cloud Composer, Cloud Functions, Cloud Data Fusion

  • Big Data & Hadoop ecosystem

  • Data ingestion and pipeline design

  • Hadoop-to-GCP migration strategy and execution

  • Data visualization tools: Power BI, Tableau, Looker

  • SQL, Java, Python (see the query sketch after this list)

  • Data quality, governance, and performance optimization
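
To make the combined SQL-plus-Python expectation concrete, here is a minimal sketch that runs a parameterized query with the google-cloud-bigquery client library. The project, dataset, and table names are hypothetical placeholders.

# Run a parameterized BigQuery query from Python and print the results.
# All resource names are hypothetical.
import datetime

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

query = """
    SELECT event, COUNT(*) AS n
    FROM `example-project.analytics.events`
    WHERE ts >= @since
    GROUP BY event
    ORDER BY n DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "since",
            "TIMESTAMP",
            datetime.datetime(2024, 1, 1, tzinfo=datetime.timezone.utc),
        )
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.event, row.n)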


Desirable Skills

  • Prior experience working as a Data Analyst or in analytics-focused projects.

  • Exposure to BI and reporting tools within the GCP ecosystem.

  • Experience presenting insights to business and executive stakeholders.


Soft Skills

  • Strong analytical and problem-solving abilities.

  • Excellent verbal and written communication skills.

  • Ability to explain complex technical concepts in a clear and concise manner.

  • High attention to detail and accuracy.






Company Industry

IT Services and IT Consulting
