Big Data Engineer GCP, Hadoop Ecosystem


Job Location:

Toronto - Canada

Monthly Salary: Not Disclosed
Experience Required: 5 years
Posted on: 13 hours ago
Vacancies: 1 Vacancy

Job Summary

Job Title: Big Data Engineer - GCP Cloud
Location: Toronto (Hybrid)

Years of Experience: 6-8 years

Top 3 Required Skills:
AWS



Preferred Skills:

We are looking for an experienced Big Data Engineer to support end-to-end data platform engineering, deployment, validation, and certification activities for the Trade Central program. The role requires strong hands-on experience with Hadoop ecosystem components, NiFi workflows, Spark processing, SQL development, and platform-level changes. Experience with DevOps tools and cloud platforms (preferably GCP) is essential.

Key Responsibilities
Design, develop, and optimize data pipelines using Hadoop, Hive, Spark, and NiFi.
Perform platform activities including:
  • Deploying and upgrading images
  • Applying application-level changes
  • Validating and certifying platform updates
  • Conducting functional and integration testing
Develop and execute complex SQL queries for data processing, validation, and troubleshooting.
Manage data ingestion, transformation, and orchestration workflows.
Work with Kafka for event streaming and message processing.
Work with engineering teams to support Trade Central platform components.
Utilize JIRA for task tracking and Xray for test case creation and management.
Use Jenkins, Bitbucket, and Confluence as part of the DevOps toolchain.
Collaborate with cross-functional teams on GCP-based deployments and data engineering tasks.
Troubleshoot production issues and support performance tuning of jobs and pipelines.


Required Skills:

Experience (Years): 4-6

Essential Skills:
  • Work with project teams throughout the organization to design, implement, and manage CDN infrastructure using Akamai, ensuring high availability, performance, and scalability for customer-facing applications and business processes.
  • Handle multiple priorities and assignments with excellence and precision.
  • Be part of a 24/7/365 organization (some after-hours support is expected as part of a normal on-call rotation).
  • Directly support line-of-business development teams, providing guidance on implementation and changes for customer-facing applications.
  • Develop and maintain security protocols and measures to protect CDN infrastructure from cyber threats.
  • Monitor and analyze network performance, identifying and resolving issues to optimize content delivery for critical applications.
  • Collaborate with cross-functional teams to integrate Akamai CDN solutions with existing systems and applications.
  • Collaborate with information security teams to implement DDoS protection strategies and other security measures in the CDN.
  • Provide technical support and guidance to clients and internal teams regarding CDN and security best practices.
  • Work closely with vendor and professional-services teams on delivery-related activities and strategy.

Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
  • Strong understanding of network protocols (HTTP/HTTPS, DNS, TCP/IP).
  • Proven experience as a CDN Engineer or in a similar role, with in-depth knowledge of Content Delivery Network technologies, including caching, load balancing, and content optimization.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and teamwork abilities.
  • Experience supporting 24/7/365 customer-facing applications at enterprise scale.
  • Awareness of and experience with cybersecurity tools and practices such as firewalls, intrusion detection/prevention systems, and encryption.
  • Proficiency in scripting and automation (e.g., Python, Bash) is a plus.
  • Relevant certifications (e.g., CISSP, CEH) are a plus but not required.


Company Industry

IT Services and IT Consulting

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala