Data Integration Engineer ETL, Secure Data Exchange


Job Location: Toronto - Canada

Monthly Salary: CAD 10 - 10
Experience Required: 5 years
Posted on: 11 hours ago
Vacancies: 1 Vacancy

Job Summary

Job Description:

  • 5 years of experience in data engineering or ETL development in enterprise environments.
  • Strong experience with data pipeline orchestration tools (e.g., Apache Airflow, AWS Step Functions).
  • Proficiency in SQL, Python, and data transformation frameworks.
  • Experience with MFT tools and secure data exchange protocols.
  • Familiarity with email-based data ingestion and automation strategies.
  • Knowledge of CDC frameworks and data replication techniques.
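As an illustrative sketch only (not part of the posting), the "SQL, Python, and data transformation frameworks" bullet above describes day-to-day ETL work along these lines; every function and field name here is hypothetical:

```python
from datetime import datetime, timezone

def extract():
    # Hypothetical rows as they might arrive from an upstream source.
    return [
        {"id": "1", "amount": "10.50", "ts": "2024-01-05T12:00:00"},
        {"id": "2", "amount": "bad", "ts": "2024-01-06T08:30:00"},
    ]

def transform(rows):
    # Cast types and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({
                "id": int(row["id"]),
                "amount": float(row["amount"]),
                "ts": datetime.fromisoformat(row["ts"]).replace(tzinfo=timezone.utc),
            })
        except (KeyError, ValueError):
            continue  # bad-record quarantine would go here
    return clean

def load(rows):
    # A real pipeline would write to a warehouse; here we just count.
    return len(rows)

if __name__ == "__main__":
    print(load(transform(extract())))
```

In an orchestrated deployment, each of these steps would typically become a separate task in a tool such as Apache Airflow or AWS Step Functions.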


Required Skills:

  • Hands-on experience architecting Guidewire ClaimCenter solutions, including customization and integration. Guidewire certification is a plus.
  • Technologies of interest: Guidewire Cloud, Salesforce CRM, legacy modernization, and AWS.
  • Proven knowledge and architecture experience across architecture styles (digital, micro/macro/monolithic services, APIs), application integration, service-oriented architecture, event-driven architecture, application architecture, distributed architecture, and data architecture, with experience in modelling languages and techniques.
  • Can quickly comprehend the functions and capabilities of new technologies, and can understand both the long-term (big picture) and short-term perspectives of situations.
  • Strong technical background (platforms, languages, protocols, frameworks, open source, etc.). Experience with architecture frameworks (TOGAF) and architecture certifications a plus.
  • Experience engaging and supporting claims teams and understanding their day-to-day operations in the P&C insurance space.
  • Open and clear communication with the business, telecom, infrastructure, security, audit, vendors, and software engineering teams.
  • Driven by challenges and proactive; keeps current on security standard methodologies and understands the impact they can have on a solution.
  • Comfortable working in a constantly evolving technological environment; an excellent teammate who demonstrates leadership. Comfortable speaking with all levels of the organization and with different audiences.


Company Industry

IT Services and IT Consulting

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala