Salary Not Disclosed
1 Vacancy
Resolver is a high-growth SaaS company whose intuitive no-code platform gives our customers a clear picture of their risks so they can make quick and effective decisions. As a part of the Resolver team, your work will help transform risk management into risk intelligence so organizations can protect people and assets and deliver on their purpose.
We are ambitious in both our mission and our culture. As a business within Kroll, we offer an innovative, non-hierarchical work environment blended with the stability and financial security of an enterprise. Resolver has also been named one of Canada's Great Places to Work six years in a row!
By combining Artificial and Human Intelligence, Resolver delivers 24/7/365 safety by continually fighting the weaponization of communications, whatever the source, the language, or the online harm.
The Role
We are looking for English-speaking Content Moderators with a keen interest in Trust & Safety and risk-based content moderation, and an excellent understanding of harmful content across social media platforms.
The role will include identification and classification of content containing core risks, such as hateful and abusive chatter, bad actor profiles, and sector-specific risk, as well as reviewing imagery, video, and audio to assess content type and narrative.
Accuracy is essential in this role, and the ability to process large data volumes whilst maintaining high-quality output is key.
Criteria
Full-Time