Helping Schools Combat Image-Based Abuse With Saasyan Safe Image AI
Safe Image AI Detects Images Containing Sexual Content in Online Drives
We recently announced the official launch of Saasyan Safe Image AI.
Saasyan Safe Image AI is available in our cloud-based, online student safety solution, Saasyan Assure.
The Saasyan Safe Image AI function detects images containing sexual content in a student's online drive, making the detection and prevention of image-based abuse, more commonly referred to as revenge porn, an attainable and manageable goal for education professionals.
"Unfortunately, there is an extremely high prevalence of image-based abuse in schools today. With the launch of Safe Image AI, Saasyan delivers a solution that enables schools to turn the tide on this concerning epidemic," says Sidney Minassian, CEO of Saasyan.
With Saasyan Safe Image AI, teachers, counsellors and IT professionals can:
- Be alerted when a student's online drive contains sexually explicit images;
- Implement early intervention for the victims of image-based abuse, helping to prevent suicide and self-harm;
- Quickly identify the perpetrator of the abuse and prevent a repeat offence; and
- Improve the mental wellbeing of students by decreasing the prevalence of revenge porn and other image-based abuse in schools.
Support for Google Workspace & Microsoft 365
Saasyan Safe Image AI is currently available for online drives in Google Workspace and Microsoft 365.
Contact us to learn more.