
Rosetta, Facebook's AI capable of identifying offensive memes


Facebook has announced that it has developed an artificial intelligence that will help moderators identify offensive memes published by users.

Social networks now take the monitoring of user-published content seriously in order to prevent insults and humiliation. Last April, Facebook revealed its previously secret rules for censoring publications, so that anyone can see what is and is not allowed on the platform.

Thus, Facebook's community standards prohibit violent or criminal behavior of any kind, as well as the encouragement of suicide or self-harm, nudity and sexual exploitation, and bullying or harassment. They also forbid behavior that undermines integrity and authenticity, and require respect for intellectual property. In addition, the publication of questionable content, such as hate speech or incitement to hatred, violence, or cruel and insensitive content, is not allowed.

To prevent this type of behavior, Facebook employs a staff of human moderators, who are responsible for reviewing content reported by other users and ensuring that community standards are followed.

However, given the millions of pieces of content shared on Facebook every day, it is not easy to detect those that violate the terms and conditions. This is where Rosetta comes in: the AI that Facebook has created to identify offensive memes, photos, and videos automatically and intelligently.

Rosetta can extract text from more than one billion images published daily on Facebook and Instagram, in a variety of languages and in real time. It then feeds the extracted text into a model trained to understand the text and the image as a whole, allowing it to determine whether the content is harmless or violates community standards. This is a great help in keeping inappropriate publications from being shared on the platform.
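The two-stage pipeline the article describes (extract text from the image, then classify the text and image together) can be sketched roughly as follows. This is a minimal illustrative sketch only: the function names, the toy keyword list, and the string-based "OCR" stand-in are all hypothetical, whereas the real Rosetta uses trained neural models for both text extraction and classification.

```python
# Hypothetical sketch of a two-stage moderation pipeline like the one
# described above. Both stages are toy stand-ins, not Facebook's code.

BANNED_TERMS = {"hate", "violence"}  # toy stand-in for a trained text classifier


def extract_text(image_bytes: bytes) -> str:
    """Stand-in for the text-extraction (OCR) stage.

    A real system would run text detection and recognition models
    over the image pixels; here we just decode bytes for illustration.
    """
    return image_bytes.decode("utf-8", errors="ignore")


def violates_policy(image_bytes: bytes) -> bool:
    """Stand-in for the joint text-and-image classification stage."""
    text = extract_text(image_bytes).lower()
    # A real model would score text and image features together;
    # this toy version only checks the extracted text for banned terms.
    return any(term in text for term in BANNED_TERMS)


print(violates_policy(b"funny cat meme"))       # harmless -> False
print(violates_policy(b"spread the violence"))  # flagged  -> True
```

The key design point, as the article notes, is that extraction and classification are separate steps: the text is pulled out first, then judged in combination with the image rather than in isolation.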
