ITHACA, N.Y. – In the ever-growing battle against malicious social media content, Cornell University researchers are teaming up with Facebook to study why people share harmful posts and what they intend those posts to accomplish.

Serge Belongie, professor of computer science at Cornell Tech, is studying what he calls “intentonomy” – the complex psycho-emotional landscape lurking behind Facebook and Instagram posts.

Belongie and his team are working with Facebook to define possible posting intentions – from benign to polarizing to hateful – and populate a dataset with examples. The goal is to create and train a machine learning system that can predict intent and, eventually, alert the social network about problematic posts in real time.
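The article does not describe the team's actual model, but the pipeline it outlines (labeled examples of posting intent used to train a classifier that predicts a label for new posts) can be illustrated with a minimal, hypothetical sketch. The intent labels, sample captions, and TF-IDF-plus-logistic-regression model below are stand-ins for illustration only, not the researchers' approach.

    # Hypothetical sketch: a text classifier that maps a post's caption to a
    # coarse intent label. The labels, examples, and features are invented;
    # the actual Cornell/Facebook "intentonomy" dataset and models are not
    # described in this article.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy training data: (caption, intent) pairs with made-up labels.
    posts = [
        ("Check out my new painting!",          "benign"),
        ("Everyone who disagrees is an idiot.", "polarizing"),
        ("People like them don't belong here.", "hateful"),
        ("Happy birthday to my best friend!",   "benign"),
    ]
    captions, intents = zip(*posts)

    # TF-IDF features plus logistic regression: a simple baseline classifier.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(captions, intents)

    # A predicted intent label could feed a real-time review queue.
    print(model.predict(["Those people ruin everything."]))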

“Human nature and politics and tribal behavior, monetary incentives – there’s just a zillion things playing into this,” said Belongie, who received a $1.77 million, three-year grant from Facebook to work on projects related to identifying content with malicious intent. “The best we can do is provide tools so that if someone comes to the table with good faith, they can separate the information from the misinformation.”

In a separate project, Belongie’s team is working on machine-learning approaches to detecting forgeries. People who buy advertisements on Facebook must validate their accounts using identification; Belongie will use his expertise in computer vision – an area of artificial intelligence focused on teaching machines to see as humans do – to develop methods that could determine whether those IDs are fake.

Belongie’s approach will build on his group’s research into using computer vision to recognize fine-grained differences among plants, animals and mushrooms. A similar approach could be useful for finding tiny details revealing forged IDs, such as the wrong kind of comma or apostrophe.
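How such tiny cues would be checked is not spelled out here. As a toy illustration only, assuming the text on an ID has already been extracted by optical character recognition, a system might scan for typographic characters that would not appear on a genuine document; the character list below is invented for the example, and a real system would learn such cues from labeled data.

    # Hypothetical illustration: after OCR has extracted text from an ID image,
    # flag typographic characters (e.g., a curly apostrophe or comma-like quote
    # where a genuine document uses plain characters). The character set is
    # invented for this example.
    SUSPICIOUS_CHARS = {
        "\u2019": "right single quotation mark (curly apostrophe)",
        "\u201A": "single low-9 quotation mark (comma-like quote)",
        "\u00B4": "acute accent used as an apostrophe",
    }

    def suspicious_details(ocr_text: str) -> list[tuple[int, str]]:
        """Return (position, description) for each suspicious character found."""
        return [
            (i, SUSPICIOUS_CHARS[ch])
            for i, ch in enumerate(ocr_text)
            if ch in SUSPICIOUS_CHARS
        ]

    print(suspicious_details("JOHN O\u2019BRIEN, 123 MAIN ST"))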

“If somebody just gives me a bucket of data and most of it is correct, most of it is real, how do you find that needle in the haystack?” he said. “Our goal is anomaly detection – to find things that are out of place.”
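The quote frames the problem as anomaly detection. A common baseline for that framing, not necessarily the team's method, fits an Isolation Forest to feature vectors that are mostly genuine and flags the items it scores as out of place; the features below are random stand-ins rather than anything derived from real ID images.

    # Rough sketch of the anomaly-detection framing: fit an Isolation Forest to
    # feature vectors where most items are genuine, then flag the ones the model
    # scores as out of place. The features are random stand-ins; a real system
    # would use descriptors extracted from the ID images themselves.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    genuine = rng.normal(loc=0.0, scale=1.0, size=(500, 8))   # the "haystack"
    forged = rng.normal(loc=4.0, scale=1.0, size=(5, 8))      # a few "needles"
    features = np.vstack([genuine, forged])

    detector = IsolationForest(contamination=0.01, random_state=0).fit(features)
    flags = detector.predict(features)          # -1 marks suspected anomalies
    print("flagged indices:", np.where(flags == -1)[0])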

For more information, see the Cornell Chronicle story on this research.

Cornell University has dedicated television and audio studios available for media interviews supporting full HD, ISDN and web-based platforms.

-30-

 
