Newswise — Somewhere, right now, an Internet troll — one of those nefarious propagators of disinformation and doubt — is working up a real corker of a rumor to inflame President Donald Trump’s detractors.

At the same time, somewhere, another troll is cooking up a hot batch of nonsense to be spread among the President’s supporters.

The thing is, those “somewheres” are likely within high-fiving distance of each other. The two trolls may even be the very same person.

“If you really study Internet trolls, many of them are people working in the same room,” said Samer Al-khateeb, PhD, an assistant professor of Computer Science and Informatics in the Creighton University College of Arts and Sciences Department of Journalism, Media, and Computing. “They might be sitting in the same room, creating memes, cartoons, and fabricated images that will aggregate shares and followers and spread chaos.”

An expert in online behavior and social media networks, Al-khateeb is in his first semester on the Creighton faculty and working on a book on social cyber-forensics, a field in which he’s been active for several years dating back to his graduate school days at the University of Arkansas at Little Rock.

While at UALR, Al-khateeb helped start the Collaboratorium for Social Media and Online Behavioral Studies (COSMOS), an academic group that now numbers upwards of 30 undergraduate and graduate students. COSMOS is helping governmental agencies better understand the connections among social media, fake news and malicious hacking, all of which figure prominently in an increasingly interconnected digital world.

One phenomenon Al-khateeb has studied is the online flash mob that targets an agency’s or company’s social media accounts and can disrupt its service. In 2015, hackers claiming affiliation with the Islamic State group took over U.S. Central Command’s Twitter account, an attack that began with an influx of tweets directed at the account.

“It used to be that a flash mob was fun and done in physical space,” Al-khateeb said. “Now, people gather online and do something harmful. With Central Command, they were able to put that page down for two to three hours, and hackers were able to get into a government, military Twitter account and to promote heinous propaganda.”

Even in a brief three years, though, much has changed with the scope, nature and perpetrators of such attacks, Al-khateeb said. First, the agents involved in such attacks need not be human. Bots — automated social media accounts run by software — can now create thousands of social media posts, obviating the need for human coordination to spread propaganda across multiple platforms.
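Researchers look for machine-like signatures in that flood of posts. As a minimal, purely illustrative sketch — not a description of Al-khateeb’s or COSMOS’s actual tooling, and with the data, account names and thresholds invented for the example — one simple heuristic flags accounts that post identical text at rates no human could sustain:

```python
from collections import Counter
from datetime import datetime

# Hypothetical post records: (account, timestamp, text). Made up for illustration.
posts = [
    ("user_0001", datetime(2018, 10, 1, 12, 0, 0), "Share before this gets deleted!"),
    ("user_0001", datetime(2018, 10, 1, 12, 0, 30), "Share before this gets deleted!"),
    ("user_0001", datetime(2018, 10, 1, 12, 1, 0), "Share before this gets deleted!"),
    ("user_0002", datetime(2018, 10, 1, 9, 15, 0), "Anyone else see the game last night?"),
]

def looks_automated(account: str, max_posts_per_minute: float = 1.0) -> bool:
    """Flag an account that posts faster than a person plausibly could,
    or that repeats the same text almost every time."""
    own = sorted((t, text) for a, t, text in posts if a == account)
    if len(own) < 2:
        return False
    span_minutes = (own[-1][0] - own[0][0]).total_seconds() / 60 or 1e-9
    rate = len(own) / span_minutes
    top_repeat = Counter(text for _, text in own).most_common(1)[0][1]
    return rate > max_posts_per_minute or top_repeat / len(own) > 0.8

for account in sorted({a for a, _, _ in posts}):
    print(account, "->", "bot-like" if looks_automated(account) else "probably human")
```

Real bot-detection systems weigh many such signals at once — posting cadence, content similarity, account age, follower patterns — but the basic intuition is the same: automation leaves statistical fingerprints.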

At COSMOS, Al-khateeb has been working on ways to link social media accounts across many platforms and to show that a number of disinformation campaigns and fake-news pushes originate from the same or closely connected sources.
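One simple way to picture that kind of cross-platform linking — offered here as a hedged sketch, not as COSMOS’s actual method, with the observations and account names invented for the example — is to connect accounts that push the identical message on different platforms and then look at the clusters that emerge:

```python
from collections import defaultdict
from itertools import combinations

import networkx as nx

# Hypothetical observations: (platform, account, message text). Made up for illustration.
observations = [
    ("twitter",  "@storm_patriot",  "EXPOSED: the photo they deleted"),
    ("facebook", "TruthNow Page",   "EXPOSED: the photo they deleted"),
    ("youtube",  "RealNewsChannel", "EXPOSED: the photo they deleted"),
    ("twitter",  "@casual_user",    "Beautiful sunset tonight"),
]

# Group accounts by the exact message they pushed.
by_message = defaultdict(set)
for platform, account, text in observations:
    by_message[text].add((platform, account))

# Build a graph: link any two accounts that shared the same message.
g = nx.Graph()
for accounts in by_message.values():
    g.add_nodes_from(accounts)
    for a, b in combinations(accounts, 2):
        g.add_edge(a, b)

# Connected components with more than one account are candidate coordinated clusters.
for component in nx.connected_components(g):
    if len(component) > 1:
        print("Possible common origin:", sorted(component))
```

In practice, analysts would also consider near-duplicate text, shared links, posting times and follower overlap before calling a cluster coordinated, but the graph view captures the core idea of tracing many accounts back to one source.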

This fall, Al-khateeb published two papers at the International Conference on Advances in Social Networks Analysis and Mining (ASONAM). One looks at the role of bots in spreading news about natural disasters; the other examines the use of YouTube in large-scale crowd manipulation. He has also written a book chapter, published by the Army University Press, on the evolution of social bots during large-scale combat operations.

He said that as hackers and Internet trolls become savvier in their ways of spreading online chaos, the efforts to track them are also becoming more sophisticated. But in many cases, Al-khateeb said, a crucial part of the process is the attentiveness of social media consumers themselves, who must thoroughly examine and judge what they see and read online.

“There is a danger in people actually believing in some of these things,” he said. “We need to raise more awareness, more education about these issues. Not everything you see is real, so the effort on our end also includes vigilance. One way to mitigate the risk of such acts is by creating more tools that can help the government and the society in general, but we’re always a work in progress. The best tool for risk mitigation is to have some skepticism.”