Facebook bots to combat bad behavior

[July 24, 2020: TechXplore]

As the world's largest social network, Facebook provides endless hours of discussion, entertainment, news, videos and just good times for its more than 2.6 billion users.

It's also ripe for malicious activity, bot assaults, scams and hate speech.

In an effort to combat bad behavior, Facebook has deployed an army of bots in a simulated version of the social network to study their behavior and track how they devolve into antisocial activity.

Drawing principles from machine learning, artificial intelligence, game theory and multiagent systems, Facebook engineers developed the program—Web-Enabled Simulation (WES)—a highly realistic, large-scale replica of Facebook.

They hope to remedy the explosive growth of online harassment, especially in this era of political misinformation, crackpot conspiracy theories and hate speech.

WES bots are trained to interact with one another, sending messages, commenting on posts and making friend requests. They cannot interact with actual users.
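Conceptually, the constraint works like a walled-off graph that contains only bot accounts. The toy sketch below, with entirely hypothetical class and method names rather than Facebook's actual WES code, illustrates the idea: bots message, friend and comment, but the only objects they can act on live inside the simulation.

```python
import random

# Illustrative sketch of a sandboxed bot simulation; names are assumptions,
# not Facebook's WES API.

class SimulatedUser:
    """A bot account that exists only in the simulated graph."""
    def __init__(self, name):
        self.name = name
        self.friends = set()
        self.inbox = []

class SimulatedNetwork:
    """Holds bot accounts only; real user accounts are never added."""
    def __init__(self):
        self.users = {}
        self.posts = []

    def add_bot(self, name):
        self.users[name] = SimulatedUser(name)

    def send_message(self, sender, recipient, text):
        self.users[recipient].inbox.append((sender, text))

    def friend_request(self, sender, recipient):
        self.users[sender].friends.add(recipient)
        self.users[recipient].friends.add(sender)

    def comment(self, author, text):
        self.posts.append((author, text))

def step(network):
    """One simulation step: each bot takes a random action toward another bot."""
    names = list(network.users)
    for name in names:
        other = random.choice([n for n in names if n != name])
        action = random.choice(["message", "friend", "comment"])
        if action == "message":
            network.send_message(name, other, "hello")
        elif action == "friend":
            network.friend_request(name, other)
        else:
            network.comment(name, "nice post")

if __name__ == "__main__":
    net = SimulatedNetwork()
    for i in range(5):
        net.add_bot(f"bot{i}")
    for _ in range(10):
        step(net)
    print(len(net.posts), "comments made by bots")
```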

Mark Harman, the lead Facebook research scientist who posted a summary of the WES effort on a blog Thursday, explained, "The WES approach can automatically explore complicated scenarios in a simulated environment. While the project is in a research-only stage at the moment, the hope is that one day it will help us improve our services and spot potential reliability or integrity issues before they affect real people using the platform."

Millions of bots with differing objectives can be deployed in the experimental system. Some, for instance, will attempt to purchase items that are not permitted on the site, such as guns or drugs. Researchers will track the patterns the bots follow as they conduct searches, visit pages and replicate actions that humans might take.

They will then assess various countermeasures to see which most effectively stop or even prevent the undesirable behaviors.
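One way to picture that assessment, purely as an illustration and not Facebook's actual method, is an A/B-style trial inside the simulator: run a population of bots with and without a candidate countermeasure and compare how often the unwanted behavior succeeds. The blocking rule and the probabilities below are invented for the sketch.

```python
import random

# Hypothetical countermeasure trial: block searches for prohibited items
# and measure how the violation rate changes. All numbers are made up.

PROHIBITED = {"guns", "drugs"}

def bad_actor_attempt(countermeasure_on, rng):
    """One bot tries a search; return True if a prohibited purchase succeeds."""
    query = rng.choice(["guns", "drugs", "shoes", "books"])
    if query in PROHIBITED:
        if countermeasure_on:
            return rng.random() > 0.9   # most prohibited searches get blocked
        return rng.random() > 0.2       # without it, most attempts slip through
    return False                        # benign queries are never violations

def run_trial(countermeasure_on, n_bots=10_000, seed=0):
    rng = random.Random(seed)
    violations = sum(bad_actor_attempt(countermeasure_on, rng) for _ in range(n_bots))
    return violations / n_bots

if __name__ == "__main__":
    baseline = run_trial(countermeasure_on=False)
    treated = run_trial(countermeasure_on=True)
    print(f"violation rate without countermeasure: {baseline:.3f}")
    print(f"violation rate with countermeasure:    {treated:.3f}")
```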

Harman compared their approach to that of traffic engineers seeking ways to make roadways safer. To curb speeders, for instance, city planners may install more stop signs. If that measure is insufficient, speed bumps may be installed. With Facebook behaviors, countermeasures could include limiting the frequency of comments on posts or applying fact-checking to questionable conspiracy posts.




Joseph Shavit
Head Science News Writer | Communicating Innovation & Discovery

Based in Los Angeles, Joseph Shavit is an accomplished science journalist, head science news writer and co-founder at The Brighter Side of News, where he translates cutting-edge discoveries into compelling stories for a broad audience. With a strong background spanning science, business, product management, media leadership, and entrepreneurship, Joseph brings a unique perspective to science communication. His expertise allows him to uncover the intersection of technological advancements and market potential, shedding light on how groundbreaking research evolves into transformative products and industries.