Wiki sites and other contributor-based web pages may see reduced user participation after introducing automated bots into their workflow, according to a new paper by Virginia Commonwealth University researchers — apparently the first study that has used rigorous econometric methods to track the connections between bot implementation and subsequent user activity.
But some of the negative impact of bots can be mitigated if a platform has a larger base of active users, or if a site's pages cover specific, focused topics as opposed to a broad range of subjects. The findings, detailed in a study published May 25 in Decision Support Systems, could help site managers find more effective ways to develop and deploy bots without losing valuable contributors.
"The bot is introduced in a way [intended] to help people, but the platform managers should be cautious in implementing such a bot agent because of these unintended consequences," Yeongin Kim, an assistant professor at VCU's School of Business and the paper's third author, told The Academic Times. "[Managers] have to understand what is actually happening after the bot intervention and be more cautious in utilizing this kind of tool."
The VCU researchers examined the introduction of bots into the popular instructional database wikiHow, using open-access data from 2005 to 2017.
The first bot they studied, known as Votebot, allows users to rate which information in an article is most relevant. The bot then reorganizes a particular page to reflect users' evaluations, with the most important information appearing at the top. The second bot, called Stubbot, flags incomplete or problematic articles so that collaborators can prioritize editing them.
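In functional terms, the two behaviors described above could be sketched roughly as follows. This is an illustrative Python sketch only — the function names, data shapes and the word-count threshold are assumptions for demonstration, not the actual wikiHow bot implementations:

```python
# Illustrative sketch of the two bot behaviors described in the study.
# Names and thresholds here are hypothetical, not wikiHow's actual code.

def reorder_by_votes(sections):
    """Votebot-style behavior: sort an article's sections so the
    content users rated most relevant appears at the top."""
    return sorted(sections, key=lambda s: s["votes"], reverse=True)

def flag_stubs(articles, min_words=100):
    """Stubbot-style behavior: flag short or incomplete articles
    so collaborators can prioritize editing them. The 100-word
    cutoff is an arbitrary assumption for this sketch."""
    return [a["title"] for a in articles if a["word_count"] < min_words]

# Example article with user vote tallies per section
article = [
    {"heading": "Tips", "votes": 3},
    {"heading": "Steps", "votes": 9},
    {"heading": "Warnings", "votes": 1},
]
print([s["heading"] for s in reorder_by_votes(article)])
# ['Steps', 'Tips', 'Warnings']

# Example pages of varying completeness
pages = [
    {"title": "How to Whistle", "word_count": 40},
    {"title": "How to Bake Bread", "word_count": 850},
]
print(flag_stubs(pages))
# ['How to Whistle']
```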
The implementation of Votebot reduced the number of unique users by over 10%, while Stubbot's introduction led to around an 8% reduction in user participation.
The underlying mechanisms that contributed to this decline are likely multifaceted, but the researchers believe one possibility is that bots can impede the natural exchange of ideas that takes place in online communities.
On user-generated websites, contributors often learn about rules, moderation and project priorities from other users. That interaction, in and of itself, can help build a stronger, more active community, while the presence of a bot may interrupt those communication channels, the researchers explained. Volunteers may also be distrustful of bots — viewing them as a sort of enforcement tool that site managers could potentially use to limit or shape activities on a platform.
"We even asked the question: Is it possible that the bots actually take away the users' jobs?" Kim said.
Although the researchers were unable to formally test the theory, they said bots may have taken over some of the tasks previously handled by human contributors, leaving those contributors less motivated to complete other projects on the site.
But the solution is not to abandon bots altogether, Xiaojin Liu, an assistant professor of supply chain management and analytics at VCU and the paper's second author, told The Academic Times. Human error can lead to a number of disastrous consequences: Collaborators may upload images that violate copyright laws, or they may plagiarize material from other websites. But that's just the start.
"Even worse, there are some users that try to come to the dark side and just delete everything," Liu said. "So we definitely need a bot to say, 'Woah, okay, there's some bad behavior. I detect it.'"
The sheer number of articles available on contributor-supported sites would also make bot abolition an impractical solution. User-generated sites have proliferated on the web: Wikipedia and Reddit — two sites that rely on contributor content as well as bot assistance — are the 13th and 19th most-viewed sites on the web, respectively, according to Alexa. In May 2021 alone, Wikipedia logged 26 million user edits and 23 billion page views. Bots are such an integral part of its ecosystem that the site has created its own bot policy to address issues related to automated activity on its pages.
Niche sites and spin-offs have taken root online, too — "wikiHow in process knowledge, Scholarpedia in academic research, and Clinfowiki in healthcare," the researchers wrote.
Seeing bot implementation as an inevitability, the VCU researchers offered several solutions to help ease the transition. Site managers could create incentives for veteran users to encourage newer collaborators to remain active on the site. Another option is to make bots act more like humans so that they are less likely to compromise or stifle the work of real users, according to Kim. And because a larger user base helps alleviate some of the demotivation observed after bot implementation, site managers could also tap a more international pool of contributors, bringing creators and moderators from new regions and backgrounds into the fold, Liu added.
The researchers said that sites such as Wikipedia have already pioneered several promising policies, such as allowing certain users to provide their feedback about new bots before they are fully implemented on a site.
"Even though the bot is introduced by the platform manager, I think it's a good idea to show the bot to a specific group of users and then get their agreement," Kim said. "This is an important process in order to reduce the negative consequences."
The study "Can bots help create knowledge? The effects of bot intervention in open collaboration," published May 25 in Decision Support Systems, was authored by Seonjun Kang, Xiaojin (Jim) Liu, Yeongin Kim and Victoria Yoon, Virginia Commonwealth University.