The Bot Multiple: Unpacking the Materialities of Automated Software Agents


Talk at the Annual Meeting of the Society for Social Studies of Science (4S), Denver, CO

This paper examines the roles that automated software agents (or bots) play in the governance and moderation of Wikipedia, Twitter, and reddit – three online platforms that uphold, in different ways, a related set of commitments to ‘open’ and ‘public’ online participation. While bots are often discussed as malicious or fake agents (e.g. ‘socialbots’), the bots I discuss on these three platforms are more or less legitimate social actors, delegated substantial authority to autonomously enforce norms and policies. These bots extend and modify the functionality of sites like Wikipedia, Twitter, and reddit, and are generally developed and deployed by volunteers on their own time, operating continuously on computers independent of the servers hosting the site. These governance bots involve alternative relations of power and code, requiring that we go beyond studying software code in order to unpack the sociomaterial configurations at work in such digitally architected spaces. Instead of taking for granted the pre-existing stability of these sites as unified platforms, bots require that we examine the concrete, historically contingent material conditions under which their code is run. Reporting from a multi-sited ethnography of infrastructure, I demonstrate several ways in which bot development comes on the scene in relation to broader assemblages of server farms, platform code, federated databases, code repositories, issue trackers, application programming interfaces, terms of service, mailing lists, counterpublic groups, and a variety of other entities. I argue that bots offer a compelling set of cases for exploring the multiple materialities at work in highly distributed online spaces.