In June 2020, the Stanford Internet Observatory (SIO) came across communities of accounts involved in role-playing games (RPGs) that used stolen images of children (defined here as under 18 years of age). After a months-long investigation, SIO identified four distinct RPG communities playing on Twitter and Instagram in Portuguese and English, which contained hundreds of stolen images of children.
When playing RPGs, users assume a fictional identity, often in a make-believe world. The specific role-playing game we identified was a segment of a larger role-playing community that appears to use shared terminology and rules. Different sub-RPGs role-play as pop bands, Instagram influencers, hamsters, cafeterias, travel agencies and much more. The role-play game using images of real children was played through accounts on Twitter and Instagram, in both English and Portuguese. The accounts’ activities were mostly innocuous; accounts used photos of their child character in their profile photos, posted about their virtual lives (their ages, genders, and likes and dislikes), and interacted with other characters along the way.
Though the rules of gameplay take shape differently on each platform, the accounts we identified all use images of actual children. These are most often stolen from ‘public-facing’ sources, such as online advertisements that use child models, but for some, the exact provenance is unclear. Similar activity was documented in 2014, when the tech-centric media outlet Fast Company published an investigation into an Instagram-based community of users role-playing with stolen baby pictures. That investigation centered on one parent who came across an account using an image of her baby lifted from her family blog. After Fast Company published its investigation, Instagram clarified that such content violates its Terms of Service. Fast Company’s description of the RPG activity six years ago is similar to much of the activity SIO recently identified. The endurance of this phenomenon, on Instagram as well as on Twitter, indicates that it is a perennial avenue of online harm that platforms should learn to mitigate.
As more children come online — whether as users role-playing on social media, through their own accounts, or through images posted by their parents — platforms must take greater steps to protect their privacy and safety. In this investigation, we advocate that platforms take a more proactive approach to the content associated with RPG accounts that use images of real children by implementing specific policies to address images of children, users under 18, and RPGs.
To analyze the behavior of the RPG accounts using images of real children, SIO created a list of 12 child characters on Twitter and 86 ‘orphanage’ accounts — pages that post images of a child character that other users can ‘adopt.’ The Twitter accounts connected to a larger community of 13,298 accounts that followed the child characters. These accounts existed across four distinct communities: English-speakers playing on Twitter, English-speakers playing on Instagram, Portuguese-speakers playing on Twitter, and Portuguese-speakers playing on Instagram. While we did not find evidence of membership overlap between these communities, accounts in the four communities behaved and interacted in similar ways.
SIO identified sexualized content referencing child characters in two communities, including sexualized references to photographs of children and screenshots of sexual role-play between a child character and an adoptive parent character.
Users frequently employ techniques like character substitution (e.g., däddy for daddy) and conduct much of their role-play over direct messages, making their behavior harder for platforms and outside observers alike to monitor.
Both the content and age of the users behind the role-playing accounts make these communities vulnerable to potential harm. Some users in the child role-playing communities are themselves children. Additionally, users occasionally complained about the presence of pedophiles in the Twitter-Brazil role-playing community, which indicates users under 18 may be at risk of grooming.
SIO found that recommendation algorithms implemented by the social media platforms direct users to these communities.
SIO first came across a single user employing unconventional Portuguese language choices and character substitutions, and soon identified this user as part of a larger community of child RPG characters. Many of the users in this community role-play as orphans seeking adoption, inviting other users interested in being an “uncle,” “aunt,” or “friend” to direct message (DM) them. Additionally, some communities contain accounts that role-play as “orphanages” themselves, posting images of children seeking adoption and asking other users to reach out to the orphanage via direct message if they want to adopt.
For the purposes of this investigation, we focused solely on the phenomenon of RPGs that involve platform users role-playing as children with images of real children. Our investigation examines four communities. The first two are composed of users from Brazil, on Twitter and Instagram; these RPG communities somewhat overlapped with the (very large) Brazilian K-pop Twitter community, demonstrating how porous the Brazilian communities can be. The other two are English-language communities. Mixed region-specific spelling patterns suggest that the English communities were not tied to a single country.
Table 1 outlines details about each community. In general, the oldest accounts were concentrated in the English communities, while the Brazilian communities consisted of more recently created accounts. While many of the accounts in the English communities, such as those on Instagram, may no longer be active, they are still searchable on the platform.
From examining the communities above, five main themes emerged:
Image Theft: Most images used in these communities are stolen from other sources.
In order to build their characters, users habitually relied on stolen images. These images spanned all age groups, but we focused our investigation on those that depicted children.
The images themselves were taken from different sources. Some were lower quality and unstaged, which may indicate that they were lifted from the accounts of parents without their knowledge, as in the case Fast Company documented in 2014. However, we cannot fully confirm this, as we were unable to identify the original copyright holders through reverse image searches. Most images were taken from ‘public-facing’ children — either models, children of influencers, or child influencers themselves. In one case, an image from a child influencer with 364,000 Instagram followers was used for role-play in the Instagram-English community. In each of the communities, many of the images also appeared on Pinterest boards that consisted of hundreds of baby pictures, which were themselves sourced from other social media platforms such as Wattpad. As an image passes through many different hands and platforms, the ownership of the original image becomes untraceable.
Sexualization Of Children: Some accounts in the Instagram-English and Twitter-Brazil communities sexualize child RPG characters.
While the majority of accounts involved in the RPG communities are non-sexual, we found a notable number of comments that sexualized child RPG characters in both the Twitter-Brazil and Instagram-English communities.
In the Twitter-Brazil community, some users were approached by people interested in engaging in sexualized role-play via direct message, while others complained that people role-playing as children were attracting “more pedophiles than were already present” in the community. Some Twitter-Brazil users’ timelines contain sexualized comments about the child characters as well as screenshots of DMs role-playing sexual activity with child characters (these screenshots featured only text; we did not come across any images sent via DM). Meanwhile, in the Instagram-English community, we found a few accounts, mostly from 2012 and 2013, that role-played in a sexualized fashion in the comments of a post. One account used sexualized RPG hashtags on a few of their posts.
Obfuscation: Characters in every community habitually use DMs for role-play and unconventional spelling, which can make detecting these communities difficult.
Every community we identified used obfuscation techniques as part of conventional gameplay. Obfuscation makes it difficult for either an individual social media platform or independent researchers to identify and track the online harm present in these communities.
Both the Twitter-Brazil and Instagram-English communities used unconventional spelling substitutions, which can make detection more difficult for platforms and users alike. For example, Instagram users used the “#däddyrp” hashtag as a substitute for #daddyrp. Both of these hashtags have since been made unsearchable on Instagram, hiding thousands of posts from search. In the Twitter-Brazil community, many users habitually substitute non-ASCII Unicode characters for standard Latin letters (e.g., α for a), a practice that appears to have developed in response to enforcement issues triggered by phrases like “tenho 9 anos” (“I am 9 years old”), which are problematic content on a platform that bans users younger than 13.
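Substitutions of this kind are largely reversible with standard Unicode processing. As a rough sketch (the homoglyph mapping below is illustrative, not any platform’s actual detection logic), text could be canonicalized before being matched against banned hashtags or phrases:

```python
import unicodedata

# Minimal homoglyph map for a few common Latin look-alikes. This is an
# illustrative assumption; a real system would draw on the full Unicode
# confusables data from UTS #39.
CONFUSABLES = {"α": "a", "е": "e", "о": "o"}

def canonicalize(text: str) -> str:
    """Reduce common spelling-substitution obfuscation to plain Latin text."""
    # Map known homoglyphs first: NFKD decomposition does not fold
    # distinct letters such as Greek α into Latin a.
    text = "".join(CONFUSABLES.get(ch, ch) for ch in text)
    # Decompose accented characters (ä -> a + combining diaeresis)...
    decomposed = unicodedata.normalize("NFKD", text)
    # ...then drop the combining marks, leaving only the base letters.
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

# canonicalize("#däddyrp") -> "#daddyrp"
# canonicalize("tenho 9 αnos") -> "tenho 9 anos"
```

A pipeline along these lines would let a single blocklist entry cover many obfuscated spellings, though determined users can always invent substitutions outside any fixed mapping.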
In all observed communities, most actual gameplay centers around direct messaging, with users role-playing as their online characters in private chats. In the Twitter-Brazil community, users role-playing as children solicited DMs from potential adopters, “uncles”/“aunts,” or other users. Many accounts complained that they were DM-blocked for excessive messaging, indicating that these users were sending and receiving messages in extremely high volumes. Additionally, some accounts run an application process for new users through WhatsApp, and others use WhatsApp and Telegram as the primary mode of communication. This evidence indicates that much of RPG gameplay occurs outside of the public eye, whether via platform DM or on another platform entirely, which decreases the likelihood that observers can report harmful content.
Account Users: Some of the users role-playing in each community seem to be underage themselves.
Some of the posts suggest that users behind these accounts are themselves children — for example, one user from the Instagram-English community posted in the comments that they were 14 years old. Another posted in the description of their orphanage role-play account that they have school from 9 a.m. to 4 p.m. and therefore won’t be active on the platform during those hours. The context and tone of these conversations strongly suggested that the accounts were ‘out of character’ and these users were talking about their real lives.
The age of the RPG users may make this community more vulnerable to unwanted or inappropriate solicitations. In a recent Fast Company article about sexualized gameplay on the gaming platform Roblox, experts noted that the combination of sexualized play and child users may be attractive to pedophiles interested in grooming children online. SIO also identified the combination of sexualized role-play and users under 18 in this investigation, which may leave children in these communities similarly vulnerable to grooming. The in-community complaints about pedophiles within the Twitter-Brazil dataset as well as sexualized comments on photographs of children indicate that this is already an issue recognized within the RPG community itself.
Recommendation Algorithms: Platforms implement recommendation algorithms that direct users to more RPG accounts with real child images.
We found that in some cases, the platform recommendation algorithms on Twitter and Instagram increased the ease of finding accounts that use stolen child images. On Instagram, after we searched for orphanage accounts in the Portuguese-speaking community, recommendations for other RPG orphanages showed up at the bottom of the page. Additionally, while we looked at the profiles of child characters, Twitter recommended new profiles to follow, most of which were also child characters.
In both the Twitter and Instagram RPG communities, users raised concern about the content they came across (and explicitly about potential predators in the case of the Twitter-Brazil community). On Instagram, users seemingly from outside the RPG community claimed to have reported the accounts and vocalized their outrage at the content in the posts’ comments. While it is good that users are reporting content themselves, social media platforms must do more in response to the harms illustrated above, including developing and enforcing policy around online role-playing activity involving images of children.
Because children cannot meaningfully consent to having their images shared online, and because photographs of children are uniquely vulnerable to abuse, social media companies face distinct challenges in allowing users to post images of children on their platforms. These photographs have many legitimate uses, such as when posted by child actors, models or influencers, or family bloggers who make their living through social media. However, companies must balance the needs of these users with the online safety risks posed by images of children. The images used in the RPG communities were seemingly taken without meaningful consent from the children depicted, their guardians, or the copyright owners. Children and their parents may be disturbed by the inauthentic use of their imagery; even when the role-play is innocuous, the theft of imagery can cause deep harm.
While an outright ban of photographs of children might be an overly austere measure, platforms could consider banning the posting of photographs by non-copyright owners and the use of photographs of children in role-playing activity.
Because users in role-play communities by definition engage in inauthentic behavior, often using photographs they do not own the copyright to, social media companies must carefully consider how RPGs fit into their content policies. While an outright ban of RPGs may not be appropriate, platforms may consider banning the use of copyrighted photographs in role-playing, role-playing as underage characters, role-playing as real people, and sexualized role-play. Platforms should also be sensitive to the risk of predators grooming children via role-play direct messaging, so DM-based role-play may merit additional scrutiny.
The use of multiple platforms in online RPGs increases the difficulty of monitoring and mitigating any harmful content present in these communities. Platforms may consider engaging in information-sharing with each other in order to increase their ability to detect these communities.
Age of Users
Though the users operating the Instagram and Twitter accounts may be older than 13 and thus allowed to have an account, the RPG activity we came across raises the policy question of whether users should be allowed to role-play characters depicted as younger than the minimum age required to hold an account on the platform.
Additionally, because a notable share of the RPG images were taken from child influencers who have sizable followings on platforms such as Instagram and TikTok, this activity raises a further question: should child “influencer” accounts, run by parents or guardians who meet the platforms’ age requirements, be allowed when the children depicted are themselves under those age restrictions? If child influencers are allowed to have accounts, we recommend additional privacy measures or methods to track down stolen imagery.
The platforms’ recommendation algorithms surrounding child imagery should be closely evaluated. We propose that these algorithms not recommend RPG accounts that use images of real children. Platforms could also consider allowing only certain types of content to be recommended to users.
In terms of policy enforcement, role-play activity using stolen images of real children violates Twitter’s and Instagram’s copyright policies. Additionally, because some of the commentary was sexualized, we believe Twitter’s prohibition on “sexualized commentaries about or directed at a known or unknown minor” and Facebook’s prohibition on “content (including photos, videos, real-world art, digital content, and text) that depicts any sexual activity involving minors” should apply to sexualized role-playing content with real child images. In order to ensure that content on a platform conforms with platform policies, platforms need to take a more proactive approach to disallowed content.
Fake child characters built from real images represent one element of the larger issue of online child imagery. This case is additionally difficult because RPG content may not present an immediate threat or harm the way more egregious cases of online child imagery do. However, this investigation across two platforms, two languages, and multiple years of user activity demonstrates that this type of content will continue if platforms do not act. Platforms need to develop and enforce clear policies around uses of child imagery that fall outside of Child Sexual Abuse Material and other existing protections. Given that some of the children in these images are under the age requirements for accounts on the platforms, the policy considerations around child imagery merit special attention.