Instagram’s recommendation algorithms connected and even promoted a “vast pedophile network” whose accounts advertised the sale of illicit “child-sex material” on the platform, according to the findings of an alarming report published Wednesday.
Instagram allowed users to search by hashtags related to child-sex abuse, including graphic terms such as #pedowhore, #preteensex, #pedobait and #mnsfw — the latter an acronym meaning “minors not safe for work,” researchers at Stanford University and the University of Massachusetts Amherst told the Wall Street Journal.
The hashtags directed users to accounts that purportedly offered to sell pedophilic materials via “menus” of content, including videos of children harming themselves or committing acts of bestiality, the researchers said.
Some accounts allowed buyers to “commission specific acts” or arrange “meet ups,” the Journal said.
Sarah Adams, a Canadian social media influencer and activist who calls out online child exploitation, told the Journal how she was caught up in Instagram’s recommendation algorithm.
Adams said that in February, one of her followers flagged a distressing Instagram account called “incest toddlers,” which featured an array of “pro-incest memes.” The mother of two said she interacted with the page only long enough to report it to Instagram.
After the brief interaction, Adams said she learned from concerned followers that Instagram had begun recommending the “incest toddlers” account to users who visited her page.
Meta confirmed to the Journal that the “incest toddlers” account violated its policies.
When reached for comment by The Post, a spokesperson for Instagram’s parent company, Meta, said the company has since restricted the use of “thousands of additional search terms and hashtags on Instagram.”
Source: nypost.com