The Technology 202: Researchers say it’s easy to find drugs on Facebook, Instagram and YouTube
By Cat Zakrzewski, March 15, 2021 at 9:28 a.m. EDT
with Aaron Schaffer
Accounts peddling steroids, opioids and other drugs remain easy to find on major social networks, casting doubt on the companies’ ability to enforce their own policies against such illegal sales.
A new report published today by the Digital Citizens Alliance, a consumer watchdog group, details how Facebook pages, Instagram accounts and YouTube videos are promoting drug sales, in some instances to thousands of followers or viewers. The Digital Citizens Alliance, researchers and other advocacy groups have warned the tech companies of such sales for several years, and they say the persistent appearance of such posts raises questions about how serious tech companies are about cracking down on apparent criminal activity.
“If we can find it, they can find it. That begs the question how hard they’re looking,” Digital Citizens Alliance Executive Director Tom Galvin said in an interview.
The researchers also uncovered that similar accounts were peddling what they claimed to be coronavirus vaccines or testing kits. Law enforcement has said it would be fraudulent for individuals to sell vaccines, which are being distributed in authorized medical centers – and the report’s authors raised concerns that potentially fake vaccines or test kits could give people a false sense of security about their protection from the virus. Facebook, Twitter, Google and others have been particularly aggressive in promising to stamp out content taking advantage of the public health crisis, and the findings raise questions about the enforcement of these policies.
The findings raise questions about how much Silicon Valley has focused on rooting out drug sales.
Two years ago, social networks faced intense scrutiny over their handling of opioid sales, amid an epidemic that kills tens of thousands of Americans each year. But consumer advocates worry the companies are not dedicating the same time, staff and resources to the issue now that they face more intense public criticism over other content moderation problems, especially political disinformation and falsehoods related to the pandemic.
“The platforms continue to treat this as a PR problem instead of an Internet safety issue,” Galvin said. “When we put out things, they take them down and say all is fixed. But it’s whack-a-mole, and they’re up again the next week.”
Eric Feinberg, the vice president of content moderation at the Coalition for a Safer Web, who contributed research to the report, said he compiled a list of 40 accounts selling steroids across major social networks in October 2019. As of March 7, 2021, the companies had detected and removed only 10 of them.
Facebook, which also owns Instagram, and YouTube removed some of the accounts identified in the report, as well as accounts Feinberg flagged separately to The Technology 202. “We removed the violating content identified in this report since our policies make it clear that buying, selling or trading drugs, including steroids, is not allowed anywhere on Facebook or Instagram,” company spokesman Andy Stone said.
Facebook invests in removing drug-related content. A recent company transparency report found that as of December, its A.I. systems proactively identified 96 percent of the drug-related posts it removed before users reported them. That’s up about two percentage points from the previous quarter, and could help the company more quickly remove or limit drug sales on the platform.
Yet Feinberg also raised concerns about how social networks’ algorithms pushed drug-related content directly to him. Once he began following and searching for hashtags related to opioids, Instagram began to recommend he follow accounts such as Oxycodone626 and Xanax Raw that were apparently selling drugs.
Accounts selling drugs also began following him in large numbers. He told me he now has 160 accounts pushing drug sales on Instagram following him, including some with more than 1,000 followers. “I’m not searching it, it’s searching me,” Feinberg said.
The findings come as Washington lawmakers increasingly weigh how to rein in social media giants.
Earlier this year, Democrats including Sen. Mark Warner (D-Va.) introduced the Safe Tech Act, which would make it easier for Web users to seek court orders and file lawsuits if posts, photos and videos — and the tech industry’s failure to address them — threaten them personally with abuse, discrimination, harassment or the loss of life, as my colleague Tony Romm has reported. There’s a broader debate in Washington about whether it’s time to overhaul Section 230, the decades-old legal shield that protects social networks from lawsuits over content on their sites. Feinberg says he thinks the law needs to be updated so that companies face greater pressure to address illegal content.
The report also uncovered that advertising from major brands — ranging from Disney to Toyota — appears alongside the illicit posts.
Advertisers — who are the key revenue drivers of social networks such as Facebook and YouTube — have played a pivotal role in forcing the platforms to take more aggressive action on other forms of harmful content. Last year, a group of major companies led an advertising boycott of Facebook and Instagram after civil rights advocates raised concerns about the companies’ handling of hate speech and disinformation, particularly targeting minorities ahead of the 2020 elections.
The DCA says the findings underscore the need for platforms to collaborate more effectively to address criminal behavior online.
Galvin said that for three years, the DCA has been calling on the large tech companies to work together to root out accounts selling drugs online, because it’s apparent that some of the actors operate across multiple platforms. Tech companies already work together to root out terrorist content and child pornography, and they have at times participated in threat-sharing sessions with law enforcement around U.S. elections.
Galvin thinks it’s time for them to apply similar tactics to address this problem.
“There has to be something more institutional, structural done, hopefully among all the platforms to work together to identify bad actors,” Galvin said. “Then they have to share that information so [bad actors] don’t just go to another platform, like walking into another neighborhood, or come back to that neighborhood. They have to be able to identify them.”
Our top tabs
Millions of videos are inaccessible to deaf or blind Americans as short-form video apps skyrocket in popularity.
Creators and social media users who are deaf or blind say they’re excluded from apps or are forced to find complicated or expensive workarounds to use them, Rachel Lerman reports. Critically, TikTok, which was one of 2020’s hottest apps, doesn’t have a built-in way to easily caption videos, rendering some videos inaccessible.
A company executive, Joshua Goodman, noted that the company recently rolled out automatic narration technology and is “continuing to develop new products to make TikTok more accessible for everyone.” But the app’s existing features, such as one that gives users the ability to add text to videos, have their own set of issues and complications.
“Frankly, I think a lot of companies just kind of ignore the issue of accessibility,” said Christian Vogler, professor and director of the Technology Access Program at Gallaudet University. He pointed to slow captioning developments on Instagram and TikTok, and noted that millions would benefit from captions or other accessible technologies.
A judge allowed a lawsuit challenging Google’s collection of data during private browsing sessions to proceed.
Judge Lucy Koh dealt a blow to the search giant when she rejected its motion to dismiss the lawsuit, which was filed as a class-action suit in federal court in California, Bloomberg News’s Malathi Nayak and Joel Rosenblatt report. Major technology companies continue to face scrutiny over data collection, and Google recently promised to stop tracking individual users for advertising purposes.
The Google Chrome users who filed the lawsuit said the company’s software tools continue to track users after they have opted to browse the Internet privately.
“We strongly dispute these claims and we will defend ourselves vigorously against them,” Google spokesperson Jose Castaneda said. “Incognito mode in Chrome gives you the choice to browse the Internet without your activity being saved to your browser or device. As we clearly state each time you open a new incognito tab, websites might be able to collect information about your browsing activity during your session.”
Facebook found that a small group of users is responsible for much of the content casting doubt on vaccinations.
Ten out of 638 population segments developed by the company for a study contain half of the vaccine hesitancy content on the platform, and just 111 users contributed half of all content in the most vaccine-hesitant segment, Elizabeth Dwoskin reports. Even though Facebook is banning outright falsehoods about immunizations, the study reveals the risks of content that raises skepticism about vaccinations.
The study also revealed evidence of overlaps between communities skeptical of vaccines and those affiliated with QAnon, a sprawling set of false claims that have coalesced into an extremist ideology that has radicalized its followers.
The company could use the study’s findings to inform discussions about its policies or to direct more authoritative information to those groups, spokeswoman Dani Lever said, but it is still working on finding a solution.
Rant and rave
Critics of Facebook — including CNN reporter Donie O’Sullivan, Baltimore doctor Scott Krugman and Platformer’s Casey Newton — said Elizabeth’s article confirmed their long-running concerns about the role the social network is playing in spreading skepticism about vaccines.
ORIGINAL ARTICLE: https://www.washingtonpost.com/politics/2021/03/15/technology-202-researchers-say-it-easy-find-drugs-facebook-instagram-youtube/?outputType=amp