TikTok and Meta risked safety to win algorithm arms race, whistleblowers say

By Marianna Spring, social media investigations correspondent, and Mike Radford

[Image caption: Whistleblowers have given an inside view of the algorithm arms race which followed TikTok's explosive growth]

Social media giants made decisions which allowed more harmful content on people's feeds, after internal research into their algorithms showed how outrage fuelled engagement, whistleblowers told the BBC.
More than a dozen whistleblowers and insiders have laid bare how the companies took risks with safety on issues including violence, sexual blackmail and terrorism as they battled for users' attention.
An engineer at Meta, which owns Facebook and Instagram, described how he had been told by senior management to allow more "borderline" harmful content - which includes misogyny and conspiracy theories - in users' feeds to compete with TikTok.
"They sort of told us that it's because the stock price is down," the engineer said.
A TikTok employee gave the BBC rare access to the company's internal dashboards of user complaints - as well as other evidence of how staff had been instructed to prioritise several cases involving politicians over a series of reports of harmful posts featuring children.
Decisions were being made to "maintain a strong relationship" with political figures to avoid threats of regulation or bans, not because of the risks to users, the TikTok staffer said.
The whistleblowers who spoke to the BBC documentary, Inside the Rage Machine, offer a close-up view of how the industry responded following the explosive growth of TikTok, whose highly engaging algorithm for recommending short videos upended social media, leaving rivals scrambling to catch up.
A senior Meta researcher, Matt Motyl, said the company's competitor to TikTok, Instagram Reels, was launched in 2020 without sufficient safeguards. Internal research shared with the BBC showed comments on Reels had significantly higher prevalence of bullying and harassment, hate speech, and violence or incitement than elsewhere on Instagram.
The company invested in 700 staff to grow Reels, while safety teams' request for two additional specialist staff was refused.