[liberationtech] YouTube doesn't lead to radicalization, study finds

John Ohno john.ohno at gmail.com
Thu Jan 2 20:44:39 CET 2020


It's conceptually possible for a recommendation system to avoid trendism, or
at least limit its effect. For instance, a recommendation system that's
extremely granular & strongly privileges close matches over loose ones will
have only a small trendism effect, because the buckets will be small: it'll
recommend things seen by only three or four people, provided those three or
four people share some of your rarest preferences. Alternatively,
recommendation systems that explicitly introduce random factors, or that cut
off or invert the top end of the popularity spectrum (so that anything more
popular than average is actually handicapped by that popularity), will
produce much more bell-curve-like distributions of popularity.
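As a concrete illustration of the second approach, here is a minimal sketch
in Python of a scoring rule that handicaps anything more popular than the
median and mixes in a little noise. Everything in it (the field names, the
penalty formula, the jitter) is my own assumption for illustration, not a
description of any real system:

    import random

    def anti_trendist_score(similarity, views, median_views, jitter=0.1):
        """Score one candidate item for recommendation.

        Hypothetical sketch: similarity is a closeness-of-match value in
        [0, 1], views is the item's popularity, and median_views stands in
        for 'average' popularity. Anything above the median is penalized in
        proportion to how far above it sits; a small random jitter keeps
        the ranking from collapsing onto the same few items for everyone.
        """
        if views > median_views:
            # Invert the top end of the popularity spectrum: the more an
            # item exceeds typical popularity, the more it is discounted.
            popularity_penalty = median_views / views
        else:
            popularity_penalty = 1.0
        return similarity * popularity_penalty + random.uniform(0, jitter)

    def recommend(candidates, median_views, k=5):
        """candidates: a list of (item_id, similarity, views) tuples."""
        ranked = sorted(
            candidates,
            key=lambda c: anti_trendist_score(c[1], c[2], median_views),
            reverse=True,
        )
        return [item_id for item_id, _, _ in ranked[:k]]

The particular formula doesn't matter much; any monotonic penalty on
above-average popularity plus a noise term will flatten the distribution of
what ends up recommended.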

What Wales is suggesting might be closer to 'sort by controversial', which
has some problems when done algorithmically.

I've written a couple of short articles on this topic:

https://medium.com/@enkiv2/against-trendism-how-to-defang-the-social-media-disinformation-complex-81a8e2635956?source=friends_link&sk=867e6aa7117b5a1a06fbd1c7d7129b6a

https://medium.com/@enkiv2/trendism-cognitive-stagnation-21c8e003df83?source=friends_link&sk=484882732c87324401e0a25d083577e9

https://medium.com/@enkiv2/sort-by-controversial-as-a-proxy-for-information-content-748bd7124b7a?source=friends_link&sk=ff5730559afb33bfdf1c2a4e177feaee

Since writing that last one, I've soured a bit more on 'sort by
controversial'. Sort-by-controversial produces very similar results to
sort-by-popular when controversy is counted by views rather than by user
rating, in environments where view-based metrics incentivize people to
hate-share. Any sort-by-controversial algorithm that produces genuine
novelty will need to account for cohort effects by finding clusters (wherein
some group has similar ratings for similar pieces of media) and looking for
outliers relative to those clusters. I'm not sure that can be done reliably
without serving up mostly-uninteresting stuff...
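For what it's worth, here is a rough sketch of what looking for
within-cluster disagreement might look like, with the user clustering
itself assumed to exist upstream (all names, structures, and thresholds
here are made up for illustration):

    from collections import defaultdict
    from statistics import mean, pstdev

    def genuinely_controversial(ratings, clusters, min_raters=3):
        """Rank items by how much they split opinion within taste clusters.

        Hypothetical sketch: ratings maps (user, item) -> score; clusters
        maps user -> cluster_id, assumed to come from some prior clustering
        of users by rating similarity. Disagreement between clusters is
        ignored, since cohorts disagreeing with each other is ordinary; an
        item only ranks highly if some cluster's own members are split on
        it.
        """
        # Bucket each item's ratings by the rater's cluster.
        by_item_cluster = defaultdict(lambda: defaultdict(list))
        for (user, item), score in ratings.items():
            by_item_cluster[item][clusters[user]].append(score)

        spread_by_item = {}
        for item, per_cluster in by_item_cluster.items():
            # A high standard deviation within a cluster means the cluster
            # itself is divided on the item, not merely divided from other
            # clusters.
            spreads = [
                pstdev(scores)
                for scores in per_cluster.values()
                if len(scores) >= min_raters
            ]
            if spreads:
                spread_by_item[item] = mean(spreads)
        return sorted(spread_by_item, key=spread_by_item.get, reverse=True)

Even granting the clustering, this still risks exactly the problem above:
items with high within-cluster variance are often divisive because they're
mediocre or noisy, not because they're genuinely novel.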

On Thu, Jan 2, 2020 at 2:02 PM Tim Phillips <tim.p.phillips at gmail.com>
wrote:

> Jimmy Wales of Wikipedia said the following recently in an AMA, which
> makes me think his new social media experiment WT.Social might make some
> effort toward tackling the recommendation-system echo chamber effect.
>
> > I have said for a long time that I wish facebook would have a setting:
> "Instead of showing you things we think you will like, we want to show you
> things we think you'll disagree with, but which we have signals that
> suggest they are of quality." There's nothing better, really, than finding
> something challenging and interesting that I disagree with, but for which I
> have to concede: it makes me think.
>
>
> https://www.reddit.com/r/IAmA/comments/e52r7u/iama_jimmy_wales_founder_of_wikipedia_now_trying/f9hdsns/?context=8&depth=9
>
>
> https://www.reddit.com/r/IAmA/comments/e52r7u/iama_jimmy_wales_founder_of_wikipedia_now_trying/
>
> That being said, I've found some amazing things through recommendation
> engines, but only within my own interest boundaries.
>
> Best,
> Tim
>
> On Thu, Jan 2, 2020 at 9:17 AM John Ohno <john.ohno at gmail.com> wrote:
>
>> Google sets (and checks) a perma-cookie for customizing YouTube
>> recommendations & doing ad targeting -- so even without an account, a bot
>> following recommendations will have a customized element to its rankings
>> based on history (unless that cookie is manually deleted). I recall a
>> paper a few years ago that did exactly the same thing (following
>> recommendations with no account) & found that, from a marginally
>> political starting point, all roads led to Nazi videos, so I suspect
>> these results are probably due to changes to the ranking algorithm
>> designed specifically to discourage fringe content (and encourage
>> centrist content). (I don't recall the name of that paper, but I'm sure
>> anybody paying attention to the subject in 2016 and 2017 will recall
>> seeing it.)
>>
>> This kind of analysis is tough, not just because of the black-box nature
>> of a constantly-tweaked ranking algorithm & the path dependency of a
>> random walk through recommendations (whose actual numeric rankings are
>> not visible), but because we know from experience that factors like
>> location, ISP, and browser type are taken into account, & spoofing these
>> in a controlled way without triggering anti-spoof measures complicates an
>> already-difficult analysis.
>>
>> Regardless of any kind of tweaking Google might do, automatic
>> recommendation systems share an underlying problem (aside from the rare
>> experiment / art project): items are ranked and recommended based on
>> existing popularity, hotness (which depends upon popularity as a factor),
>> or interaction count (which also depends upon popularity as a factor).
>> Which is to say, no matter what other factors are taken into account,
>> recommendation systems drive people into groupthink by nudging them
>> toward consuming the same things as other people. As far as I'm aware,
>> the only countermeasure for this (on a personal scale) is the use of
>> third-party randomizers, which select options totally at random and
>> sometimes also support filtering out anything with more than some maximum
>> number of views.
>>
>> There's a business case against eliminating the 'trendist' model of
>> promoting already-popular things: Google depends upon the handful of
>> 'winners' on their platform as the platform's face, & as fewer and fewer
>> of them can make a living off YouTube, the ones that remain make more &
>> are more closely tied to the platform. But for subscription services,
>> leveling the playing field makes more sense: Netflix makes the same
>> amount of money regardless of how much people watch (and, in fact, makes
>> marginally more money if people watch less but maintain their accounts),
>> so to the extent that they pay out royalties based on viewership, they
>> have no particular reason not to spread views more evenly through a more
>> nuanced and granular recommendation model.
>>
>> On Thu, Jan 2, 2020 at 11:30 AM Niels Abildgaard <
>> niels.abildgaard at gmail.com> wrote:
>>
>>> Freedom abhors accounts.
>>>>
>>>> Here are some accounts...
>>>>
>>>
>>> I have no idea what this is supposed to mean, or how it is relevant to
>>> the study. Concretely, the study didn't examine how recommendations for
>>> a YouTube account evolve over time, because the researchers were not
>>> logged in when collecting recommendations. Freedom might abhor accounts,
>>> and if you don't use an account on YouTube... good on you, I guess? But
>>> that's not what the discussions about radicalization on YouTube have
>>> been about. The study doesn't apply, is the point.
>>>
>>> Corinne's link has more good details :-)
>>>
>>> Den tir. 31. dec. 2019 kl. 20.47 skrev grarpamp <grarpamp at gmail.com>:
>>>
>>>> > an account
>>>>
>>>> Freedom abhors accounts.
>>>>
>>>>
>>>> Here are some accounts...
>>>>
>>>> https://en.wikipedia.org/wiki/IBM_and_the_Holocaust
>>>> https://ibmandtheholocaust.com/
>>>> infohash:20820F55D884C945154136689E436990107DD1E9
>>>>
>
>