

selver wrote

Yeah that's cause the author of the article spends all their time researching Nazi conspiracy theories or something.


leftous wrote (edited )

I would agree with you if I hadn't experienced this myself. Here is a Reddit post I made a few months ago where I documented what I was noticing with Google products at the time.

I do believe Google is playing a role in further radicalizing people toward alt-right/fash ideology.


selver wrote (edited )

I agree that it does that, but I think it's more due to an algorithm being stupid than anything particularly malicious (other than being profit-seeking, ofc). I get the same sort of results on YouTube, but I think it's because Google doesn't know how to categorize a person who's into radical politics. I'm sure most of us have looked at alt-right stuff, since it gets pointed out a lot by anti-fascists, and Google doesn't differentiate between checking something out to hate it and checking it out because you agree with it.

It probably just sees that I read shit about identity politics and goes, "I see you like identity politics, have you tried these takes on identity politics? They seem to be popular." "I see you watched Noam Chomsky videos and don't like the State, but have you considered that maybe Jews run the State?" But a lot of stuff on YouTube seems to be only a stone's throw away from the far right, and maybe that's intentionally done by the people making the videos.


leftous wrote

It is possible that it's unintentional and just due to the algorithm.

However, my reason for suspecting it's more than that is that I rarely, if ever, get recommendations for further leftist videos or leftist YouTubers when I watch left-wing material (which is most of the time). But if I watch a moderately conservative/liberal video, I will get content that is further right-wing, and in some cases extreme. Often for weeks.