
selver wrote (edited)

I agree that it does that, but I think it's more due to an algorithm being stupid than anything particularly malicious (other than being profit-seeking, ofc). I get the same sort of results on YouTube, but I think it's because Google doesn't know how to categorize a person who's into radical politics. I'm sure most of us have looked at alt-right stuff, since it gets pointed out a lot by anti-fascists, and Google doesn't differentiate between checking something out to hate it and checking it out because you agree with it.

It probably just sees that I read shit about identity politics and goes, "I see you like identity politics, have you tried these takes on identity politics? They seem to be popular." Or: "I see you watched Noam Chomsky videos and don't like the State, but have you considered that maybe Jews run the State?" A lot of stuff on YouTube seems to be only a stone's throw away from the far right, though, and maybe that's intentional on the part of the people making the videos.


leftous wrote

It is possible that it's unintentional and just due to the algorithm.

However, my reason for suspecting it's more than that is that I rarely, if ever, get recommendations for further leftist videos or leftist YouTubers when I watch left-wing material (which is most of the time). But if I watch a moderately conservative/liberal video, I get content that is further right-wing, and in some cases extreme, often for weeks.