

mofongo wrote (edited)

I find this dubious; Google provides pages that already align with one's worldview. I did my own search and got no fascist sites, only what one would expect: Wikipedia, the Holocaust museum website, news sites (BBC, The Independent, etc.), even after going through three pages of results.

So if you're seeing fascist sites, it's because Google has you tagged as someone who visits similar sites.

PS: hail surveillance!


indi wrote

A big part of Google's results personalization algorithms is your location. So it's not all that dubious for results to be skewed like that in some particular location. Like Sweden, for example.

Speaking generally, though, I don't find it the least bit surprising that searches for things like "holocaust" or "did the holocaust happen" turn up a lot of Nazi shit. They're the ones who are most likely to be talking about it. And it's not just Nazis and anti-Semitism: searches for racial topics are more likely to turn up racist sites, feminist issues are more likely to turn up misogynist sites, and so on. Just think about it: people who aren't misogynists (for example) don't have much motivation to talk about feminism all the time, because for them it's a totally non-controversial given... but for misogynists, it's a perennial bug up their ass, so they complain about it constantly, and share links back and forth with other misogynists to fortify their hate with like-minded affirmations and supportive conspiracy theories. Likewise, anyone who's not a complete fucknut doesn't feel the need to discuss the Holocaust all that often... but for Nazis and deniers, it's always on their mind.

This has nothing to do with whether or not Google is the great Satan of privacy-destroying data collection; it's a completely obvious and even unavoidable outcome of any attempt to automatically rank a site's relevance to a search term. By just about any measure you can imagine other than a "bullshit/not-bullshit" scale - frequency of mention, number of visits, number of inbound links - a hate site is going to rank high on relevance for a controversial term. Any AI intelligent enough to decide whether a site is truly legit would have to be an AI that makes value judgments... that is, something that literally thinks for you. Is that what anyone really wants?
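To make that concrete, here's a toy sketch (entirely made-up numbers and site names; this is not Google's actual ranking algorithm) showing how any purely engagement-based relevance score - mention frequency weighted by inbound links - will rank an obsessive hate site above a low-volume authoritative one, because accuracy never enters the formula:

```python
# Toy relevance ranking: score = mentions of the term * inbound links.
# Hypothetical data for illustration; real engines use far more signals,
# but any purely engagement-based metric shares this failure mode.

sites = [
    # (name, mentions_of_term, inbound_links)
    ("museum.example", 12, 40),    # authoritative, discusses the topic rarely
    ("denier.example", 90, 300),   # obsessive, links traded among like minds
    ("news.example",   25, 80),
]

def relevance(mentions, links):
    """Naive score: how often the term appears, weighted by link popularity."""
    return mentions * links

ranked = sorted(sites, key=lambda s: relevance(s[1], s[2]), reverse=True)
for name, mentions, links in ranked:
    print(name, relevance(mentions, links))
# The denier site tops the list on raw engagement alone.
```

Nothing in `relevance()` can see whether a site is trustworthy - fixing that would require exactly the value-judgment AI described above.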

The real problem here is not Google; it's the misconception that ranking high in search results is a measure of the quality of a site's information rather than merely its popularity. It's like going to a library and assuming the most authoritative books on a topic are the ones with the most eye-catching covers. Google's PageRank is not a proxy for verisimilitude, and it is certainly not an excuse to turn off your brain and skip any critical analysis of whether a site's information is trustworthy.

Blaming Google, as the article does, is missing the point entirely. What we should be alarmed about is people so clueless about the way information works that they take articles like this one seriously.


jadedctrl wrote (edited)

"Google has been blasted for producing links to anti-Semitic blogs and a notorious neo-Nazi website as top search results relating to the Holocaust or Jews in Sweden."

Google factors in history and location, so in different places it might take less of a fascist leaning (or none at all) to be served anti-Semitic sites.


selver wrote

Yeah that's cause the author of the article spends all their time researching Nazi conspiracy theories or something.


leftous wrote (edited)

I would agree with you if I hadn't experienced this myself. Here is a Reddit post I made a few months ago where I documented what I was noticing with Google products at the time.

I do believe Google is playing a role in further radicalizing people toward alt-right/fash ideology.


selver wrote (edited)

I agree that it does that, but I think it's more due to an algorithm being stupid than anything particularly malicious (other than being profit-seeking, ofc). I get the same sort of results on YouTube, but I think it's because Google doesn't know how to categorize a person who's into radical politics. I'm sure most of us have looked at alt-right stuff, since it gets pointed out a lot by anti-fascists, and Google doesn't differentiate whether you checked something out to hate it or to agree with it.

It probably just sees that I read shit about identity politics and goes, "I see you like identity politics; have you tried these takes on identity politics? They seem to be popular." "I see you watched Noam Chomsky videos and don't like the State, but have you considered maybe Jews run the State?" But a lot of stuff on YouTube seems to be only a stone's throw away from the far right, and maybe that's intentionally done by the people making the videos.
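A sketch of what this describes (purely hypothetical data and titles - not YouTube's actual recommender): a topic-overlap recommender has no feature for *why* you watched something, so a video watched critically feeds the exact same signal as one watched approvingly:

```python
# Toy topic-overlap recommender. Candidates are scored by shared topic tags
# with the watch history -- stance (critical vs. approving) simply isn't a
# feature, so it can't influence the result. All data is invented.

watch_history = [
    {"title": "Chomsky on state power",            "topics": {"politics", "anti-state"}},
    {"title": "Debunking alt-right talking points", "topics": {"politics", "alt-right"}},
]

candidates = [
    {"title": "Leftist media criticism",    "topics": {"politics", "media"}},
    {"title": "Alt-right conspiracy hour",  "topics": {"politics", "alt-right", "anti-state"}},
]

# Union of every topic tag the viewer has ever watched.
history_topics = set().union(*(v["topics"] for v in watch_history))

def score(video):
    """Count of a candidate's topics that overlap the viewer's history."""
    return len(video["topics"] & history_topics)

best = max(candidates, key=score)
print(best["title"])
# The conspiracy video wins: it shares 3 tags with the history, even though
# one of those tags came from a video debunking that very content.
```

The anti-fascist "debunking" video contributes the `alt-right` tag that makes the conspiracy video score highest - the "checked it out to hate it" problem in miniature.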


leftous wrote

It is possible that it's unintentional and just due to the algorithm.

However, my reason for suspecting it's more than that is that I rarely, if ever, get recommendations for further leftist videos or leftist YouTubers when I watch left-wing material (which is most of the time). But if I watch a moderately conservative or liberal video, I'll get content that is further right-wing and in some cases extreme, often for weeks.