emma
emma wrote
Reply to comment by Enheduanna in Why mastodon isn't eco friendly displayed in one image. by postleftpuppy
pgp isn't an option because the signing party would have to be the website you're retrieving preview data from. if instances want to share this data with one another to spare innocent websites, they can already do so securely, since mastodon signs all activitypub messages.
emma wrote
Reply to comment by antispe in Why mastodon isn't eco friendly displayed in one image. by postleftpuppy
there's no need to educate all the guests. eugen and a handful of other devs in the fediverse space have the power to fix this, and when they do, it'll solve the problem in the long term as instances upgrade to newer versions. usually a software bug would be limited to the users of that software, but in this case it negatively affects the web around it, so i don't think this is too much to ask.
emma wrote
Reply to comment by Enheduanna in Why mastodon isn't eco friendly displayed in one image. by postleftpuppy
wasn't it supposed to be based on peer-to-peer communications
the peers, or mastodon instances in this case, are inherently untrustworthy under the open federation model that forms the "fediverse". i presume the reason they chose not to federate preview info is that it can't be verified without refetching it from the origin.
to give an example of how this can be abused, i could set up a malicious instance and make toots that link to my enemies' websites, and have the preview title say "here's how i murdered someone and hid their body", but the actual website doesn't say that. this fake title would then propagate to other instances, and people might believe the website i linked to actually said that at one point in time.
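to illustrate why verification forces a refetch: preview data like the title lives in the linked page's own html (typically an og:title meta tag), so the only way for an instance to check a claimed title is to download and parse the page itself. here's a minimal stdlib-only sketch of that extraction step (the class and function names are mine, not mastodon's):

```python
# sketch: extracting the Open Graph title from a page's HTML.
# verifying a federated preview would mean refetching the page
# and running something like this on it, which is the whole problem.
from html.parser import HTMLParser


class OGTitleParser(HTMLParser):
    """pulls <meta property="og:title" content="..."> out of a page."""

    def __init__(self):
        super().__init__()
        self.og_title = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("property") == "og:title":
            self.og_title = a.get("content")


def extract_og_title(html: str):
    parser = OGTitleParser()
    parser.feed(html)
    return parser.og_title
```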
emma wrote
Reply to comment by antispe in Why mastodon isn't eco friendly displayed in one image. by postleftpuppy
Makes me appreciate how Signal does sender-side link-previews to only make one request to the linked server per story
it's the same idea here, except each mastodon server has to fetch its own preview, which is why a billion sudden requests happen at once
But this also sounds like they could also just serve a cached page that didn't use php capacity?
yeah, you can serve cached pages if you know how and the linked page doesn't require server-side recomputation, but caching introduces a bunch of complexity and prerequisites that have to be met, and i've never seen a cheap web host offer this capability.
the onus shouldn't be on the webmaster to prepare for a huge burst of requests. mastodon instances should behave as good netizens. if the integrity of previews is so important that each server must do its own independent fetching, then perhaps they shouldn't do previews at all.
emma wrote
Reply to Rats are high on marijuana evidence at an infested police building, New Orleans chief says by chunkenheimer
hope the rats can deal with their police infestation 🙏
emma wrote
Reply to comment by mima in Nightmare on Lemmy Street (A Fediverse GDPR Horror Story) by kin
A lot of fediverse software unfortunately just immediately uploads your image to the server (probably so that you can add the alt text immediately as well)
i think it's because developers have been poisoned into believing multipart/form-data is a bad thing, since it doesn't fit neatly into the REST paradigm. images could be handled entirely on the client side until it comes time to actually post them, allowing for the alt text stuff to happen.
it's pretty funny that lemmy has all those deficiencies, but i don't believe they're in violation of the gdpr. the right to erasure only requires that data be deleted without undue delay, and doesn't require an automated self-service process. the EU defines "undue delay" as up to a month, or in some cases, three.
emma wrote
Reply to comment by itsalways1312somewhere in Nightmare on Lemmy Street (A Fediverse GDPR Horror Story) by kin
they don't cache the entire server. generally, they pass the request on to raddle if it hasn't been flagged as malicious. most endpoints won't be cached, because doing so could potentially expose secrets in the html that's served to logged in users. images, however, they do cache, which saves several hundred gigabytes each month.
we do have to trust that cloudflare won't retain private responses longer than they have to, but that's true of the hosting provider as well.
emma wrote
Reply to comment by ziq in Nightmare on Lemmy Street (A Fediverse GDPR Horror Story) by kin
This is cloudflare caching the image. If I add a query string, it 404s.
emma wrote
Reply to When you download something open source, like the Signal messaging app, is there a way to confirm that the thing you've downloaded is actually using the code that is open source? by Tequila_Wolf
Are we simply trusting that they are using the same code?
Pretty much, yeah.
In theory, you can verify it's built on the source by compiling it yourself and comparing the output to the prebuilt version. There's a whole methodology called reproducible builds meant to enable this verification, and supposedly Signal supports this.
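The final step of that verification is mundane: once you've produced your own build, you just compare digests with the published binary. A minimal sketch (the file paths are made up for illustration):

```python
# sketch: comparing your own build against the official release.
# with reproducible builds, matching digests mean matching binaries.
import hashlib


def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def builds_match(my_build: str, official_build: str) -> bool:
    return sha256_of(my_build) == sha256_of(official_build)
```

The hard part isn't the comparison, it's getting the build itself to be byte-for-byte deterministic in the first place, which is what the reproducible builds methodology is about.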
emma wrote
Reply to I NEED AN OLD REDDIT ACCOUNT by Brendasims
I'll send one to you if you send me a message on Ligma
emma wrote
Reply to comment by MountainMan in Every time I unban the .onion I regret it within hours by ziq
Tor's hidden services work by having connections originate from the Tor daemon. It is configured with 127.0.0.1:80 as the destination, so the connections appear to originate from 127.0.0.1 as far as the web server is concerned.
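For reference, the relevant torrc configuration for such a setup looks roughly like this (the directory path is illustrative):

```
HiddenServiceDir /var/lib/tor/hidden_service/
HiddenServicePort 80 127.0.0.1:80
```

The second argument to HiddenServicePort is the local target Tor forwards connections to, which is why the web server only ever sees 127.0.0.1 as the client address.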
emma wrote
I love the confident 'Really.' when this didn't happen. Tech journalism is absolute sewage.
emma wrote
Reply to AlternativeTo is a free service that helps you find better alternatives(software).all the lists of alternatives are crowd-sourced.useful when you make a research on reddit after picking some possible alternatives. by maybeanotherday
I've seen this site in search results for well over a decade, and never, ever has it been useful to me.
emma wrote
Reply to comment by !deleted52089 in Against Progress, Towards a Low-Tech Raddle Future by ziq
it looks like the gitlab page is archived anyway and you can't contribute
the code is on codeberg, i have yet to update the website
but you're right about contributions being disabled. i'm hoping someone else will fork the project and take charge, perhaps in a way that i can be a contributor and not have to deal with being a project lead.
emma wrote
Reply to Statement regarding the ongoing SourceHut outage by mima
I keep telling people we should just tack "-over-HTTPS" to every network protocol so we can leverage the affordable, widely available DDoS mitigation infrastructure that's meant to protect websites.
SourceHut looks nice, and if I were still actively programming, I'd gladly try it out. It sucks to see this happen.
emma wrote
Reply to comment by miggyb in Any update on stripping exif data off pictures? by miggyb
i didn't interpret it as aggressive, i thought it was fair critique. thanks for being understanding.
emma wrote
Reply to comment by wazzupdog in /f/SelfHosted: A forum for people who like to host their own shit on their own hardware by MountainMan
what
emma wrote
I've uploaded all the latest source code to Codeberg for anyone who cares to give implementing this a shot.
Please fork the project, I intend to step away from development.
emma wrote
Reply to Real leftists don't commit such heinous hatespeech against our president! Real leftists vote for the lesser fascist by itsalways1312somewhere
Not surprised this is from Mastodon, and a tech-centric instance at that. That place is full of jingoists sharing the mental gymnastics they use to justify voting for genocide.
emma wrote
Reply to comment by 2elddar in Free Software vs. Open Source by plank60
Good god that thread is funny. Threats of reporting the copyright holder to some ominous group for violating their own rights as the copyright holder.
I can see that you are active in other threads, you are breaking the contract (GPL3) by not sending me sources. You are being illegal.
Best imagined as dialogue from Ace Attorney.
Idk why, but any discussion about copyright and free software just seems to invite a ton of insights from people who don't have even a basic understanding of any of them, as demonstrated both in that thread and here.
emma wrote
Reply to [POLICY] Can the word for having an account approved to do things not be crackerlisted anymore? by yaspora
I've decided I will be changing the wording because 'whitelisting' was always very vague and prone to misunderstanding. The change will happen next time we do an upgrade.
emma wrote
Reply to comment by !deleted52089 in [POLICY] Can the word for having an account approved to do things not be crackerlisted anymore? by yaspora
back when GitHub switched away from master/slave branches.
Gonna expand on this a bit. Git, the technology that GitHub uses, had 'master' as the default branch name, but 'slave' wasn't used anywhere. However, one version control system that inspired Git did use it, therefore it was argued that Git's use of 'master' was part of the master/slave dichotomy.
I was one of those who renamed branches from 'master' to 'main' back when the George Floyd thing happened, but in hindsight I feel it was an incredibly performative action to take. Not only was the rationale incredibly contrived, but the tech companies that pushed for this were also perfectly happy selling their services to a US government agency that hunts down immigrants, breaks up their families, and locks them in cages. So from my perspective, that whole endeavour was incredibly pointless, and played into a PR campaign to launder the reputation of harmful companies.
But that's just my perspective as a white person living in Europe, and maybe other people here feel differently about it. If anyone has perspectives to share about either 'master' or 'whitelist', please let me know, as I'll take these into account when deciding whether to change the wording in Postmill or not.
emma wrote
Reply to by livetopXss202
emma wrote
Reply to [CW: very serious topics] Might actually get in trouble for supposedly blackmailing my teacher. Oops. by omegaSomeone
That's not blackmail, that's a poorly worded C&D.
I think you'll be fine.
emma wrote (edited )
Reply to Raddle loads images from clearnet CDN instead of onion when using onion domain by Rocket_Gecko
yeah. postmill only has one global setting for which domain to serve images from, and i'm too lazy to fix this.
also the onion site gets an incredible amount of abuse at times, and offloading image traffic to cloudflare would go a long way toward solving it.
edit: to clarify, it used to be that UPLOAD_ROOT=/, which made images load relative to the domain you were visiting from. now it's UPLOAD_ROOT=https://uploads-cdn.raddle.me/, so that's where images will load from on both the clearnet and onion sites.