

I haven’t heard of that happening much outside of law enforcement raids.
Laptops, yeah. But stories of homes being broken into to steal servers?


When was the last time you saw a headline: “Thieves steal home lab”?


The encoding format of URLs is URL encoding, also known as percent-encoding. Content in a URL may first be encoded in some other format, like JSON or base64, and then additionally percent-encoded.
While there is a standard way to decode percent-encoding, websites are free to use base64 or JSON in URLs however they wish, so there’s no one-size-fits-all way to decode them all. For example, the “/” character is valid in both percent-encoding and base64 encoding, so to know whether it’s part of a base64-encoded blob, you might end up trying to decode several parts of the URL as base64 and checking whether the result looks like a URL: essentially brute force.
A smarter way might be to maintain a mapping between your favorite sites and the methods they use to encode links. Then a tool could decode the URLs embedded in those click trackers directly and efficiently.
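A sketch of the brute-force approach, in Python with only the standard library (the function name and the tracker URL are made up for illustration):

```python
import base64
import binascii
import json
from urllib.parse import parse_qs, unquote, urlparse

def try_decode_param(value: str):
    """Best-effort decode of one query-string value: percent-decode first,
    then guess at JSON or base64, since sites layer encodings arbitrarily."""
    decoded = unquote(value)
    # Try JSON first: cheap, and a success is unambiguous.
    try:
        return ("json", json.loads(decoded))
    except ValueError:
        pass
    # Try base64. Padding is often stripped in URLs, so re-pad, and check
    # whether the result "looks like a URL" before trusting it.
    try:
        padded = decoded + "=" * (-len(decoded) % 4)
        text = base64.urlsafe_b64decode(padded).decode("utf-8")
        if text.startswith(("http://", "https://")):
            return ("base64", text)
    except (binascii.Error, UnicodeDecodeError):
        pass
    return ("plain", decoded)

# Hypothetical click-tracker URL whose "u" parameter is base64.
tracker = "https://example.com/click?u=aHR0cHM6Ly9leGFtcGxlLm9yZy8"
params = parse_qs(urlparse(tracker).query)
print(try_decode_param(params["u"][0]))  # ('base64', 'https://example.org/')
```

The per-site mapping approach would replace the guessing with a direct lookup of which parameter holds the destination and how it’s encoded.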


Lol. After professionally hosting email for 15 years I’m happy to let someone else handle it now.
About 90% of incoming mail will be spam, and it will be your job to make sure you are doing a good job of classifying it so you don’t get junk in your inbox and don’t lose real mail in the spam folder.
Then for outgoing mail you need to make sure SPF, DKIM and DMARC are all in order.
Then there is all the usual stuff of security updates, backups, monitoring, alerting, logging and having a plan for internet outages.
Yes, it’s all doable, but I wouldn’t expect it to be “set and forget”. I expect there will be quite a bit of tuning, with some possible spam and delivery problems, while you get the kinks worked out.
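For reference, SPF, DKIM, and DMARC all live in DNS TXT records; an illustrative zone fragment (the domain, selector name, policy, and key are placeholders to adapt):

```
example.com.                  IN TXT "v=spf1 mx -all"
mail._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=<base64 public key>"
_dmarc.example.com.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```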


I also use Ansible. Using Podman’s “quadlet” adapter, the containers run as systemd services.
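For anyone curious, a quadlet is just an ini-style unit file that Podman’s systemd generator turns into a service; a minimal sketch for a rootless setup (the name, image, and port are placeholders):

```ini
# ~/.config/containers/systemd/myapp.container
[Unit]
Description=Example app container

[Container]
Image=docker.io/library/nginx:latest
PublishPort=8080:80

[Service]
Restart=always

[Install]
WantedBy=default.target
```

After `systemctl --user daemon-reload`, it starts and stops as `myapp.service` like any other unit.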


Congrats on the cat box cleaning!
There’s also Zitadel: https://zitadel.com/
Also, all spam messages.


I host routing for customers across the US, so yes I need it all. There are ways to solve the problem with less memory but the point is that some problems really do require a huge amount of memory because of data scale and performance requirements.


Nope. Some algorithms are fastest when the whole data set is held in memory. You could design them to page data in from disk as needed, but it would be slower.
OpenTripPlanner, for example, will hold the entire road network of the US in memory for fast driving directions, and it uses an amount of RAM in that ballpark.
Simple means different things to different people.
I self-host Ghost and find it pleasant to use and low maintenance. It is a single Docker container plus MySQL. I recommend a reverse proxy in front of it like Nginx. There are importers from many other blog formats.
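As a sketch, that two-container setup in Compose might look like the following (image tags, hostnames, and passwords are placeholders; the `database__*` variables follow Ghost’s environment-based config naming):

```yaml
services:
  ghost:
    image: ghost:5
    ports:
      - "127.0.0.1:2368:2368"   # keep behind a reverse proxy like Nginx
    environment:
      url: https://blog.example.com
      database__client: mysql
      database__connection__host: db
      database__connection__user: ghost
      database__connection__password: changeme
      database__connection__database: ghost
    volumes:
      - ghost-content:/var/lib/ghost/content
  db:
    image: mysql:8
    environment:
      MYSQL_DATABASE: ghost
      MYSQL_USER: ghost
      MYSQL_PASSWORD: changeme
      MYSQL_RANDOM_ROOT_PASSWORD: "1"
    volumes:
      - db-data:/var/lib/mysql
volumes:
  ghost-content:
  db-data:
```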
For bookmarking: https://raindrop.io/
But it’s not self-hosted and I’m not sure it supports offline reading.


It isn’t hard when everything works perfectly, but there is a tremendous amount of complexity in some of these apps and a huge range in quality, documentation, and required env vars and mounts.
And so, so many ways for things to break.


You still have to manage upgrades due to security vulns in all the features you are ignoring.


Yes. DMZ on router 1 exposes router 2’s IP to the internet.


No, this is all happening in the browser, there are no other image manipulation tools being called.


I just tested the new release. Consider defaulting to converting PNGs to JPEGs unless they use a PNG-specific feature like transparency. Lots of screenshots start out as PNGs, but not because they need any PNG-specific features. For example: in a test screenshot, PNG->PNG compressed just 3.4% at the default 80% quality setting, while PNG->JPG compressed 84.6%.
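The transparency check that would gate the conversion is cheap at the format level: a PNG declares its color type in the IHDR chunk (types 4 and 6 carry an alpha channel), and other color types can still add transparency via a tRNS chunk. A stdlib-Python sketch of the check (the tool itself runs in the browser, so this only illustrates the logic; the function name is made up):

```python
import struct

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def png_has_alpha(data: bytes) -> bool:
    """Return True if the PNG can use transparency (alpha channel or tRNS chunk)."""
    if not data.startswith(PNG_SIG):
        raise ValueError("not a PNG")
    # IHDR is always the first chunk; color type is its 10th data byte.
    color_type = data[25]
    if color_type in (4, 6):          # greyscale+alpha or RGBA
        return True
    # Greyscale/truecolor/palette images may still carry a tRNS chunk.
    pos = 8
    while pos + 8 <= len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        if ctype == b"tRNS":
            return True
        if ctype == b"IEND":
            break
        pos += 12 + length            # length + type + data + CRC
    return False
```

Images that fail this check are candidates for JPEG conversion.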


MCP, the Model Context Protocol, sounds like a standardized way for AI clients to connect to data sources.
https://www.anthropic.com/news/model-context-protocol
It sounds like it may compete somewhat with Google’s A2A protocol, which is for AI agent-to-agent communication.
Both share the same goal of making services easier for AI to consume.
Because fewer than 1% of users would use it, and you’re trusting the security of not one big partner but thousands of ever-changing small partners.
Also, email is already federated.