Y u no Mamaleek

  • 0 Posts
  • 9 Comments
Joined 1 month ago
Cake day: November 3rd, 2025

  • My motivation to use Ansible is fueled by disdain for manual, non-scriptable configuration. I’ve had to use Windows for the past couple of years, and the lack of programmatic access to so many things annoyed me to no end.

    Now, I get up in the morning and look to the east. I salute the sun and thank fate for the chance to do proper configuration again. I don’t wade through dialogs for hours anymore. I don’t lose track of things I’ve changed somewhere, sometime. I’ll learn what the hell the difference between dconf and gsettings is, just to use one of them for all my desktop settings forever. I will have this config for years to come, and I will put more things into it bit by bit.

    Now, if Ansible’s config language weren’t a naive reinvention of Lisp, that would be great.


  • This here is the implementation of sha256 in the slow language JavaScript:

    const msgUint8 = new TextEncoder().encode(message); // `message` is the string to hash
    const hashBuffer = await window.crypto.subtle.digest("SHA-256", msgUint8); // native SHA-256 via Web Crypto
    const hashHex = new Uint8Array(hashBuffer).toHex(); // toHex() is recent; older engines need a manual byte-to-hex loop
    

    You imagined that JS had to do that from scratch, with sticks and mud? Every OS has cryptographic facilities, and every major browser exposes them through an API (SubtleCrypto, in this case).

    As for using it to filter out bots, Anubis does in fact get it a bit wrong: the cost has to be incurred on every page hit, not once a week. So you can’t just put Anubis in front of the site. You need the JS on every page, and if the challenge isn’t solved by the next hit, you serve the full ‘nuh-uh’ page, probably hand the browser a harder challenge, and check a bunch of heuristics on top, like go-away does.

    It’s still debatable whether that will stop bots, which would just have to crank sha256 24/7 in between page downloads, but it does add a cost that bot owners have to eat.
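
    For a feel of the bot’s side of the bargain, here’s a sketch of that kind of proof-of-work loop. The function name and the leading-zeroes difficulty check are illustrative assumptions, not Anubis’s exact protocol:

    // Hypothetical Anubis-style proof of work: find a nonce whose SHA-256
    // digest, rendered as hex, starts with `difficulty` zero digits.
    async function solveChallenge(challenge, difficulty) {
      const prefix = "0".repeat(difficulty);
      for (let nonce = 0; ; nonce++) {
        const data = new TextEncoder().encode(challenge + nonce);
        const digest = await window.crypto.subtle.digest("SHA-256", data);
        const hex = Array.from(new Uint8Array(digest))
          .map((b) => b.toString(16).padStart(2, "0"))
          .join("");
        if (hex.startsWith(prefix)) return nonce; // the server re-hashes once to verify
      }
    }

    Each extra zero digit multiplies the expected number of hashes by 16, so the site owner can tune how much work a scraper fleet has to burn per page.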


  • I mean, I thought AWStats was long dead. It’s twenty-five years old, and the web has changed quite a bit in that time. No one uses Perl anymore, for starters. I had moved on to Open Web Analytics, Webalizer, or some such by 2008 or so. I remember Webalizer being snappy as heck.

    I tinkered with log analysis myself back then, peeping into the source of AWStats and others. I learned that one humongous regexp with some two hundred alternatives for the user-agent string was way faster than testing each pattern individually, which of course makes sense, seeing as regexps run as state machines in a sort of very specialized VM. My first attempts, in comparison, were laughably naive and slow. Ah, what a time.
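
    A toy version of the trick, with an invented handful of patterns standing in for the real two hundred:

    // Naive: test each pattern separately; the engine restarts on the string N times.
    const patterns = [/Googlebot/, /bingbot/, /YandexBot/, /Baiduspider/, /DuckDuckBot/];
    const isBotSlow = (ua) => patterns.some((re) => re.test(ua));

    // AWStats-style: one big alternation, compiled once, a single engine call per line.
    const botRe = /Googlebot|bingbot|YandexBot|Baiduspider|DuckDuckBot/;
    const isBotFast = (ua) => botRe.test(ua);

    console.log(isBotFast("Mozilla/5.0 (compatible; Googlebot/2.1)")); // true

    Same answer either way; the win comes from compiling all the alternatives into one machine instead of looping over two hundred separate ones.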

    Sure enough, working on a high-traffic site taught me that it’s way more efficient to prepare data for reading at the moment of change rather than at read time, which translates to analyzing visits on the fly and writing the results to a query-optimized store like Elasticsearch.
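
    A minimal sketch of that write-time shape, assuming Node.js, the official @elastic/elasticsearch client, and an invented visits index:

    // Analyze the visit once, when it happens, and index a ready-to-query
    // document, so reports become cheap searches instead of log crunching.
    import { Client } from "@elastic/elasticsearch";

    const es = new Client({ node: "http://localhost:9200" });
    const botRe = /Googlebot|bingbot|YandexBot/; // the combined-alternation trick again

    async function recordVisit(req) {
      const ua = req.headers["user-agent"] ?? "";
      await es.index({
        index: "visits",
        document: { ts: new Date().toISOString(), path: req.url, ua, bot: botRe.test(ua) },
      });
    }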