Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.
Edit: As Thunraz points out below, there’s a footnote that reads “Numbers after + are successful hits on ‘robots.txt’ files”, so it’s not scientific notation after all.
Edit 2: After doing more digging, the culprit is a post where I shared a few wallpapers for download. The bots have been downloading these wallpapers over and over, burning through 100GB of bandwidth in the first 12 days of November. That’s when my account was suspended for exceeding bandwidth (an artificial limit I put on there a while back and forgot about…), which is also why the ‘last visit’ for all the bots is November 12th.
AI bots killing the internet again? You don’t say
You can also use CrowdSec on your server to stop similar BS. It uses a community-based blacklist, and you choose what you want to block. Check it out.
I’m going to try and implement CrowdSec for all my Proxmox containers over Cloudflare tunnels. Wish me luck, and that my wife and kids let me do this without constantly making shit up for me to do.
They also have a plugin for OPNsense (if you use that).
I used to, but moved on to a full Unifi infrastructure about 2 years ago.
Yeah, then you need to implement it at the webhost level.
Good luck, and if you need help drop by their Discord. They have an active community.
Can they help me keep my wife and kids at bay too? That’s what I need the most help with 😂
I don’t think asking for help with domestic issues on the Internet is healthy… However, who knows, maybe they can ( ͡~ ͜ʖ ͡°)
I had to pull an all-nighter to fix an unoptimized query. I had just launched a new website with barely any visitors and hadn’t implemented caching yet for something I thought no one would use anyway, but a bot found it and broke my entire DB by hitting the endpoint again and again until nothing worked anymore.
- Get a blocklist
- Enable rate limits (rough sketch below)
- Get a proper robots.txt
- Profit
- Silence
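For the rate-limits step, here’s a minimal sketch of the idea (not anyone’s actual setup; the limits, the `allow()` helper, and the demo IP are made up): a per-IP token bucket you could put in front of an endpoint. In practice you’d usually do this at the reverse proxy or CDN rather than in app code, but the principle is the same.

```python
# Minimal per-IP token-bucket rate limiter (illustrative sketch only;
# the rates and the allow() API are hypothetical, not from this thread).
import time
from collections import defaultdict

RATE = 1.0   # tokens refilled per second, per client
BURST = 10.0 # bucket size: how many requests a client may burst

_buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

def allow(client_ip: str) -> bool:
    """Return True if this request is within the client's budget."""
    b = _buckets[client_ip]
    now = time.monotonic()
    # Refill tokens for the time elapsed since the client's last request.
    b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
    b["last"] = now
    if b["tokens"] >= 1.0:
        b["tokens"] -= 1.0
        return True
    return False  # caller should respond with HTTP 429

if __name__ == "__main__":
    # Quick demo: a bot hammering from one IP gets cut off after the burst.
    for i in range(15):
        print(i, allow("203.0.113.7"))
```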
Can you just turn the robots.txt into a click wrap agreement to charge robots high fees for access above a certain threshold?
why bother with an agreement when you can serve a zip bomb :D
Puts the full EU regulations in robots.txt
fracking clankers.
It’s a shame we don’t have those banner ad schemes anymore. Cybersquatting could be a viable income stream if you could convince the clankers to click banner ads for a fraction of a penny each.
I don’t know what “12,181+181” means (edit: thanks @Thunraz@feddit.org, see Edit 1) but absolutely not 1.2181 × 10^185. That many requests can’t be made within the 39 × 10^9 bytes of bandwidth; in fact, they exceed the number of atoms on Earth times its age in microseconds (that’s close to 10^70). Also, “0+57” in another row would be dubious exponential notation: the exponent should be 0 (or omitted) if the mantissa (and thus the value represented) is 0.
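To make that bound explicit (my arithmetic, assuming a bare minimum of one byte per request):

$$N_{\max} \le \frac{39 \times 10^{9}\ \text{bytes}}{1\ \text{byte per request}} = 3.9 \times 10^{10}\ \text{requests} \ll 1.2181 \times 10^{185}$$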
My little brain broke when I started trying to figure out how big that number was… thanks for breaking it down even more intuitively. Yeah, it is way too large to have been correct!
This is why I use CloudFlare. They block the worst and cache for me to reduce the load of the rest. It’s not 100% but it does help.
LOL Someone took exception to your use of Cloudflare. Hilarious. Anyways, yeah, what Cloudflare doesn’t get, pfSense does.
What is the blog about? It may be increased interest as search providers use them for normal searches now… or it could be a couple of already sentient doombots.
Please don’t be a blog about von Neumann probes. Please don’t be a blog about von Neumann probes. Please don’t be a blog about von Neumann probes…
What’s wrong with blogs about von Neumann probes? Genuinely curious!
If an AI read it several thousand times… I thought the joke was a bit too on the nose, sorry.
lol that’s funny. I guess I’m just slow
I want to search for a blog on this now…
Hydrogen bomb vs coughing baby type shit
What is that log analysis tool you are using in the picture? Looks pretty neat.
It’s a mix; I put two screenshots together. On the left is my monthly bandwidth usage from cPanel, and on the right is AWStats (though I hid some sections so the Robots/Spiders section was closer to the top).
AWStats
I thought I recognized it. Hell of a blast from the past, haven’t seen it in fifteen years at least.
I think they’re winding down the project unfortunately, so I might have to get with the times…
I mean, I thought it was long dead. It’s twenty-five years old, and the web has changed quite a bit in that time. No one uses Perl anymore, for starters. I had moved on to Open Web Analytics, Webalizer, or some such by 2008 or so. I remember Webalizer being snappy as heck.
I tinkered with log analysis myself back then, peeking into the source of AWStats and others. I learned that one humongous regexp with like two hundred alternative matches for the user-agent string was way faster than trying to match them individually, which of course makes sense, seeing as regexps work as state machines in a sort of very specialized VM. My first attempts, in comparison, were laughably naive and slow. Ah, what a time.
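A rough sketch of what that looks like (the bot names are a tiny made-up sample, nowhere near AWStats’ real list): compile one alternation once and scan each user-agent string a single time, instead of scanning it once per pattern.

```python
# Sketch of "one big alternation" vs. matching patterns one by one.
# The bot list here is a small made-up sample for illustration.
import re

BOT_PATTERNS = ["googlebot", "bingbot", "yandexbot", "gptbot", "ccbot", "semrushbot"]

# Naive approach: one regex scan of the user-agent per pattern.
individual = [re.compile(p, re.IGNORECASE) for p in BOT_PATTERNS]

def is_bot_naive(user_agent: str) -> bool:
    return any(r.search(user_agent) for r in individual)

# Combined approach: a single alternation, compiled once, scanned once.
combined = re.compile("|".join(BOT_PATTERNS), re.IGNORECASE)

def is_bot_combined(user_agent: str) -> bool:
    return combined.search(user_agent) is not None

if __name__ == "__main__":
    ua = "Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)"
    print(is_bot_naive(ua), is_bot_combined(ua))  # True True
```

Whether the single alternation actually wins depends on the regex engine, but the scan-the-string-once idea is what the comment above is describing.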
Sure enough, working on a high-traffic site later taught me that it’s way more efficient to prepare data for reading at the moment it changes instead of when it’s being read, which translates to analyzing visits on the fly and writing them to an optimized store like Elasticsearch.
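For what it’s worth, a minimal sketch of that write-time approach (the index name, field names, log format, and localhost URL are all assumptions, using the elasticsearch Python client’s 8.x-style index call):

```python
# Sketch: turn each access-log line into a document as it arrives,
# so reads (dashboards, reports) never have to re-parse raw logs.
import re
from datetime import datetime
from elasticsearch import Elasticsearch  # pip install elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local instance

# Rough regex for the Apache/nginx "combined" log format.
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def ingest(line: str) -> None:
    """Parse one log line and index it immediately (analysis at write time)."""
    m = LINE.match(line)
    if not m:
        return
    doc = m.groupdict()
    doc["bytes"] = 0 if doc["bytes"] == "-" else int(doc["bytes"])
    doc["status"] = int(doc["status"])
    doc["@timestamp"] = datetime.strptime(doc.pop("ts"), "%d/%b/%Y:%H:%M:%S %z")
    es.index(index="visits", document=doc)
```

Reports then become plain queries or aggregations against the index, instead of batch-crunching raw log files after the fact.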
Downloading your wallpapers? Lol, what for?
That’s insane… Can’t a website owner require bots (at least the ones identifying themselves as such) to prove they’re affiliated with a certain domain?
Had the same thing happen on one of my servers. Got up one day a few weeks ago and the server was suspended (luckily the hosting provider unsuspended it for me quickly).
It’s mostly business sites, but we do have an old personal blog on there with a lot of travel pictures on it, and 4 or 5 AI bots were just pounding it. Went from a 300GB-per-month average to 5TB in August, and 10 or 11TB in September and October.







