Got a warning that my blog went over 100GB of bandwidth this month… which sounded incredibly unusual. My blog is text and a couple of images, and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.
Edit: As Thunraz points out below, there’s a footnote that reads “Numbers after + are successful hits on ‘robots.txt’ files” — so it’s not scientific notation.
Edit 2: After doing more digging, the culprit is a post where I shared a few wallpapers for download. The bots have been downloading these wallpapers over and over, using 100GB of bandwidth in the first 12 days of November. That’s when my account was suspended for exceeding bandwidth (an artificial limit I put on there a while back and forgot about…), which is also why the ‘last visit’ for all the bots is November 12th.


I don’t know what “12,181+181” means (edit: thanks @Thunraz@feddit.org, see Edit 1), but it’s definitely not 1.2181 × 10^185. That many requests can’t fit within the 39 × 10^9 bytes of bandwidth; in fact, they exceed the number of atoms on Earth times its age in microseconds (which is on the order of 10^73). Also, “0+57” in another row would be dubious exponential notation: the exponent should be 0 (or omitted) when the mantissa (and thus the value represented) is 0.
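For scale, here’s a quick back-of-the-envelope sketch of that comparison in Python; the atom count (~1.33 × 10^50) and Earth’s age (~4.54 billion years) are the commonly cited round figures, not anything taken from the stats report:

```python
# Rough order-of-magnitude check, using commonly cited estimates
# (assumed values, not data from the AWStats screenshot).
atoms_on_earth = 1.33e50                      # ~1.33 x 10^50 atoms
age_in_microseconds = 4.54e9 * 3.156e7 * 1e6  # years -> seconds -> microseconds

upper_bound = atoms_on_earth * age_in_microseconds
print(f"atoms x age in microseconds ~ {upper_bound:.1e}")  # ~1.9e+73

# If "12,181+181" really were scientific notation, the request count would be:
claimed_requests = 1.2181e185

# Even at one byte per request, that dwarfs the ~39e9 bytes actually served.
print(f"claimed requests / physical upper bound ~ {claimed_requests / upper_bound:.1e}")  # ~6e+111
```

Either way you slice it, the figure is off by more than a hundred orders of magnitude, which is why the “+181” has to mean robots.txt hits rather than an exponent.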