I generated a random 16-character (upper/lowercase) subdomain and set up a virtual host for it in Apache, and within an hour I was seeing vulnerability scans against it.
How are folks digging this up? What’s the strategy to avoid this?
I am serving it all with a single wildcard SSL cert, if that’s relevant.
Thanks
Edit:
- I am using a single wildcard cert, with no subdomains attached/embedded/however those work
- I don’t have any subdomains registered with DNS.
- I attempted a zone transfer (dig axfr example.com @ns1.example.com); it returned zone transfer DENIED
Edit 2: I’m left wondering: is there an Apache endpoint that returns all configured virtual hosts?
Edit 3: I’m going to go through this hardening guide and try again with a new random subdomain https://www.tecmint.com/apache-security-tips/


Crawlers typically crawl by IP.
Are you sure they’re not just using the IP?
You need to explicitly configure the server to drop the connection when the request comes in with an invalid/unknown domain.
I use a similar pattern and get 0 crawls.
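A minimal sketch of that pattern in Apache 2.4: the first <VirtualHost> listed for an address:port acts as the default, so requests whose Host header matches no ServerName land there. Deny everything in it and your real vhosts only see traffic that already knows the name. (The ServerName and cert paths below are placeholders, not from the thread.)

```apache
# Catch-all default vhost -- must be loaded BEFORE your real vhosts,
# since Apache falls back to the first <VirtualHost> for the port
# when no ServerName/ServerAlias matches the request's Host header.
<VirtualHost *:443>
    ServerName catchall.invalid                          # placeholder name
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/wildcard.pem    # assumed cert path
    SSLCertificateKeyFile /etc/ssl/private/wildcard.key  # assumed key path

    # Refuse everything that lands here (403 for any path)
    <Location "/">
        Require all denied
    </Location>
</VirtualHost>

# Optionally, mod_ssl can also reject TLS handshakes whose SNI name
# matches no configured vhost:
# SSLStrictSNIVHostCheck on
```

With this in place, IP-based scanners and Host-header guessing get a 403 from the catch-all instead of ever reaching the randomly named vhost.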
+1 for dropped connections on invalid domains. Or hell, redirect them to something stupid like ooo.eeeee.ooo just so you can check your redirect logs and see what kind of BS the bots are up to.
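Assuming mod_rewrite is enabled, the redirect variant could be a couple of lines inside that same catch-all vhost (the target host is the joke one from the comment above; everything else is illustrative):

```apache
# Inside the catch-all <VirtualHost> that receives unknown Host headers:
RewriteEngine on
# Bounce any request that reached this default vhost to a junk hostname,
# so bot activity shows up clearly in the redirect logs.
RewriteRule ^ http://ooo.eeeee.ooo/ [R=302,L]
```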
Is this at the webserver level?
It can be both server and DNS provider. For instance, Cloudflare allows you to set rules for what traffic is allowed, and you can set it to automatically drop traffic for everything except your specific subdomains. I also have mine set to ban an IP after 5 failed subdomain attempts. That alone will do a lot of heavy lifting, because it ensures your server is only getting hit with requests that have already figured out a working subdomain.
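A sketch of the allow-list part as a Cloudflare WAF custom rule expression, with the action set to Block (the hostnames here are placeholders, not from the thread, and the exact dashboard labels may vary):

```
# Block any request whose Host is not one of the known-good names.
# "example.com" / "app.example.com" are placeholder hostnames.
not (http.host in {"example.com" "app.example.com"})
```

The IP ban after repeated failures is a separate rate-limiting rule keyed on the client IP; Cloudflare configures that in its rate-limiting section rather than in the expression itself.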
Personally, I see a lot of hacking attempts aimed at my main www subdomain, looking for WordPress. Luckily, I don’t run WordPress. But the bots are 100% out there, just casually scanning for WordPress vulnerabilities.