Hi all,
I have one particular customer whose site is getting flooded by Googlebot with requests for nonexistent URLs.
I have tried telling Google not to index the site using meta tags and a Disallow: rule in robots.txt, and in Google Search Console I requested a cache clear and a temporary removal of the entire domain. None of it worked.
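For reference, this is roughly what is already in place (a blanket disallow and a noindex meta tag; treat these as sketches of my setup rather than exact copies):

    # robots.txt at the site root
    User-agent: *
    Disallow: /

    <!-- in each page's <head> -->
    <meta name="robots" content="noindex, nofollow">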
I still see hundreds of requests 24x7 in my Apache status, as the attached picture shows, some of them holding keepalive timers.
Blocking Googlebot IPs at the firewall works, but it is NOT an option since this is a shared server; it would harm the other users on the server by preventing their sites from being crawled.
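The firewall test was roughly a single drop rule like this (66.249.64.0/19 is my assumption for the Googlebot range seen in the logs):

    iptables -I INPUT -s 66.249.64.0/19 -j DROP

which works, but obviously drops Googlebot for every site on the box.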
How can I improve this situation? Perhaps a ModSecurity rule denying the Googlebot /24 subnet or its User-Agent for that particular domain only? If so, what would the rule look like?
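To make the question concrete, this untested sketch is the sort of thing I have in mind; the domain, rule IDs, and IP range are placeholders:

    # Deny Googlebot's User-Agent, only for this customer's domain
    SecRule REQUEST_HEADERS:Host "@streq customer-domain.example" \
        "id:1000101,phase:1,deny,status:403,log,msg:'Googlebot blocked for this vhost',chain"
        SecRule REQUEST_HEADERS:User-Agent "@contains Googlebot"

    # Variant: deny by source range instead of User-Agent
    SecRule REQUEST_HEADERS:Host "@streq customer-domain.example" \
        "id:1000102,phase:1,deny,status:403,log,chain"
        SecRule REMOTE_ADDR "@ipMatch 66.249.64.0/19"

If the rules can live inside the customer's own vhost config, I suppose the Host-matching chain could be dropped entirely. Would something like this be the right approach?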
Please help
Attachment: Apache status screenshot (159.7 KB)