We have a server running WHM that hosts a number of websites as virtual hosts.
We recently found that Google Analytics and Search Console can't access the sites' pages, apparently because Google can't fetch the robots.txt file.
The robots.txt file exists and is reachable from the browser.
My conclusion is that the WHM firewall, or something similar, is somehow blocking Google's access to www.website.com/robots.txt, but I can't see how this is happening. Google gives no useful detail, only that the request is met with a (5xx) error, yet the same URL loads perfectly in the browser.
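One theory I'd like to test is whether the server answers differently depending on the user agent, since that would explain a 5xx for Google alongside a clean load in the browser. A quick sketch of the comparison I have in mind (www.website.com is a placeholder, as above):

```python
import urllib.request

# Googlebot's published user-agent string
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url, user_agent):
    """Return the HTTP status code the server answers with for this user agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses land here; the status code is still what we want
        return e.code

# Compare what a browser-like client and a Googlebot-like client receive:
# fetch_status("https://www.website.com/robots.txt", "Mozilla/5.0")
# fetch_status("https://www.website.com/robots.txt", GOOGLEBOT_UA)
```

If the second call returns a 5xx while the first returns 200, something on the server (ModSecurity, a rewrite rule, a bot-protection layer) is keying on the user agent.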
I have cleared our extensive list of blocked IPs in the firewall (CSF) and checked that the port-flooding options are turned off (they are). I have also gone through the virtual host httpd.conf includes in Apache, and nothing there seems relevant.
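The next thing I plan to try is pulling Googlebot's failed requests out of the Apache access log, to see exactly which requests get the 5xx. A rough sketch, assuming the default combined log format (the sample lines and IPs are made up):

```python
import re

# Matches an Apache combined-log-format line, e.g.:
# 66.249.66.1 - - [10/Jan/2024:10:00:00 +0000] "GET /robots.txt HTTP/1.1" 503 199 "-" "...Googlebot..."
LOG_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] '      # client IP, identd, user, timestamp
    r'"(?:GET|HEAD) (\S+) [^"]*" '     # request method and path
    r'(\d{3}) \S+ '                    # status code and response size
    r'"[^"]*" "([^"]*)"'               # referer and user agent
)

def googlebot_errors(lines):
    """Yield (ip, path, status) for Googlebot requests answered with a 5xx."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group(4) and m.group(3).startswith("5"):
            yield m.group(1), m.group(2), m.group(3)
```

If Googlebot's requests never appear in the access log at all, that would point at a firewall or connection-level block rather than Apache itself.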
I'm not certain what I'm looking for, but it's something that's causing Google, and only Google, to be denied by the server.
What am I missing? Where else can I look? I'm out of ideas. I suspect something automated is denying Google's bots access to the server, but I can't make out what it is. Perhaps some rule is denying access to non-HTML files, although those work in the browser.
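If I do find Googlebot entries in the logs, I also want to confirm the IPs really belong to Google before drawing conclusions, using the reverse-then-forward DNS check Google documents for verifying Googlebot. A minimal sketch:

```python
import socket

def is_googlebot(ip):
    """Verify an IP the way Google recommends: reverse DNS must resolve to a
    googlebot.com or google.com hostname, and forward DNS on that hostname
    must map back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # gethostbyname_ex returns (hostname, aliases, ip_list)
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

That would at least rule out fake crawlers spoofing Googlebot's user agent as the source of the noise.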