Googlebot cannot access CSS and JS files on your website? How to fix this issue…
This week, Google Search Console (previously Webmaster Tools) sent error messages to a vast number of website owners, warning them that "Googlebot cannot access CSS and JS files" on their sites.
Have you received this Googlebot error within the Google Search Console?
What does this mean?
This error means that the robots Google uses to crawl your website, in order to determine where it should rank within its search results, cannot properly access every area of your site. If Googlebot is blocked from your CSS and JS files, it cannot fully render your pages, which can harm your rankings.
How can I fix this issue?
Firstly, check your robots.txt file, which can normally be accessed at (yourdomain)/robots.txt. You will probably find that the folder containing your CSS and JS files has been disallowed. On WordPress sites, for example, these files are contained within the wp-includes folder, so in the robots.txt file you would expect to see the line "Disallow: /wp-includes/".

Removing this disallow rule from your robots.txt file should resolve the Search Console error and allow Google's robots to crawl these parts of your website. You can then carry out the methods listed in the email from Google Search Console to confirm that the site has updated and the error no longer applies.
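If you want to check the effect of a rule before and after editing, you can test it programmatically. The sketch below uses Python's standard `urllib.robotparser` module against a hypothetical WordPress-style robots.txt (the domain and file path are illustrative, not taken from any real site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the wp-includes folder
blocked_rules = """
User-agent: *
Disallow: /wp-includes/
""".splitlines()

# The same file with the disallow rule removed (empty Disallow = allow all)
fixed_rules = """
User-agent: *
Disallow:
""".splitlines()

# An example CSS file inside the blocked folder
css_url = "https://example.com/wp-includes/css/style.css"

parser = RobotFileParser()
parser.parse(blocked_rules)
print(parser.can_fetch("Googlebot", css_url))  # False: Googlebot is blocked

parser = RobotFileParser()
parser.parse(fixed_rules)
print(parser.can_fetch("Googlebot", css_url))  # True: Googlebot can now crawl the CSS
```

This mirrors what Googlebot does when it reads your robots.txt: once the "Disallow: /wp-includes/" line is gone, the CSS and JS files in that folder become crawlable.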
I’m having trouble testing my site and the error hasn’t been resolved?
If you’re still having issues with your website or the Google error, or if your website isn’t within WordPress and you’re unsure how to resolve the issue, please contact the experts at Varn for help.