Gary Illyes from Google posted a new PSA on LinkedIn saying that the most common reason a website suddenly blocks Googlebot from crawling is a misconfigured firewall or CDN.

Gary wrote, "check what traffic your firewalls and CDN are blocking. By far the most common issue in my inbox is related to firewalls or CDNs blocking googlebot traffic. If I reach out to the blocking site, in the vast majority of the cases the blockage is unintentional."
So what can you do? Gary said, "I've said this before, but want to emphasize it again: make a habit of checking your block rules. We publish our IP ranges so it should be really easy to run an automation that checks the block rules against the googlebot subnets."
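That automation can be pretty simple. Here is a minimal sketch in Python: it downloads Google's published Googlebot IP ranges (Google publishes these as a JSON file; the URL below is the one currently documented by Google, but verify it against the help document Gary linked) and flags any firewall block rule that overlaps a Googlebot subnet. The sample block rules are hypothetical, for illustration only.

```python
# Sketch: check firewall/CDN block rules against Googlebot's published IP ranges.
# Assumption: your block rules can be exported as a list of CIDR strings.
import ipaddress
import json
import urllib.request

# URL as currently documented by Google Search Central; confirm before relying on it.
GOOGLEBOT_RANGES_URL = "https://developers.google.com/static/search/apis/ipranges/googlebot.json"

def fetch_googlebot_networks(url=GOOGLEBOT_RANGES_URL):
    """Download and parse Googlebot's published IPv4/IPv6 prefixes."""
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    nets = []
    for entry in data.get("prefixes", []):
        prefix = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if prefix:
            nets.append(ipaddress.ip_network(prefix))
    return nets

def blocked_googlebot_ranges(block_rules, googlebot_nets):
    """Return (rule, googlebot_net) pairs where a block rule overlaps a Googlebot subnet."""
    hits = []
    for rule in block_rules:
        rule_net = ipaddress.ip_network(rule, strict=False)
        for net in googlebot_nets:
            if rule_net.version == net.version and rule_net.overlaps(net):
                hits.append((rule, str(net)))
    return hits

if __name__ == "__main__":
    # Hypothetical block rules exported from a firewall or CDN config.
    rules = ["66.249.66.0/27", "203.0.113.0/24"]
    for rule, net in blocked_googlebot_ranges(rules, fetch_googlebot_networks()):
        print(f"WARNING: block rule {rule} overlaps Googlebot range {net}")
```

Run on a schedule (cron, CI job), this gives you the habit Gary describes without any manual checking.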
Gary linked to this help document for more details.
In short, do what you can to check whether your site is accessible to Googlebot. You can use the URL Inspection tool in Google Search Console, as one method. Also, confirm with your CDN or firewall company that they are allowing Googlebot, and ask them to prove it.
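On the flip side, if your firewall logs show a crawler claiming to be Googlebot, you can verify the claim yourself with the forward-confirmed reverse DNS check that Google documents: resolve the IP to a hostname, confirm it ends in googlebot.com or google.com, then resolve that hostname back to the original IP. A minimal sketch:

```python
# Sketch: verify that a requesting IP really belongs to Googlebot using
# forward-confirmed reverse DNS, the method documented by Google.
import socket

def has_google_host_suffix(host):
    """True if the reverse-DNS hostname is under a Google crawler domain."""
    return host.endswith(".googlebot.com") or host.endswith(".google.com")

def is_verified_googlebot(ip):
    """Reverse-resolve the IP, check the domain, then forward-confirm it."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except (socket.herror, socket.gaierror):
        return False
    if not has_google_host_suffix(host):
        return False
    try:
        # Forward lookup must return the original IP, or the PTR record is spoofed.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

This is useful both for confirming that real Googlebot traffic is getting through and for spotting fake crawlers spoofing the Googlebot user agent.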
Forum discussion at LinkedIn.