Tags: google-search, google-search-console, x-robots-tag

Google Search Console throws error in UI: 'noindex' detected in 'X-Robots-Tag' http header


While trying to crawl my site in Google Search Console, I see the following error on each and every one of my pages: "'noindex' detected in 'X-Robots-Tag' http header".

I updated the X-Robots-Tag header to the following: X-Robots-Tag: usasearch all; googlebot all; none, and verified with Google's Robots Testing tool that the change is live and that both Googlebot and Googlebot-Mobile are allowed. Also, here is what a server header checker tool shows:

Final response 
< HTTP/1.1 200 OK
< Server: nginx
< Content-Type: text/html; charset=UTF-8
< Transfer-Encoding: chunked
< Connection: keep-alive
< Keep-Alive: timeout=300
< Vary: Accept-Encoding, 
< Cache-Control: must-revalidate, no-cache, private
< Date: Fri, 27 Sep 2019 21:04:41 GMT
< X-Drupal-Dynamic-Cache: UNCACHEABLE
< Link: ; rel="shortlink", ; rel="canonical"
< X-UA-Compatible: IE=edge
< Content-language: en
< X-Content-Type-Options: nosniff
< X-Frame-Options: SAMEORIGIN, SAMEORIGIN
< Expires: Sun, 19 Nov 1978 05:00:00 GMT
< X-XSS-Protection: 1; mode=block
< Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
< X-Robots-Tag: usasearch all; googlebot all; none
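
For anyone who would rather script this check than rely on an online header checker, here is a minimal Python sketch using only the standard library. The URL is a placeholder, and the output just mimics the dump above; it is a way to reproduce the check, not a Google tool.

import urllib.request

# Hypothetical URL; replace with the page you want to check.
url = "https://example.com/"

# Issue a plain GET and print the response headers, mirroring what a
# server header checker tool reports.
request = urllib.request.Request(url, method="GET")
with urllib.request.urlopen(request) as response:
    for name, value in response.getheaders():
        print(f"< {name}: {value}")

    # The header Google Search Console cares about here.
    robots = response.headers.get("X-Robots-Tag")
    print("X-Robots-Tag seen by crawlers:", robots)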

Yet no matter how many times I use the URL Inspection tool, I get the same error. I'm not sure what else I need to do. Should I just wait a while, since the change was made recently?

Any suggestions?


Solution

  • It appears that Google applies the most restrictive directive present in the X-Robots-Tag header, in this case none, which is equivalent to noindex, nofollow. After removing that directive, Search Console was able to reach and index the page correctly. (See the sketch below for a quick way to spot restrictive directives before deploying.)
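
A small Python sketch to flag restrictive directives in a header value before it goes live. The splitting rules here are an assumption based on the header shown above (semicolon- or comma-separated tokens with optional per-bot prefixes); this is only an illustration, not Google's actual parser.

# Directives that tell crawlers not to index a page. "none" is shorthand
# for "noindex, nofollow", so its presence alone triggers the error.
RESTRICTIVE = {"none", "noindex"}

def restrictive_directives(header_value: str) -> set:
    """Return any restrictive directives found in an X-Robots-Tag value."""
    # Split on both commas and semicolons, since both separators appear here.
    tokens = header_value.replace(";", ",").split(",")
    found = set()
    for token in tokens:
        # Drop an optional user-agent prefix such as "googlebot:".
        directive = token.split(":")[-1].strip().lower()
        if directive in RESTRICTIVE:
            found.add(directive)
    return found

# The original header: the trailing "none" is what Google acted on.
print(restrictive_directives("usasearch all; googlebot all; none"))  # {'none'}
# After removing "none", nothing restrictive remains.
print(restrictive_directives("usasearch all; googlebot all"))        # set()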