I've recently inherited a codebase and discovered this gem:
{% if PAGE_EXTRAS.hide_from_sitemap %}
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">
<META NAME="ROBOTS" CONTENT="INDEX, NOFOLLOW">
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
{% endif %}
I have no idea what it's trying to do. Is there a reason you'd put multiple, apparently conflicting robots tags on a page like that? Or is it as insane as it looks to my uninformed eye?
This looks like a mistake to me. The only guidance I could find on conflicting directives is in Google's robots meta tag specification:
If competing directives are encountered by our crawlers we will use the most restrictive directive we find.
So (for Google, at least) the code:
<meta name="robots" content="noindex, follow">
<meta name="robots" content="index, nofollow">
<meta name="robots" content="noindex, nofollow">
does exactly the same thing as:
<meta name="robots" content="noindex, nofollow">
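So if the goal really is just to keep these pages out of search results, the whole block can be collapsed to a single tag. A minimal sketch, keeping the PAGE_EXTRAS.hide_from_sitemap flag from your template as-is:

{% if PAGE_EXTRAS.hide_from_sitemap %}
<meta name="robots" content="noindex, nofollow">
{% endif %}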
It's conceivable this code is some sort of sneaky hack meant to apply different rules to different crawlers by exploiting differences in how they resolve conflicts. If so, it's a terrible idea, IMHO: there's no need for a messy, fragile hack when there's already a legitimate mechanism for targeting specific crawlers:
<meta name="googlebot" content="noindex, follow">
<meta name="bingbot" content="index, nofollow">
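(Any crawler you don't name by its specific meta name just ignores those tags and falls back to the generic robots tag if one is present, or to the default of index, follow if not.)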