We have a website that makes expensive calls to a back-end system to display product availability. I'd like to eliminate these calls for page views that don't come from actual customers. My first thought was to filter on the user agent and, if the requester is a spider / search engine crawler, display a "Call for availability" message (the same message we would show if the back-end systems were down for maintenance or otherwise unavailable) rather than make a call to the back end for real availability.
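For what it's worth, the check I have in mind is roughly the sketch below (Express-style; the crawler regex and the `getAvailability` helper are placeholders, not our actual code):

```js
// Rough sketch only - the crawler pattern and getAvailability() are illustrative.
const express = require('express');
const app = express();

const CRAWLER_RE = /bot|crawler|spider|slurp/i;

app.get('/product/:id', async (req, res) => {
  const ua = req.get('User-Agent') || '';
  let availability;

  if (CRAWLER_RE.test(ua)) {
    // Crawler: skip the expensive back-end call entirely
    availability = { label: 'Call for availability' };
  } else {
    // Real visitor: query the back end as usual
    availability = await getAvailability(req.params.id); // hypothetical helper
  }

  res.render('product', { availability });
});
```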
In discussions with folks, there seems to be a lot of concern that the availability icon (a very small icon, mind you) would be different when the page is crawled vs. when a user views it, and that we might be penalized by the search engines for cloaking.
As the information we are displaying is a very small image icon, and we are not offering drastically different content to the search engines vs. live users, I really don't see cloaking as an issue - but I'd like to get some outside perspective.
Is simulating an "information not available" scenario for search engines acceptable practice when the overall content of the page does not change, or would it still qualify in some way as cloaking?
Why don't you load the availability information with JavaScript / Ajax? That way, when the page is loaded by a non-JavaScript-enabled client (e.g. a search engine spider), the expensive call is never made.
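Roughly something like this (just a sketch; the `/availability` endpoint and the `availability` element id are made-up names):

```js
// Sketch only: '/availability' and the #availability element are hypothetical names.
// The server renders a static "Call for availability" placeholder inside #availability;
// this script then swaps in the real status for JavaScript-enabled visitors only.
document.addEventListener('DOMContentLoaded', function () {
  var el = document.getElementById('availability');
  var productId = el.getAttribute('data-product-id');

  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/availability?product=' + encodeURIComponent(productId));
  xhr.onload = function () {
    if (xhr.status === 200) {
      el.textContent = JSON.parse(xhr.responseText).label;
    }
    // On any error, the "Call for availability" placeholder simply stays in place.
  };
  xhr.send();
});
```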
Alternatively, you could put this information in an IFRAME on your page and exclude the framed page from indexing via robots.txt or a robots META tag.
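For example (the `/availability-widget` URL is only illustrative):

```html
<!-- On the product page: the availability widget lives in its own frame -->
<iframe src="/availability-widget?product=12345"
        width="120" height="40" frameborder="0" scrolling="no"></iframe>

<!-- In the <head> of /availability-widget: keep it out of the index -->
<meta name="robots" content="noindex, nofollow">
```

```
# robots.txt alternative: block crawlers from fetching the widget page at all
User-agent: *
Disallow: /availability-widget
```

Note that a robots.txt Disallow stops well-behaved crawlers from fetching the framed page in the first place (so the expensive back-end call never happens), while a noindex META tag only keeps a page that has already been fetched out of the index.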
Both approaches are completely "white hat", although I think the second is more so.