I have a website deployed on GAE. The site has a purchased custom domain, but of course it can also be reached at the standard domain app_id.appspot.com, and on top of that at version_id.app_id.appspot.com. More than that, if you enter abrakadabra.app_id.appspot.com you land on the default version.
So the Google robot somehow found my versions 1 and 2. For SEO this is not very helpful :(. Plus, all the robots started coming to the site more often (increased load), and quotas are used up quickly. Maybe someone has already encountered this problem and can tell me the solution?
The best solution is to create a filter on the URL /robots.txt and, for the versioned hosts, serve text like this:
User-agent: *
Disallow: /
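A minimal sketch of such a filter, assuming the Java runtime (the answer speaks of a URL filter); the class name and the ".appspot.com" host check are illustrative, and the filter would still have to be mapped to the /robots.txt URL pattern in web.xml:

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;

    public class RobotsFilter implements Filter {

        @Override
        public void doFilter(ServletRequest req, ServletResponse res,
                             FilterChain chain) throws IOException, ServletException {
            // The default and versioned hosts all end with .appspot.com;
            // the purchased custom domain does not.
            if (req.getServerName().endsWith(".appspot.com")) {
                // Non-canonical host: tell every crawler to stay away.
                res.setContentType("text/plain");
                res.getWriter().print("User-agent: *\nDisallow: /\n");
            } else {
                // Custom domain: fall through to the real robots.txt.
                chain.doFilter(req, res);
            }
        }

        @Override public void init(FilterConfig cfg) throws ServletException {}
        @Override public void destroy() {}
    }

With this in place, every *.appspot.com host (including version_id.app_id.appspot.com) serves the disallow-all file, while the purchased domain keeps its normal robots.txt.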
The Google crawler no longer comes to the versioned hosts! :)