I have read a little about robots.txt, and I read that I should disallow all folders in my web application, but I would like to allow bots to read the main page and one view (the URL is, for example, www.mywebapp/searchresults; it is a CodeIgniter route, called from a function in application/controllers).
The folder structure, for example, is:
-index.php (should be readable by bots)
-application
  -controllers
    -controller (contains the function which loads the view)
  -views
-public
Should I create a robots.txt file like this:
User-agent: *
Disallow: /application/
Disallow: /public/
Allow: /application/controllers/function
or using routes something like
User-agent: *
Disallow: /application/
Disallow: /public/
Allow: /www.mywebapp/searchresults
or maybe using views?
User-agent: *
Disallow: /application/
Disallow: /public/
Allow: /application/views/search/index.php
Thanks!
Answering my own old question:
When we want to allow bots to read some page, we need to use its URL (the route), not the file path. Note that paths in robots.txt are relative to the site root, so the domain name is not included:
Allow: /searchresults
In some cases we can also keep individual pages out of the index with an HTML meta tag (added to the page's head section):
<meta name="robots" content="noindex, nofollow">
When we want to block a folder, e.g. one with pictures, just do:
Disallow: /public/images
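A quick way to sanity-check rules like these before deploying is Python's standard urllib.robotparser, which applies Allow/Disallow matching much like real crawlers do. A sketch, assuming the example host www.mywebapp and the routes from this question:

```python
from urllib.robotparser import RobotFileParser

# The combined rules discussed above: allow the searchresults route,
# block the internal CodeIgniter folders.
rules = """\
User-agent: *
Allow: /searchresults
Disallow: /application/
Disallow: /public/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The routed page and the main page are crawlable...
print(rp.can_fetch("*", "http://www.mywebapp/searchresults"))  # True
print(rp.can_fetch("*", "http://www.mywebapp/"))               # True
# ...while the internal folders are not.
print(rp.can_fetch("*", "http://www.mywebapp/application/controllers/search"))  # False
print(rp.can_fetch("*", "http://www.mywebapp/public/images/logo.png"))          # False
```

This also confirms that the file path under application/ never matters to crawlers; they only ever see the routed URLs.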