Stop Google from indexing [closed]
Use a robots.txt file:

User-agent: *
Disallow: /

This will block all compliant search bots from crawling the site. For more info see: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40360
The best way is to set static_url_path to the root URL:

from flask import Flask
app = Flask(__name__, static_folder="static", static_url_path="")
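A minimal sketch of how this plays out, assuming a file named robots.txt sits in the project's static folder (the filename and folder are illustrative):

```python
# Sketch: with static_url_path="", files in the static folder are served
# from the site root, so static/robots.txt answers requests for /robots.txt
# (rather than /static/robots.txt, which crawlers would never look at).
from flask import Flask

app = Flask(__name__, static_folder="static", static_url_path="")

if __name__ == "__main__":
    app.run()
```

With the default static_url_path, the file would only be reachable at /static/robots.txt, which crawlers do not request.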
Here's a solution if you want to disallow query strings:

Disallow: /*?*

or, if you want to be more precise about the query string:

Disallow: /*?dir=*&order=*&p=*

You can also add an Allow directive to robots.txt to permit a specific URL:

Allow: /new-printer$

The $ makes sure that only /new-printer is allowed, and not e.g. /new-printers. More info: http://code.google.com/web/controlcrawlindex/docs/robots_txt.html
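To see how such patterns behave, here is a small sketch of Googlebot-style wildcard matching, where * matches any run of characters and a trailing $ anchors the end of the path (an assumption based on Google's documented extensions; parsers that follow only the original specification ignore these wildcards):

```python
import re


def robots_pattern_to_regex(pattern: str) -> "re.Pattern":
    """Translate a robots.txt path pattern into a compiled regex.

    '*' matches any sequence of characters; a trailing '$' anchors the
    match at the end of the path; otherwise the rule is a prefix match.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then re-enable '*' as a wildcard.
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))


# The '$' rule matches /new-printer exactly, not /new-printers:
print(bool(robots_pattern_to_regex("/new-printer$").match("/new-printer")))   # True
print(bool(robots_pattern_to_regex("/new-printer$").match("/new-printers")))  # False

# The query-string rule matches any path containing a '?':
print(bool(robots_pattern_to_regex("/*?*").match("/catalog?dir=asc")))        # True
print(bool(robots_pattern_to_regex("/*?*").match("/catalog/plain")))          # False
```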
The module documentation for robotparser and its Python 3 counterpart, urllib.robotparser, mentions that they use the original specification. This specification does not have an Allow directive; that is a non-standard extension. Some major crawlers support it, but you (obviously) don't have to support it to claim compliance.
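For reference, a short sketch of querying urllib.robotparser against an in-memory rule set (the bot name and URLs are illustrative):

```python
# Sketch: parse a User-agent/Disallow rule set with urllib.robotparser
# and check paths with can_fetch().
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyBot", "https://example.com/public/page"))   # True
```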