add robots.txt to stop crawlers delving into lookup/ and doing searches etc.
author	Ian Jackson <ijackson@chiark.greenend.org.uk>
	Tue, 7 Sep 2010 19:03:09 +0000 (20:03 +0100)
committer	Ian Jackson <ijackson@chiark.greenend.org.uk>
	Tue, 7 Sep 2010 19:03:09 +0000 (20:03 +0100)
yarrg/web/robots.txt [new file with mode: 0755]

diff --git a/yarrg/web/robots.txt b/yarrg/web/robots.txt
new file mode 100755 (executable)
index 0000000..38a5ff7
--- /dev/null
+++ b/yarrg/web/robots.txt
@@ -0,0 +1,11 @@
+<%flags>
+inherit => undef
+</%flags>
+<%perl>
+$r->content_type('text/plain');
+</%perl>
+User-Agent: *
+Disallow: /lookup
+Disallow: /test/code/lookup
+Disallow: /test/data/lookup
+Disallow: /test/both/lookup
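
The new file is an HTML::Mason component rather than a static file: the `inherit => undef` flag stops any site-wide autohandler from wrapping the output in HTML, and the `<%perl>` block sets the Apache response's content type so the page is served as plain text. Assuming a stock Mason setup, a crawler fetching /robots.txt would then see output along these lines (a sketch; the exact whitespace depends on how Mason trims the flag and perl blocks):

```
User-Agent: *
Disallow: /lookup
Disallow: /test/code/lookup
Disallow: /test/data/lookup
Disallow: /test/both/lookup
```

This blocks compliant crawlers from the lookup pages (and their /test/ variants), which is where the expensive search handling lives, while leaving the rest of the site crawlable.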