Add a /robots.txt file, both to get rid of error messages
(requests to this file resulting in 404 Not Found) and
to make sure at least some robots don't crawl under
/source/ too much (our cgiweb stuff causes high load).

author    openssl <openssl>    Tue, 23 Mar 1999 07:38:54 +0000 (07:38 +0000)
committer openssl <openssl>    Tue, 23 Mar 1999 07:38:54 +0000 (07:38 +0000)

robots.txt [new file with mode: 0644]

diff --git a/robots.txt b/robots.txt
new file mode 100644
index 0000000..5ca672c
--- /dev/null
+++ b/robots.txt
@@ -0,0 +1,8 @@
+##
+##  robots.txt -- Robot Exclusion Standard config file
+##
+
+User-agent: *
+Disallow: /source/cvs/
+Disallow: /source/exp/
+
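
As a quick sanity check (not part of this commit), the rules added above
can be exercised with Python's standard-library urllib.robotparser. This is
a minimal sketch; the openssl.org URLs and the sample tarball name are
illustrative only:

from urllib.robotparser import RobotFileParser

# Feed the parser the exact rules added in this commit,
# rather than fetching /robots.txt over the network.
rules = [
    "User-agent: *",
    "Disallow: /source/cvs/",
    "Disallow: /source/exp/",
]

rp = RobotFileParser()
rp.parse(rules)

# Crawling under /source/cvs/ and /source/exp/ is refused ...
# (URLs and file name below are made up for illustration)
assert not rp.can_fetch("*", "http://www.openssl.org/source/cvs/")
assert not rp.can_fetch("*", "http://www.openssl.org/source/exp/foo.tar.gz")
# ... while the rest of the site stays crawlable.
assert rp.can_fetch("*", "http://www.openssl.org/source/")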