Wikipedia:Spiders and bots policy
This page is currently inactive and is retained for historical reference. Either the page is no longer relevant or consensus on its purpose has become unclear. To revive discussion, seek broader input via a forum such as the village pump.
Search engine spiders
A huge portion of our traffic comes from referrals from search engines. Obviously, we want search engines to spider the site and index us! These spiders are usually well-behaved: they take their time, obey the robots.txt file, and don't stress the server too much.
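As a sketch of the kind of guidance robots.txt gives such spiders (the paths and delay here are illustrative assumptions, not Wikipedia's actual rules):

```
# Hypothetical robots.txt fragment -- example paths only
User-agent: *
# Keep crawlers away from database-stressing dynamic pages
Disallow: /w/
# Ask crawlers to pause between requests (non-standard directive,
# honored by some spiders)
Crawl-delay: 10
```

Well-behaved spiders fetch this file first and honor its restrictions; the misbehaving spiders discussed below do not.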
Other spiders
Other well-behaved spiders are also welcome.
But from time to time spiders are not friendly: they ignore the restrictions in robots.txt, overuse database-stressing dynamic functions, or request pages at so great a rate that legitimate human users cannot get at the database. If a spider is found to be causing that kind of trouble, its IP address will be banned from accessing the site. Currently so banned are:
- 144.167.21.15
- 192.153.22.246
- 202.69.76.19
- 212.27.33.00/24
- 24.130.248.43
- 194.209.152.200
- 80.192.68.91
- 80.2.170.93
- 209.208.186.2
- 68.62.88.211
- 203.175.70.118
- 65.60.161.156
- 81.86.203.137
- 63.164.242.215
- 66.147.154.3
- 62.101.126.224
- 209.125.45.130
- 172.189.3.245
- 61.30.127.4
- 61.30.14.26
- 137.230.1.11
- 194.228.168.42
(These bans do not appear in Special:Ipblocklist, which lists only IPs blocked from saving edits to the wiki.)
Bots that make edits to the wiki
Well-behaved bots are acceptable under some circumstances. Please see Wikipedia:Bots.