request rate limiting + whitelist Google

From: Jozsef Rekedt-Nagy <joe7#site.hu>
Date: Thu, 05 May 2011 16:39:56 +0200


Hello guys,

I am wondering if one can rate-limit requests (per second or per minute) globally, but also whitelist Google (and other crawlers) by hardcoding their reverse DNS hosts in the config?
If a reverse DNS/host check is a no-go, can the same be achieved using a user-agent filter?
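Roughly what I have in mind, sketched in Python just to illustrate the idea (this is not tied to any particular server or proxy, and the IP, limits and hostnames used below are only placeholders): verify a crawler by reverse DNS plus a forward-confirming lookup, and let verified crawlers bypass a simple per-IP limit.

    import socket
    import time
    from collections import defaultdict

    # Hosts whose reverse DNS ends in one of these suffixes are treated as
    # legitimate crawlers and exempted from rate limiting.
    VERIFIED_CRAWLER_SUFFIXES = (".googlebot.com", ".google.com")

    def is_verified_crawler(ip):
        """Reverse-lookup the IP, check the host suffix, then forward-confirm
        that the claimed hostname resolves back to the same IP (so a spoofed
        PTR record alone is not enough)."""
        try:
            host, _, _ = socket.gethostbyaddr(ip)
            if not host.endswith(VERIFIED_CRAWLER_SUFFIXES):
                return False
            return ip in socket.gethostbyname_ex(host)[2]
        except (socket.herror, socket.gaierror):
            return False

    class RateLimiter:
        """Fixed-window limiter: at most `limit` requests per `window` seconds per IP."""
        def __init__(self, limit=30, window=60):
            self.limit = limit
            self.window = window
            self.hits = defaultdict(list)   # ip -> timestamps of recent requests

        def allow(self, ip):
            # Verified crawlers bypass the limit entirely.
            # (In practice the DNS check result should be cached per IP.)
            if is_verified_crawler(ip):
                return True
            now = time.time()
            recent = [t for t in self.hits[ip] if now - t < self.window]
            if len(recent) >= self.limit:
                self.hits[ip] = recent
                return False
            recent.append(now)
            self.hits[ip] = recent
            return True

    if __name__ == "__main__":
        limiter = RateLimiter(limit=5, window=60)
        for _ in range(7):
            print(limiter.allow("203.0.113.7"))   # example client IP only

A plain user-agent filter would be simpler, but the user-agent string is trivially spoofable, which is why I was leaning toward the reverse DNS route.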

Any ideas are greatly appreciated.
Thx
Joe
