Re: Preventing bots from starving other users?

From: Brent Walker <bwalker#verticalacuity.com>
Date: Mon, 16 Nov 2009 07:57:55 -0500


If the bot conforms, why not just control its behavior by specifying restrictions in your robots.txt?

http://www.robotstxt.org/
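
For a conforming crawler, a Crawl-delay rule is often enough. A rough example (the Disallow path is a placeholder; note that Crawl-delay is honored by Yahoo's and Bing's crawlers but ignored by Googlebot):

    User-agent: *
    # Ask conforming bots to wait 10 seconds between requests
    Crawl-delay: 10
    # Keep bots out of expensive dynamic pages (placeholder path)
    Disallow: /expensive-pages/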

On Sun, Nov 15, 2009 at 9:57 AM, Wout Mertens <wout.mertens#gmail.com> wrote:
> Hi there,
>
> I was wondering if HAProxy helps in the following situation:
>
> - We have a wiki site which is quite slow
> - Regular users don't have many problems
> - We also get crawled by a search bot, which creates many concurrent connections, more than the hardware can handle
> - Therefore, service is degraded and users usually have their browsers time out on them
>
> Given that we can't make the wiki faster, I was thinking we could solve this with a per-source-IP queue, which makes sure that a given source IP cannot have more than e.g. 3 requests active at the same time. Requests beyond that would be queued.
>
> Is this possible?
>
> Thanks,
>
> Wout.
>
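
If the bot doesn't conform, the per-source-IP limit asked about above is possible in HAProxy itself with a stick-table. A rough sketch, assuming a newer HAProxy (1.5+, which postdates this thread) and with placeholder names, ports, and thresholds throughout; note that excess connections are rejected rather than queued, since HAProxy queues per server (via maxconn) rather than per source IP:

    frontend wiki_front
        bind *:80
        # Track concurrent connections per source IP
        stick-table type ip size 100k expire 30s store conn_cur
        tcp-request connection track-sc0 src
        # Reject sources holding more than 3 concurrent connections
        tcp-request connection reject if { sc0_conn_cur gt 3 }
        default_backend wiki_back

    backend wiki_back
        # maxconn makes HAProxy queue excess requests instead of
        # overloading the slow wiki server
        server wiki1 127.0.0.1:8080 maxconn 20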
Received on 2009/11/16 13:57
