Re: Preventing bots from starving other users?

From: Łukasz Jagiełło <lukasz.jagiello#gforces.pl>
Date: Sun, 15 Nov 2009 22:42:03 +0100


2009/11/15 Wout Mertens <wout.mertens#gmail.com>:
> I was wondering if HAProxy helps in the following situation:
>
> - We have a wiki site which is quite slow
> - Regular users don't have many problems
> - We also get crawled by a search bot, which creates many concurrent connections, more than the hardware can handle
> - Therefore, service is degraded and users usually have their browsers time out on them
>
> Given that we can't make the wiki faster, I was thinking that we could solve this by having a per-source-IP queue, which would make sure that a given source IP cannot have more than e.g. 3 requests active at the same time. Requests beyond that would get queued.
>
> Is this possible?

I'd guess so. I route crawler traffic to a dedicated web backend, because they mostly harvest during my backup window and slow everything down even more. Adding a request limit should also be easy; just check the documentation.
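
For example, something along these lines (an untested sketch; the backend
names, addresses and the User-Agent patterns are made up, so adapt them to
your setup and HAProxy version):

defaults
    mode http
    timeout connect 5s
    timeout client  30s
    timeout server  30s

frontend wiki_front
    bind :80
    # crude crawler detection by User-Agent; extend the list as needed
    acl is_bot hdr_sub(User-Agent) -i googlebot msnbot slurp
    use_backend crawlers if is_bot
    default_backend wiki

backend wiki
    # regular users share a comfortable connection limit
    server wiki1 192.168.0.10:80 maxconn 50

backend crawlers
    # same server, but only 3 concurrent crawler requests get through;
    # the rest wait in haproxy's queue instead of piling up on the wiki
    timeout queue 60s
    server wiki1 192.168.0.10:80 maxconn 3

It is not exactly the per-source-IP queue you asked about, but since the
crawler is the source of the overload, capping that backend with a small
maxconn has much the same effect: excess bot requests sit in haproxy's
queue (or time out there) rather than starving regular users.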

-- 
Łukasz Jagiełło
System Administrator
G-Forces Web Management Polska sp. z o.o. (www.gforces.pl)

Ul. Kruczkowskiego 12, 80-288 Gdańsk
Company registered in the KRS under no. 246596 by decision of the District Court Gdańsk-Północ
Received on 2009/11/15 22:42

This archive was generated by hypermail 2.2.0 : 2009/11/15 22:45 CET