Topic: Search bot crashing application

I have a simple application that serves either a page model or an article model, with the content stored in a database. The site ran fine for the first 2 months, but recently, whenever a search bot crawls the site, memory spikes and the MySQL process on the server gets killed, taking down the entire site with it.

I am running on a VPS with 512MB of RAM, Nginx, Passenger with a maximum of 3 instances, and Ruby Enterprise Edition.

I did a load test of the site at 50 concurrent connections navigating 3 levels deep, and it went perfectly: no issues at all, and the memory was freed when the load decreased. During that test total memory usage reached 284MB.

However, every morning around 3am the site gets crawled, memory spikes to over 900MB, and the site goes down.

Can someone please help?

Re: Search bot crashing application

You don't want to block all crawlers, but you do have access to the user agent string at several levels before you ever touch your database. You can tell Nginx to reject requests from certain user agent strings. Read here:

http://n2.nabble.com/How-to-block-by-us … 54744.html
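
To make that concrete, here is a rough sketch of the kind of Nginx rule that refuses requests from a specific crawler before they ever reach Passenger or MySQL. The bot names, domain, and paths below are only placeholders; check your access logs around 3am to see which user agent is actually doing the crawling.

server {
    listen 80;
    server_name example.com;          # placeholder
    root /path/to/app/public;         # placeholder

    # Case-insensitive match on the User-Agent header;
    # the bot names here are examples only.
    if ($http_user_agent ~* "(msnbot|Baiduspider)") {
        return 403;
    }

    passenger_enabled on;
}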

One other thing that will help is to put some kind of monitoring in place that will shut down your processes and restart them before they grow out of control. Try monit or god for that.

http://god.rubyforge.org/
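
As a rough illustration of the god approach: god config files are plain Ruby, and the basic idea is to restart a process once its memory stays above a limit. Everything below (process name, start/stop commands, pid file, thresholds) is a placeholder you would adapt to whichever process is actually ballooning on your box.

# Sketch of a god watch that restarts a process when it grows too large.
# All names, paths and numbers are placeholders.
God.watch do |w|
  w.name     = "myapp"                      # placeholder name
  w.interval = 30.seconds                   # how often god checks
  w.start    = "/etc/init.d/myapp start"    # placeholder start command
  w.stop     = "/etc/init.d/myapp stop"     # placeholder stop command
  w.pid_file = "/var/run/myapp.pid"         # placeholder pid file

  # Restart if memory has been above 150MB on 3 of the last 5 checks.
  # On a 512MB VPS you will want to tune this per process.
  w.restart_if do |restart|
    restart.condition(:memory_usage) do |c|
      c.above = 150.megabytes
      c.times = [3, 5]
    end
  end
end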

With these in place, you can ride out the middle-of-the-night onslaught while dropping only a minimal number of requests; hopefully you don't mind denying a few during that period.

Re: Search bot crashing application

Do you know which processes are growing in memory?
