Topic: Locking and Concurrency in a High Load App w/ Crons

Wasn't sure if this belonged in the database category or not...

I have a RoR app that needs to be extremely safe concurrency-wise. Inside the core application, I'm using Rails' optimistic locking for concurrent updates and a MySQL unique index for concurrent inserts (needed in my case), all wrapped up in an exception/transaction block. Here's some code:

      begin
        Booking.transaction do
          Booking.make_new(booking_info_array)
        end
      rescue ActiveRecord::StaleObjectError => e
        # optimistic lock failure: another process updated the record first
        .....
      rescue ActiveRecord::RecordNotSaved => e
        # the save failed, e.g. a duplicate rejected by the unique index
        .....
      rescue Exception => e
        # catch-all for anything else
        .....
      end

My understanding is that ActiveRecord::StaleObjectError will be raised by Rails' optimistic locking, ActiveRecord::RecordNotSaved if I try to insert a row that violates the unique index, and the Exception branch handles everything/anything else. If anything is raised, the transaction should be rolled back. Let me know if anything here is incorrect or misunderstood.
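For context, the setup that code relies on looks roughly like this migration (the bookings columns here are just examples, not my actual schema): a lock_version column for Rails' optimistic locking, plus the MySQL unique index guarding inserts.

      # Migration sketch -- optimistic locking keys off a lock_version
      # column; the concurrent-insert guard is a MySQL unique index.
      class AddLockingToBookings < ActiveRecord::Migration
        def self.up
          add_column :bookings, :lock_version, :integer, :default => 0
          # example columns; the real index covers whatever defines a
          # "duplicate" booking
          add_index :bookings, [:room_id, :starts_at], :unique => true
        end

        def self.down
          remove_index :bookings, :column => [:room_id, :starts_at]
          remove_column :bookings, :lock_version
        end
      end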

The problem is now my cron jobs (which update the state of a booking). They run on a separate database connection, and that's where Rails' optimistic locking breaks down for me. I've read that pessimistic locking has been introduced in Edge Rails, but I'm wary of it because of deadlocks. Also, even if pessimistic locking is the way to go here, I'm not sure my RoR host will upgrade to Edge Rails.
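For reference, my understanding of the Edge Rails pessimistic locking API is roughly the following (untested by me, since my host isn't on Edge; booking_id and the state change are just placeholders):

      Booking.transaction do
        # :lock => true issues SELECT ... FOR UPDATE, so other connections
        # (including the crons') block on this row until the transaction ends
        booking = Booking.find(booking_id, :lock => true)
        booking.update_attribute(:state, 'confirmed')
      end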

How else are people handling concurrency across database connections?

Re: Locking and Concurrency in a High Load App w/ Crons

A hack we figured out is to have the crons go through the web server, therefore using the same connection pool, so we don't have to deal with pessimistic locking. We are going to have a single crontab entry which wgets a cron controller action, which in turn calls our cron model.
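Concretely, the plan looks something like this (controller, action, and model names are placeholders):

      # crontab entry -- hit the app over HTTP instead of running a script
      */5 * * * * wget -q -O /dev/null http://localhost/cron/expire_bookings

      # app/controllers/cron_controller.rb
      class CronController < ApplicationController
        def expire_bookings
          BookingCron.expire_stale_bookings  # the cron model does the real work
          render :nothing => true
        end
      end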

Does anyone see any immediate problems with this approach?