Topic: Ruby 1.9 memory leak?

I have a folder of files that I need to read in as JSON.  So my script iterates over each file, opens it, reads the contents, and parses them as JSON.  However, this keeps eating up memory and eventually the OS (Linux) kills the process for using too much.  The files are about 50 MB apiece, and there are about 30 of them.

I'm not sure what's going on.  Logically, the code opens a single file, reads it, and the file should be closed at the end of the block before moving on to the next one.  But watching "top" at the console shows me the memory growing and growing until the process is forcibly killed by the OS.
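To quantify the growth from inside the process instead of eyeballing top, something like this could log live-object counts around a parse.  This is just a sketch: the `live_objects` helper is mine (MRI-specific ObjectSpace counters), and the inline JSON string stands in for a real file's contents:

```ruby
require 'json'

# Sketch: count live heap objects (MRI-specific ObjectSpace stats)
# so per-iteration growth can be logged rather than watched in top.
def live_objects
  GC.start                              # force a full collection first
  counts = ObjectSpace.count_objects
  counts[:TOTAL] - counts[:FREE]
end

before = live_objects
parsed = JSON.parse('{ "documents": [ {"a": 1} ] }')  # stand-in for file contents
after  = live_objects
puts "live objects: #{before} -> #{after}"
```

If the "after" count keeps climbing across iterations even with the forced GC, the parsed trees really are being kept alive somewhere.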

require 'json'

# get all files in the /script/input folder
input_files = Dir.glob("input/*")

input_files.each do |f|
  File.open(f) do |file|
    # wrap the contents with proper JSON
    data = '{ "documents": [' + file.read + "]}"
    parsed_json = JSON.parse(data)
  end
end
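For comparison, here is a variant I could try (a sketch, assuming the same input layout): the wrapping and parsing are scoped to a method so the big string and parsed tree become unreachable as soon as it returns, with an explicit GC hint between files.  The `parse_wrapped` name is mine:

```ruby
require 'json'

# Sketch: same wrapping logic, but scoped to a method so the large
# string and the parsed object tree go out of scope when it returns.
def parse_wrapped(path)
  contents = File.read(path)
  JSON.parse('{ "documents": [' + contents + "]}")
end

Dir.glob("input/*").each do |f|
  doc = parse_wrapped(f)
  # ... process doc["documents"] here ...
  GC.start  # hint the collector to reclaim the previous file's objects
end
```

If memory still grows with this version, that would point away from lingering references in my loop and toward the parser or the runtime itself.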

Is this a memory issue in Ruby, or am I misunderstanding what the code is supposed to do?  It's almost as if the variables are not getting reset on each iteration of the loop, but are instead concatenating values or something strange.

Last edited by daniel_l (2011-08-02 14:26:40)