How to avoid loading a large SELECT into memory in Rails?


In the routine I'm developing, I run a simple SELECT against a single table:

Model.select("id").where(tipo: 2).find_each do |registro|
  puts registro.id
end

But this SELECT returns around 160,000 records, and the process is killed with:

pid 258 SIGKILL (signal 9)

If I comment out this block, the rest of my code runs normally. I had already researched this and switched from each to find_each, but the error continued. If I limit the query, it also runs normally.

As far as I understand, the error happens simply because the volume of data is larger than memory can hold. Reading link and link, I saw that find_each should improve this situation, but it did not help. How do I resolve this?

asked by anonymous 15.12.2017 / 18:51

1 answer


I think you need the find_in_batches method. It breaks your query into several queries over smaller ranges of records, so Rails does not overload memory trying to map all of the rows into objects at once.

Example:

Model.select("id").where(tipo: 2).find_in_batches do |batch|
  # find_in_batches yields an Array of records, not a single record
  batch.each { |registro| puts registro.id }
end

By default it fetches 1,000 records per batch. Normally that amount is fine, but if you hit the same problem at that size, you can tune it with the batch_size option:

Model.select("id").where(tipo: 2).find_in_batches(batch_size: 500) do |batch|
  batch.each { |registro| puts registro.id }
end

Check the Rails API documentation for find_in_batches for more details.
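
Since your query only needs the id column, another option worth considering (a sketch assuming Rails 5+, where in_batches is available) is to combine in_batches with pluck so that no ActiveRecord objects are built at all:

# Assumes Rails 5+; in_batches yields ActiveRecord::Relation objects.
Model.where(tipo: 2).in_batches(of: 1000) do |batch|
  # pluck returns plain id values instead of full model instances,
  # keeping per-batch memory low.
  batch.pluck(:id).each { |id| puts id }
end

Here the of: option plays the same role as batch_size does for find_in_batches.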

15.02.2018 / 22:55