Many lines in a CSV file


My scenario is as follows: I have a system that reads a file of around 3 million lines. Each line has a separator; I identify each item on the line, apply the treatment that each item requires, and turn the row into an object. That object is added to a list, which is later saved to the database. Defective lines go to a second list, so that a file with the defective lines can be generated.

The problem appears when the file has, for example, 5 million lines, which causes a memory overflow. Has anyone already been through this, and how did you solve it?
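For concreteness, the processing described above, written as a minimal PHP sketch, would look roughly like this; the separator, column count and field names are assumptions for illustration, not something stated in the question. The two lists grow with the file, which is where the memory goes:

    <?php
    // Sketch of the pipeline described in the question: every valid row
    // becomes an object kept in $items, defective rows go to $defective,
    // and both lists stay in memory until the end of the file.
    $items     = [];
    $defective = [];

    $handle = fopen('entrada.csv', 'r');
    while (($fields = fgetcsv($handle, 0, ';')) !== false) {
        if (count($fields) !== 2) {          // hypothetical validity check
            $defective[] = $fields;
            continue;
        }
        // "Treatment" of each item, then turn the row into an object.
        $items[] = (object) [
            'colA' => trim($fields[0]),
            'colB' => trim($fields[1]),
        ];
    }
    fclose($handle);

    // With 5 million lines, $items alone can exceed the memory limit
    // before anything is saved to the database or written to the
    // defective-lines file.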

    
asked by anonymous 05.05.2016 / 20:50

1 answer


I work in PHP. When I have processing of this kind, which blows past the server's time limit, I build a page that consumes a service that the server processes in the background.

Most of the time the browser times out, but the system keeps processing in the background.

So I use this approach: I call a service that will, for example, process the file and can later send an e-mail with the report of the processed file, or make the report available through a link in the system.
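A rough illustration of that idea: the page the browser hits only launches the worker and returns immediately. The worker script name, the file path and the Unix-style background launch are assumptions for the sketch, not something stated above.

    <?php
    // Page the browser requests: start the worker in the background and
    // return right away, so the request itself never hits a timeout.
    // "processa_arquivo.php" is a hypothetical worker script that does
    // the reading, validation and database inserts.
    $file = escapeshellarg('/caminho/para/entrada.csv');
    exec("php processa_arquivo.php {$file} > /dev/null 2>&1 &");

    echo 'Processing started; the report will be sent by e-mail when it finishes.';

Any job queue or cron-based trigger serves the same purpose; the essential point is that the HTTP request only starts the job, and the report arrives later by e-mail or via a link.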

    
05.05.2016 / 21:55