My scenario is as follows: I have a system that reads a file with around 3 million lines, each using a separator. For each line I identify the individual fields, apply the required processing to each one, and turn the line into an object; that object is added to a list, which is later saved to the database. Defective lines go to a second list so that a file containing them can be generated. The flow looks roughly like the sketch below.
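A minimal sketch of that flow, assuming Java (the `Item` record, the `;` separator, and `saveToDatabase` are all hypothetical stand-ins for my actual code):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class FileImport {

    // Hypothetical type for a parsed line.
    record Item(String fieldA, String fieldB) {}

    public static void main(String[] args) throws IOException {
        List<Item> parsed = new ArrayList<>();
        List<String> defective = new ArrayList<>();

        try (BufferedReader reader = Files.newBufferedReader(Path.of("input.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split(";"); // assumed separator
                if (parts.length == 2) {
                    parsed.add(new Item(parts[0].trim(), parts[1].trim()));
                } else {
                    defective.add(line); // collected for the error file
                }
            }
        }

        // With millions of lines, both lists are held in memory at once
        // before anything is persisted.
        saveToDatabase(parsed); // placeholder for the actual batch insert
        Files.write(Path.of("errors.txt"), defective);
    }

    static void saveToDatabase(List<Item> items) {
        // actual persistence logic lives here
    }
}
```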
The problem appears when the file grows to, say, 5 million lines: the process runs out of memory. Has anyone run into this before, and how did you solve it?