Create JSON with a lot of data and return it with WebApi


Scenario: I have a WebApi that runs a method on my service layer; that method executes a query that returns 500k result rows, each containing 90 columns.

The method that runs the query and traverses the results:

   using (var command = Session.Connection.CreateCommand())
   {
      command.CommandText = dataSource.Query;

      using (var reader = command.ExecuteReader())
      {
         while (reader.Read())
         {
            // each of the 500k rows is handled here
         }
      }
   }

Today, to build the JSON, I use the JsonTextWriter class:

   StringBuilder sb = new StringBuilder(); // requesting int.MaxValue capacity up front is itself a ~4 GB allocation
   StringWriter sw = new StringWriter(sb);

   using (JsonWriter jrWriter = new JsonTextWriter(sw))
   {
      using (var command = Session.Connection.CreateCommand())
      {
         command.CommandText = dataSource.Query;

         using (var reader = command.ExecuteReader())
         {
            jrWriter.WriteStartArray(); // one array holding one object per row

            while (reader.Read())
            {
               int fields = reader.FieldCount;

               jrWriter.WriteStartObject();

               for (int i = 0; i < fields; i++)
               {
                  // read the column name before writing it (the original wrote
                  // the previous iteration's name, shifting every property by one)
                  jrWriter.WritePropertyName(reader.GetName(i));

                  if (reader.IsDBNull(i)) // a null column is DBNull.Value, never null itself
                  {
                     jrWriter.WriteValue("");
                  }
                  else if (reader[i] is string)
                  {
                     jrWriter.WriteValue(reader[i].ToString().Trim());
                  }
                  else if (reader[i] is bool)
                  {
                     jrWriter.WriteValue((bool)reader[i]); // write a real boolean, not the string "true"/"false"
                  }
                  else if (reader[i] is DateTime)
                  {
                     jrWriter.WriteValue((DateTime)reader[i]);
                  }
                  else
                  {
                     // decimal comma -> decimal point for the remaining (numeric) types
                     jrWriter.WriteValue(reader[i].ToString().Trim().Replace(",", "."));
                  }
               }

               jrWriter.WriteEndObject(); // close the row object (was nested inside the column loop)
            }

            jrWriter.WriteEndArray();
         }
      }
   }

Problem:

After iterating over roughly 150k records, a System.OutOfMemoryException is thrown.

I have tried several other approaches to send this mass of data to the front end, but they always crash with the same exception.

I have also tried converting each record to a byte array and collecting the records in an array of byte arrays; that way I managed to iterate through all the records, but when returning them to the front end the System.OutOfMemoryException appeared again.

If anyone has already had to send this much data through a WebApi, I'm open to tips on doing it better, because I believe the problem is always having to convert everything to a string in memory.
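For reference, here is a minimal sketch of the direction I mean: writing the JSON straight onto the response stream with Web API's PushStreamContent instead of buffering it first. The controller and the _connection/_query fields are hypothetical stand-ins for my Session and dataSource objects:

    using System.Data;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Web.Http;
    using Newtonsoft.Json;

    public class ReportController : ApiController
    {
        // Hypothetical stand-ins for Session.Connection and dataSource.Query
        private readonly IDbConnection _connection;
        private readonly string _query;

        [HttpGet]
        public HttpResponseMessage Get()
        {
            var response = Request.CreateResponse();

            // The callback runs while the response is being sent, so each row is
            // serialized straight onto the wire and never accumulates in memory.
            response.Content = new PushStreamContent((stream, content, context) =>
            {
                using (var sw = new StreamWriter(stream))
                using (var json = new JsonTextWriter(sw))
                using (var command = _connection.CreateCommand())
                {
                    command.CommandText = _query;

                    using (var reader = command.ExecuteReader())
                    {
                        json.WriteStartArray();

                        while (reader.Read())
                        {
                            json.WriteStartObject();

                            for (int i = 0; i < reader.FieldCount; i++)
                            {
                                json.WritePropertyName(reader.GetName(i));
                                json.WriteValue(reader.IsDBNull(i) ? null : reader[i]);
                            }

                            json.WriteEndObject();
                        }

                        json.WriteEndArray();
                    }
                } // disposing the StreamWriter closes the stream and completes the response
            }, new MediaTypeHeaderValue("application/json"));

            return response;
        }
    }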

asked by anonymous 13.02.2017 / 20:25

1 answer


@DiegoSilvaLima

I see four possible solutions.

1) Partition the data retrieval: instead of fetching the 500k rows in one go, fire successive async requests that fetch 50k at a time (for example), build your PIVOT as they arrive, and free memory as you go (see the paging sketch after this list).

2) Get a lot more RAM; there is no way around it. You are keeping EVERYTHING in RAM, and with 500k rows, depending on the size of those rows, there is not much else to be done.

3) Mix RAM and disk: write the final JSON to a file (or a database) as you build it, then read the JSON file back, mounting the PIVOT as you read and releasing memory along the way (see the file-spooling sketch after this list).

4) RECOMMENDED: Paging. The PIVOT is fully capable of displaying paged information. And think about your end user: what human user of a system can read (or even be interested in) 500k rows of a table at once? You would make them wait far longer for the result, and the page would freeze (I'm fairly sure Chrome would offer to kill the tab for not responding).
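A minimal sketch of the paging idea from items 1 and 4. The route, parameter names, and the OFFSET/FETCH wrapping (SQL Server 2012+ syntax) are assumptions for illustration, not part of the original code; _connection and _query stand in for the question's Session.Connection and dataSource.Query:

    using System.Collections.Generic;
    using System.Data;
    using System.Net;
    using System.Net.Http;
    using System.Web.Http;

    public class PagedReportController : ApiController
    {
        private readonly IDbConnection _connection; // hypothetical stand-in
        private readonly string _query;             // hypothetical stand-in

        // GET api/pagedreport?page=0&pageSize=50000
        [HttpGet]
        public HttpResponseMessage GetPage(int page = 0, int pageSize = 50000)
        {
            var rows = new List<Dictionary<string, object>>();

            using (var command = _connection.CreateCommand())
            {
                // Wrap the original query so the database returns one page at a
                // time (other engines use LIMIT/OFFSET instead of OFFSET/FETCH).
                command.CommandText = _query +
                    " ORDER BY 1 OFFSET @skip ROWS FETCH NEXT @take ROWS ONLY";

                AddParameter(command, "@skip", page * pageSize);
                AddParameter(command, "@take", pageSize);

                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        var row = new Dictionary<string, object>();
                        for (int i = 0; i < reader.FieldCount; i++)
                            row[reader.GetName(i)] = reader.IsDBNull(i) ? null : reader[i];
                        rows.Add(row);
                    }
                }
            }

            // only one page is ever in memory; the client asks for page+1 when it needs more
            return Request.CreateResponse(HttpStatusCode.OK, rows);
        }

        private static void AddParameter(IDbCommand command, string name, object value)
        {
            var p = command.CreateParameter();
            p.ParameterName = name;
            p.Value = value;
            command.Parameters.Add(p);
        }
    }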
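And a minimal sketch of item 3's RAM/file mix, assuming Newtonsoft.Json; the temp-file handling and the elided row loop are illustrative only:

    using System.IO;
    using Newtonsoft.Json;

    var path = Path.GetTempFileName();

    // Write pass: the same row loop as in the question, but over a file
    // stream, so memory use stays flat no matter how many rows go by.
    using (var file = new StreamWriter(path))
    using (var json = new JsonTextWriter(file))
    {
        json.WriteStartArray();
        // ... while (reader.Read()) { write one object per row, as in the question } ...
        json.WriteEndArray();
    }

    // Read pass: stream the file back one token at a time with JsonTextReader,
    // mounting the PIVOT and discarding each row once it has been consumed.
    using (var file = new StreamReader(path))
    using (var json = new JsonTextReader(file))
    {
        while (json.Read())
        {
            // json.TokenType / json.Value expose the current token;
            // nothing here forces the whole document into memory
        }
    }

    File.Delete(path);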

I cannot imagine a scenario where such a large volume of information cannot be paginated.

If you really cannot paginate, for whatever reason, then I think the only solution for you is item 2: more RAM, much more.

If you are dealing with data analysis, then you are into something else entirely, which does not involve improving the C# code or anything like that: it involves pre-computing the data, transforming it into smaller structures that are already computed/analyzed. Even so, that will not remove the need for more memory.

And remember: if your system is multi-user, think about what happens if 3 users request the same report at the same time on the same server.

14.02.2017 / 16:40