System.OutOfMemoryException when exporting 4.5 million records to a text file
Sorry for the confusion; I need help writing the logic to export a large dataset in pieces. At the moment, if I try to export 4.5 million records to one text file, I get a System.OutOfMemoryException, so I have to break the export up into chunks. I want to write no more than 750,000 records per text file.
Let me know if this makes sense.
Exporting the data to Excel is the easy part. Exporting 4.5 million records to one text file is the hard part, because that's where the out-of-memory exception occurs. How do I avoid it?
I have 4.5 million records in a temporary DataTable. My goal is to write them out to text files in chunks until every record has been processed.
Can someone help me write the logic to export 750,000 records at a time until all records are written, creating a separate text file for every 750,000 records? The code I'm currently using is below; can you help me modify it?
Below is the simple logic I have to grab the data. Once the data is in the temporary table, how do I write the logic to break the 4.5 million records into pieces?
DataTable dataTable = new DataTable();
dataTable.Columns.Add(new DataColumn("UNIT", typeof(string)));
dataTable.Columns.Add(new DataColumn("SERIAL", typeof(string)));
dataTable.Columns.Add(new DataColumn("PART", typeof(string)));

// cm is the SqlCommand prepared earlier
using (SqlDataReader dr = cm.ExecuteReader())
{
    while (dr.Read())
    {
        DataRow nextRow = dataTable.NewRow();
        nextRow["UNIT"] = dr["UNIT"].ToString();
        nextRow["SERIAL"] = dr["SERIAL"].ToString();
        nextRow["PART"] = dr["PART"].ToString();
        dataTable.Rows.Add(nextRow);
    }
}
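One way to sketch the chunking logic, assuming `cm` is the same SqlCommand as above: rather than loading all 4.5 million rows into the DataTable first (which is what exhausts memory), stream rows straight from the SqlDataReader into the current output file, and roll over to a new file every 750,000 rows. The output path and delimiter here are hypothetical placeholders, not from the original code.

```csharp
using System.Data.SqlClient;
using System.IO;

const int ChunkSize = 750000;   // max records per text file
int rowsInFile = 0;             // rows written to the current file
int fileIndex = 0;              // suffix for each chunk's file name
StreamWriter writer = null;

using (SqlDataReader dr = cm.ExecuteReader())
{
    try
    {
        while (dr.Read())
        {
            // Open a new file for the first row and after every ChunkSize rows.
            if (rowsInFile == 0)
            {
                writer?.Dispose();
                fileIndex++;
                // Hypothetical output path; adjust to your environment.
                writer = new StreamWriter($@"C:\exports\export_{fileIndex}.txt");
            }

            // Pipe-delimited line; swap in whatever format your file needs.
            writer.WriteLine($"{dr["UNIT"]}|{dr["SERIAL"]}|{dr["PART"]}");

            rowsInFile++;
            if (rowsInFile == ChunkSize)
                rowsInFile = 0;   // forces a roll-over on the next row
        }
    }
    finally
    {
        writer?.Dispose();   // flush and close the last (possibly partial) file
    }
}
```

With 4.5 million rows and a 750,000-row chunk size this would produce six files; only one row and one open StreamWriter are held in memory at a time, so the DataTable (and the exception it causes) can be dropped entirely. If you genuinely need the DataTable for other reasons, the same counter/roll-over pattern works iterating `dataTable.Rows` instead of `dr`, but the memory problem then remains in the load step.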