System.Data.SQLite

Ticket Hash: 7b4cc88ed9be53531d441bd9fcbf0d4383c13696
Title: Out-of-memory after several INSERT / UPDATE SQL statements ..
Status: Closed
Type: Incident
Severity: Important
Priority: Medium
Subsystem: Resource_Cleanup
Resolution: Unable_To_Reproduce
Last Modified: 2013-10-18 23:46:36
Version Found In: latest
User Comments:
anonymous added on 2013-09-11 18:06:56: (text/x-fossil-plain)
Hi,

I'm using a SQLite database as a kind of "cache" for reference data.

In order to fill the SQLite database, I collect the data (5 fields) from the 3rd party system and put those values into a table in a SQLite database.

This "filler" program (.NET 3.5.1) sends out several hundreds of inserts and updates and runs for about 2-3 minutes.

If there are more than 5000 INSERT / UPDATE SQL statements, then the program with your SQLite interface terminates with an out-of-memory exception.

I looked at the Memory Usage of the program in Windows Task Manager - Processes.

After querying the 3rd party system, the memory usage is at about 80 MB.

When the SQL statements against SQLite start, the memory usage climbs in 50-100 MB steps up to about 1.6 GB, and then the program crashes with an out-of-memory exception. The exception handling routine in the C# program (.NET 3.5.1) can't even show the stack trace.

I ran the same program again - after commenting out all System.Data.SQLite pieces - and the memory usage stayed within 80-90 MB.


Questions:
 * Is this a known "issue"? Have others had similar experiences with heavy 
   "insert/update" calls?
 * Is there a way to force System.Data.SQLite to release the memory and to 
   start garbage collection?
   I even changed the program to close and dispose the System.Data.SQLite 
   pieces and - after a wait of 2 seconds - to open a new connection (see the 
   sketch after this list). That didn't change the memory usage noticeably.
 * What is System.Data.SQLite doing? The whole SQLite database is only 12 MB 
   and the table has about 150'000 entries - SQLite can handle much more.
 * Any tips / hints / ideas?
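
For reference, a minimal sketch of the close/dispose/reopen cycle described above. The method and parameter names here (Reopen, connectionString) are placeholders, not the actual program; only the sequence of steps matters.

    using System;
    using System.Data.SQLite;
    using System.Threading;

    class CacheFillerSketch
    {
        // Roughly what was tried: tear down the SQLite connection, wait about
        // 2 seconds, ask the GC to run, then open a fresh connection.
        static SQLiteConnection Reopen(SQLiteConnection current, string connectionString)
        {
            current.Close();
            current.Dispose();

            Thread.Sleep(2000);              // the ~2 second wait mentioned above

            GC.Collect();                    // explicit collection attempt
            GC.WaitForPendingFinalizers();

            var fresh = new SQLiteConnection(connectionString);
            fresh.Open();
            return fresh;
        }
    }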

Kind regards,

Thomas

Email: thomas.laes@swissonline.ch

mistachkin added on 2013-09-12 04:26:02: (text/x-fossil-plain)
Are you making use of "using" blocks to dispose of the IDisposable resources
allocated during this processing?  This should avoid the need to manually trigger
the garbage collector.
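
For illustration, a minimal sketch of that pattern, assuming a batched insert into a hypothetical RefCache table; the table name, column names, connection string and rows parameter are made up, the point is only that every connection, transaction and command is disposed deterministically when its "using" block ends.

    using System.Collections.Generic;
    using System.Data.SQLite;

    class UsingBlockSketch
    {
        static void FillCache(string connectionString, IEnumerable<string[]> rows)
        {
            using (var conn = new SQLiteConnection(connectionString))
            {
                conn.Open();

                using (var tx = conn.BeginTransaction())
                using (var cmd = conn.CreateCommand())
                {
                    cmd.Transaction = tx;
                    cmd.CommandText =
                        "INSERT INTO RefCache (Field1, Field2, Field3, Field4, Field5) " +
                        "VALUES (@f1, @f2, @f3, @f4, @f5)";

                    foreach (var row in rows)
                    {
                        cmd.Parameters.Clear();
                        for (int i = 0; i < 5; i++)
                            cmd.Parameters.AddWithValue("@f" + (i + 1), row[i]);

                        cmd.ExecuteNonQuery();
                    }

                    tx.Commit();
                }   // cmd and tx are disposed here
            }       // conn.Dispose() runs here, even if an exception was thrown
        }
    }

Wrapping the whole batch in a single transaction also keeps SQLite from committing each statement separately, which usually helps both speed and resource usage for large insert runs.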

mistachkin added on 2013-09-13 01:24:29: (text/x-fossil-plain)
Ticket is pending close awaiting response from opener.