2018-04-26
16:46 | • Closed ticket [dfc8133ba2]: Incremental IO interface very slow, plus 3 other changes (artifact: a6c2329db2, user: mistachkin)
2018-04-18
15:29 | • Ticket [dfc8133ba2]: 4 changes (artifact: ea1d89a6a0, user: mistachkin)
15:27 | Add a new overload for the SQLiteBlob.Create method that is a thinner wrapper over the native API, for greater efficiency. Pursuant to [dfc8133ba2]. (check-in: e28ff15b0e, user: mistachkin, tags: trunk)
15:22 | • Ticket [dfc8133ba2] Incremental IO interface very slow: status still Pending, with 3 other changes (artifact: a9fa71ab7e, user: anonymous)
14:36 | • Ticket [dfc8133ba2]: 3 changes (artifact: b0a4589e41, user: mistachkin)
14:33 | • Pending ticket [dfc8133ba2]. (artifact: c2b63717af, user: mistachkin)
11:13 | • New ticket [dfc8133ba2]. (artifact: 83d4253084, user: anonymous)
Ticket Hash:       dfc8133ba200aaa82098de6cca26810380b5afb9
Title:             Incremental IO interface very slow
Status:            Closed         Type:        Incident
Severity:          Important      Priority:    Medium
Subsystem:         Data_Reader    Resolution:  Fixed
Last Modified:     2018-04-26 16:46:36
Version Found In:  1.0.108

User Comments:
anonymous added on 2018-04-18 11:13:43:
I am using incremental I/O to write data to a blob and I can see that it is very slow. This is the test I am using:

    string sql = @"CREATE TABLE MailMessagePartContent
    (
        Id integer primary key autoincrement,
        Header blob null,
        Body blob null
    );";

    using (SQLiteCommand cmd = new SQLiteCommand(sql, connection))
    {
        cmd.ExecuteNonQuery();
    }

    byte[] headerBytes = null;
    byte[] bodyBytes = null;

    // Read the test file fully into memory.
    using (FileStream fileStream = File.OpenRead(@"C:\temp\Debug.zip"))
    using (MemoryStream memoryStream = new MemoryStream())
    {
        fileStream.CopyTo(memoryStream);
        bodyBytes = memoryStream.ToArray();
    }

    Stopwatch s;

    using (var tr = connection.BeginTransaction())
    {
        s = new Stopwatch();
        s.Start();

        // Reserve space for the blob up front.
        string insertSql = $"INSERT INTO MailMessagePartContent (Header, Body) VALUES (zeroblob(0), zeroblob({bodyBytes.Length}))";

        using (SQLiteCommand cmd2 = new SQLiteCommand(insertSql, connection))
        {
            cmd2.ExecuteNonQuery();
        }

        // This SELECT is only needed so a reader exists to create the SQLiteBlob from.
        string selectSql = "SELECT Id, Body FROM MailMessagePartContent WHERE Id = 1";

        using (SQLiteCommand cmd3 = new SQLiteCommand(selectSql, connection))
        {
            using (var reader = cmd3.ExecuteReader(CommandBehavior.KeyInfo))
            {
                reader.Read();

                try
                {
                    using (var blob = reader.GetBlob(1, false))
                    {
                        blob.Write(bodyBytes, bodyBytes.Length, 0);
                        blob.Close();
                    }
                }
                catch (SQLiteException)
                {
                    // Ignored for the purposes of this test.
                }
            }
        }

        tr.Commit();
        s.Stop();
    }

What bothers me here is that in order to construct a SQLiteBlob I need to execute the query SELECT Id, Body FROM MailMessagePartContent WHERE Id = 1. (In my case the blobs will contain files; say we have a 100 MB file.) That query has to read a 100 MB byte array just to return a reader that I later use to create the SQLiteBlob. I can see in the source code that the content of the blob is not used at all. If I am right, we are pulling the content of the blob for no reason. I tried a workaround like SELECT Id, substr(Body,0,0) FROM MailMessagePartContent WHERE Id = 1, but it does not work ("No RowId is available" error).

mistachkin added on 2018-04-18 14:36:52:

Perhaps adding a simpler overload for the SQLiteBlob.Create method would work for this use case?

anonymous added on 2018-04-18 15:22:03:

I think a new interface close to the native sqlite3_blob_open would be a good idea. If you end up using incremental I/O, it means you need efficiency, and anything with a high level of abstraction can add overhead.

mistachkin added on 2018-04-18 15:29:07:

The new overload is now on trunk. Please let us know if this change works for you.
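
Based on the 15:27 check-in message, the new SQLiteBlob.Create overload opens the blob handle directly from the connection, table, column, and rowid, analogous to the native sqlite3_blob_open, so no SELECT that materializes the blob column is needed first. The following is a minimal sketch of how it might be used for the scenario in this ticket; the exact parameter list (connection, database name, table name, column name, rowid, read-only flag), along with the connection string and table setup, are assumptions inferred from the check-in message and the comment thread, so check the released documentation before relying on it.

    using System.Data.SQLite;
    using System.IO;

    class BlobOverloadSketch
    {
        static void Main()
        {
            byte[] bodyBytes = File.ReadAllBytes(@"C:\temp\Debug.zip");

            // Assumes the MailMessagePartContent table from the test above already exists.
            using (var connection = new SQLiteConnection("Data Source=test.db"))
            {
                connection.Open();

                using (var tr = connection.BeginTransaction())
                {
                    // Reserve space for the blob without pushing its contents through SQL.
                    string insertSql = "INSERT INTO MailMessagePartContent (Header, Body) " +
                        $"VALUES (zeroblob(0), zeroblob({bodyBytes.Length}))";

                    using (var cmd = new SQLiteCommand(insertSql, connection))
                    {
                        cmd.ExecuteNonQuery();
                    }

                    long rowId = connection.LastInsertRowId;

                    // Assumed overload: open the blob by database/table/column/rowid,
                    // with no preceding SELECT of the blob column.
                    using (SQLiteBlob blob = SQLiteBlob.Create(
                        connection, "main", "MailMessagePartContent", "Body",
                        rowId, false /* readOnly */))
                    {
                        blob.Write(bodyBytes, bodyBytes.Length, 0);
                    }

                    tr.Commit();
                }
            }
        }
    }

With this shape of API, no blob data is read back from the database before writing, which is the overhead the reporter was trying to avoid with the reader-based path.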