Large record sets and insert is slow. How to best optimize? #265
Replies: 2 comments 7 replies
Yes, skipDataCheck will make inserts faster, because JsStore will write the data without checking each value's type or running validation.
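A minimal sketch of what enabling this might look like. The query-building helper is just for illustration; the table name, rows, and `connection` are placeholders, and the `skipDataCheck` option is assumed to be passed on the insert query as in JsStore's documented insert API:

```javascript
// Build an insert query object. skipDataCheck asks JsStore to write the
// rows as-is, skipping per-row type checks and validation (faster, but
// malformed data will be stored silently).
function buildInsertQuery(table, rows, skipDataCheck) {
  return {
    into: table,
    values: rows,
    skipDataCheck: skipDataCheck === true,
  };
}

// Hypothetical usage with an open JsStore connection:
// const rowsInserted = await connection.insert(
//   buildInsertQuery('Records', rows, true)
// );
```

The trade-off is exactly what the reply says: you regain speed by giving up the safety net, so only skip the check on data you already trust (e.g. a restore of previously validated rows).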
I'm still having speed issues with large data sets. If I have 80,000 records and want to find the 100 records that match one field, delete them, and re-add them, is it normal for this to take a long time? It takes over 30 seconds on a Samsung Pad. Here is one record of the 80,000 in a SQL format: My table: Thank you.
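For context on why this can be slow: deleting by a column that has no index forces a scan of all 80,000 rows to find the 100 matches, whereas an indexed column lets the engine jump straight to them. A hedged sketch (the table and column names here are made up, and the `connection.remove` usage assumes JsStore's where-clause API):

```javascript
// Without an index, finding the rows to delete means filtering everything:
// this is O(n) over the full 80,000-row table.
function findMatches(rows, column, value) {
  return rows.filter((row) => row[column] === value);
}

// With an index on the filtered column, the same predicate can be resolved
// through the IndexedDB index instead of a scan (hypothetical usage):
// await connection.remove({
//   from: 'Records',
//   where: { groupId: value },
// });
```

So before tuning anything else, it is worth checking whether the field used in the where clause is indexed.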
Hello JsStore and @ujjwalguptaofficial. I have two questions about dealing with large datasets (20,000+ records) and inserts. With WebSQL this was not an issue, but since our port from WebSQL to JsStore (on IndexedDB), some of our functions that took less than a second now take almost 30 seconds, and on a pad, restoring 80,000 records can take an hour or more. We also commonly do about 100 inserts into a 20,000+ record table and then run some cleanup to remove 100 old records from the same table.
We have not added any indexes yet; that will be our next try.
Will the skipDataCheck and validation options regain the speed, and what is the downside?
We are using JsStore 4.3.5.
Thank you.
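One mitigation that often helps with huge restores like the 80,000-record case is inserting in fixed-size batches rather than one giant call or row-by-row calls. A sketch under assumptions (the batch size, table name, and the shape of the JsStore insert call are illustrative, not prescriptive):

```javascript
// Split a large array into fixed-size chunks so each insert call handles a
// bounded number of rows instead of all 80,000 at once.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}

// Hypothetical restore loop with an open JsStore connection, combining
// batching with skipDataCheck for already-validated backup data:
// for (const batch of chunk(allRows, 5000)) {
//   await connection.insert({
//     into: 'Records',
//     values: batch,
//     skipDataCheck: true,
//   });
// }
```

Batching keeps each transaction's memory footprint bounded and gives the UI thread a chance to breathe between batches, which matters most on low-powered tablet hardware.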