
I have a web application that relies on loading a large number of rows of data from a local file on the client side. The aim is to store this data in an IndexedDB.

The data has only two columns I am interested in, each containing a string no longer than 25 characters; however, there can be up to 1 million rows.

After reading a lot of questions and docs, I have written code that seems to create the IndexedDB correctly with smaller datasets (below 20,000 rows), but breaks on larger data.

I'm sure this is due to poor design, as I'm new to this style of work, or potentially some sort of freeze-out in the Chrome browser. I don't receive any error messages: I can trigger an alert showing that the last iteration of the for loop is reached, but the oncomplete handler never fires and the database never seems to populate.

The input to the function, e, is the result of reading a file (a FileReader load event).

I also perform an operation on the data within the for loop, but I have removed this for simplicity.
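For context, storeDataEnc is wired up as a FileReader onload handler, roughly like this (the 'fileInput' element id is hypothetical):

    var reader = new FileReader();
    reader.onload = storeDataEnc;   // e.target.result holds the file text
    reader.readAsText(document.getElementById('fileInput').files[0]);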

function storeDataEnc(e) {
    var lines = e.target.result.split('\n');
    var request = self.indexedDB.open('DB', 1);

    request.onerror = function(e) {
        console.log("there was an error: " + e.target.error);
    };

    request.onupgradeneeded = function(e) {
        var db = request.result;
        // The store name and keyPath must match what is used below:
        // the transaction opens "dataTable", and put() writes COL1/COL2.
        db.createObjectStore("dataTable", { keyPath: "COL2" });
    };

    request.onsuccess = function(e) {
        var db = request.result;
        var tx = db.transaction("dataTable", "readwrite");
        var store = tx.objectStore("dataTable");

        db.onerror = function(e) {
            console.log("ERROR: " + e.target.error);
        };

        // Parse one tab-separated line and queue a put();
        // lines starting with '#' are comments and are skipped.
        function forEachLinenow(match) {
            if (match.charAt(0) != '#') {
                match = match.trim();
                var fields = match.split('\t');
                var col1in = fields[0];
                var col2in = fields[3];
                store.put({ COL1: col1in, COL2: col2in });
            }
        }

        for (var i = 0; i < lines.length; ++i) {
            if (i == lines.length - 1) { console.log('nearly done'); }
            forEachLinenow(lines[i] + '\n');
        }

        tx.oncomplete = function() {
            db.close();
            alert("all data read");
        };
    };
}

I am guessing I am running into some browser mechanism meant to stop malicious apps from taking up too many resources. Has anyone worked with data of this size who can spot the error in my process?

My guess would be that I need to generate more than one transaction, which I did try, but it didn't seem to change my issue (a sketch of what I tried is below).

I know this may be slow; however, speed itself is not the biggest issue as long as the data is successfully imported.
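For reference, the multi-transaction variant I tried looked roughly like this; the helper names and the chunk size are just illustrative, and each chunk gets its own transaction that auto-commits once its requests finish:

    // Queue one chunk of rows in its own transaction; resolve on commit.
    function storeChunk(db, rows) {
        var tx = db.transaction("dataTable", "readwrite");
        var store = tx.objectStore("dataTable");
        rows.forEach(function (line) {
            if (line.charAt(0) != '#') {
                var fields = line.trim().split('\t');
                store.put({ COL1: fields[0], COL2: fields[3] });
            }
        });
        return new Promise(function (resolve, reject) {
            tx.oncomplete = resolve;
            tx.onerror = reject;
        });
    }

    // Walk the file in fixed-size chunks, waiting for each commit.
    async function storeAllChunks(db, lines, chunkSize) {
        for (var i = 0; i < lines.length; i += chunkSize) {
            await storeChunk(db, lines.slice(i, i + chunkSize));
        }
        db.close();
        alert("all data read");
    }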

  • Got a bit of a code smell from this; surely there's no reason you'd need to load 1M records into a browser database?
    – Phix
    Commented Jan 9, 2019 at 1:39
  • haha It's a niche application, which would be extremely useful for a handful of users, myself included. In the end I may have to keep it server side, which you would likely recommend, but since the million records were only 20 MB I really thought the size issue would be pretty negligible compared with the practical benefit for a small user group, especially considering the comparable size of, say, a small video. Commented Jan 9, 2019 at 8:07

1 Answer


You could be hitting the data size limits of the browser.

The Mozilla docs mention the limits here: https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API/Browser_storage_limits_and_eviction_criteria

And here are some more IndexedDB limits documented by Google for popular browsers: https://developers.google.com/web/fundamentals/instant-and-offline/web-storage/offline-for-pwa

It seems the limits are all based on the available storage of the host OS. Check the size of the data you are expecting to import against your available storage.
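If you want to see what quota the browser is actually granting before importing, the Storage API gives an estimate in browsers that support it:

    // Logs current usage and the quota the browser allows for this origin.
    navigator.storage.estimate().then(function (estimate) {
        console.log("usage: " + estimate.usage + " bytes");
        console.log("quota: " + estimate.quota + " bytes");
    });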

  • It does look like the problem is actually hitting the data size limit. I hadn't caught this, as I expected the IndexedDB to be allocated 6% of disk space as written in the docs; however, in my Chrome browser it is only being allowed a max of 101 MB. I'm not sure why this limit is being enforced by the browser. Interestingly, the data also takes up a lot more storage within the DB than it does as a plain file. This could be a problem. Commented Jan 8, 2019 at 23:25
  • The issue was indeed that the application was hitting the size limit, but no error-handling event was being thrown. It turns out this was an issue with browser choice. While I was using Google Chrome and expecting 6% of total available storage, my browser defaults to "incognito mode", which it turns out sets different hard limits on applications. Using the dev-tools inspect options within Google Chrome, I was able to see that the data limit was being set to 101 MB for every single window within the application, which was the problem. Commented Jan 9, 2019 at 8:13
  • So if anyone else runs into a similar problem, it seems to be a pretty straightforward issue of browser choice and settings. Make sure you do not run an IndexedDB app through incognito mode. Commented Jan 9, 2019 at 8:14
