Tags: javascript, json, google-chrome, typeahead.js, typeahead

typeahead.js, localStorage and a large JSON file


I've got a JSON file which is about 1 MB in size. I tried to implement typeahead.js with a simple example like this:

    <div class="container">
      <p class="example-description">Prefetches data, stores it in localStorage, and searches it on the client:</p>
      <input id="my-input" class="typeahead" type="text" placeholder="input a country name">
    </div>

    <script type="text/javascript">
      // Wait for the DOM to be ready...
      $(function() {

        // apply typeahead to the text input box
        $('#my-input').typeahead({
          name: 'products',

          // data source
          prefetch: '../php/products.json',

          // max number of items listed in the dropdown
          limit: 10
        });

      });
    </script>

But when I launch it, Chrome says:

    Uncaught QuotaExceededError: Failed to execute 'setItem' on 'Storage': Setting the value of '__products__itemHash' exceeded the quota.

What can I do? I'm using typeahead.min.js.


Solution

  • You are seeing that error because typeahead's prefetch stores the fetched data in localStorage, and writing it exceeds the browser's localStorage quota for your site.

    Firstly, storing 1 MB of data on the client side is not great in terms of user experience.
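
    If you want to see the numbers for yourself, here is a quick diagnostic sketch for the console. It is only a rough estimate: the __products__itemHash key is taken from your error message, and the factor of 2 assumes strings are stored as UTF-16.

    // Rough estimate of how much localStorage this origin already uses
    var used = 0;
    for (var key in localStorage) {
      if (localStorage.hasOwnProperty(key)) {
        used += (key.length + localStorage[key].length) * 2; // ~2 bytes per UTF-16 character
      }
    }
    console.log((used / 1024).toFixed(1) + ' KB of localStorage in use');

    // Remove the entry typeahead failed to write, so the next attempt starts clean
    localStorage.removeItem('__products__itemHash');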

    That said, you can still solve the problem with typeahead's multiple-datasets feature. It is just a workaround and may not be the most elegant solution, but it works well.

    The sample data I tested with was >1 MB: a JSON object whose friends property is an array of about 76,000 name strings.

    You can view the sample here (it takes a while to open).
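
    Since the screenshot is not available here, this is the shape of data the code below assumes: a top-level friends array of plain strings. The names are only illustrative:

    {
      "friends": ["Sean", "Benjamin", "..."]
    }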

    Procedure:

    1. First, download the entire data set using $.getJSON.
    2. Then split the data into chunks of 10,000 items (just a magic number that worked for me across browsers; find yours).
    3. Create a Bloodhound for each chunk and store them all in an array.
    4. Finally, initialize typeahead with that array.

    Code:

    $.getJSON('data.json').done(function(data) { // download the entire data set
      var dataSources = [];
      data = data['friends']; // the payload is wrapped in a 'friends' property
      var i, j, tempArray, chunkSize = 10000; // break the data into chunks of 10,000
      for (i = 0, j = data.length; i < j; i += chunkSize) {
        tempArray = data.slice(i, i + chunkSize);
        var d = $.map(tempArray, function(item) {
          return {
            item: item // wrap each string so Bloodhound can tokenize it by key
          };
        });
        dataSources.push(getDataSources(d)); // push each Bloodhound-backed source into the array
      }
      initTypeahead(dataSources); // initialize typeahead
    });

    function getDataSources(data) {
      var dataset = new Bloodhound({
        datumTokenizer: Bloodhound.tokenizers.obj.whitespace('item'),
        queryTokenizer: Bloodhound.tokenizers.whitespace,
        local: data,
        limit: 1 // with 76,000 items I had 8 chunks; each chunk contributes 1 suggestion, so the overall list length was 8
      });
      dataset.initialize();
      var src = {
        displayKey: 'item',
        source: dataset.ttAdapter()
      };
      return src;
    }

    function initTypeahead(data) {
      $('.typeahead').typeahead({
        highlight: true
      }, data); // here is where you pass the array of datasets
    }
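
    Once this is wired up, every suggestion the user picks is one of the { item: ... } objects built above. As a small optional sketch, typeahead.js 0.10.x fires a typeahead:selected event you can listen to:

    $('.typeahead').on('typeahead:selected', function(event, datum) {
      console.log('picked: ' + datum.item); // datum has the { item: ... } shape built by $.map above
    });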
    

    I created a demo here with 20 items and a chunkSize of 2, just to show how multiple datasets generally work: 20 items split into chunks of 2 give 10 Bloodhound datasets, each contributing at most one suggestion. (Search for Sean or Benjamin.)

    Hope this helps.