To make it clear what I'm asking, here is my example (fiddle).
I have a list of ~500 random names. I have an input at the top that does live-style searching. On every keyup, the value of the input is taken, and every item in the list is matched against it. Items that don't match are hidden.
Subjectively, the performance is okay, but not great. If you type quickly there is a noticeable pause before the list updates. I haven't profiled the code, but the bottleneck is almost certainly the changes to the DOM and the reflows they cause.
I wonder if it's possible to “queue up” these changes and only actually apply them at the end of the loop. So it would be one giant reflow and not lots of little ones.
In another version of the fiddle, I used a RegExp to get fancier with the matching and presentation. Even though I'm doing more DOM manipulation in this one (adding/removing tags to enable match highlighting), the performance feels about the same. I also tried defining visible/hidden classes in CSS and just setting each element's className, because that is supposed to perform better (search for "javascript reflows & repaints stubbornella"; I can't post more than 2 links), but in my testing (Firefox 54) it was worse. So I don't know what's going on there.
What I guess I'm actually asking is: how do I make this code faster?
There's no point in buffering updates to the DOM; the DOM itself already does that just fine before reflowing/rerendering.
What you have to aim for is making fewer updates to the DOM, using only cheap interactions, and as few interactions as possible (where "interactions" includes getters). Oh, and never use properties that force a reflow.
500 elements are quite doable, and your first fiddle is already quite responsive for me. In the second, I have identified a few problem zones and possible improvements:
- innerText is bad. Really bad. It forces a reflow, since it takes styling into account and will not return invisible text (which also broke your fiddle). Use textContent instead.
- innerHTML is nearly as bad, as it requires the HTML parser to be invoked. 500 times. For large chunks that can sometimes be faster than manually updating every part of the DOM, but not here. Instead of destroying and recreating all these tags, keep the elements in the DOM.
- Use requestAnimationFrame instead of a very small setTimeout, so that the DOM is updated exactly once before it is rendered.
- new RegExp is also rather expensive. You only need to call it once, not once per item.
- Don't select listItems from the DOM every time the function is called; cache the array outside of the function like you do for list and search. And you can do even better: also cache their contents and the style objects, so that you don't have to access them through the DOM.
So once you fix the "Quick hacky way to remove <b>s" (as you documented it yourself), most of the problems should be gone. Here's the gist of my approach:
var search = document.getElementById('s');
var items = Array.from(document.getElementById('l').children, function(li) {
  return {
    text: li.textContent,
    style: li.style,
    pre: li.firstChild, // the text node
    match: li.appendChild(document.createElement("span"))
             .appendChild(document.createTextNode("")),
    post: li.appendChild(document.createTextNode(""))
  };
});

// assumed implementations of the show/hide helpers from the fiddle
function show(style) { style.display = ''; }
function hide(style) { style.display = 'none'; }

function searchAction() {
  var term = search.value;
  var re = new RegExp(term, 'i'); // case insensitive
  for (var {text, style, pre, match, post} of items) {
    var m = text.match(re);
    if (m) {
      pre.nodeValue = text.slice(0, m.index);
      match.nodeValue = m[0];
      post.nodeValue = text.slice(m.index + m[0].length);
      show(style);
    } else {
      hide(style);
    }
  }
}
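The requestAnimationFrame point above can be wired up roughly like this (a minimal sketch; the scheduleSearch name and the fallback shim are mine, not part of the fiddle):

```javascript
// Coalesce rapid keyup events into at most one DOM update per frame.
// In a browser, requestAnimationFrame fires once before the next paint;
// the setTimeout fallback just keeps the sketch runnable elsewhere.
var raf = typeof requestAnimationFrame === 'function'
  ? requestAnimationFrame
  : function (cb) { return setTimeout(cb, 16); };

var scheduled = false;
function scheduleSearch(action) {
  if (scheduled) return;   // an update is already queued for the next frame
  scheduled = true;
  raf(function () {
    scheduled = false;
    action();              // runs at most once per frame
  });
}
```

In the keyup handler you would then call scheduleSearch(searchAction) instead of searchAction(), so ten quick keystrokes still cause only one pass over the list before the next render.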
See updated fiddle.