Possible to add large amount of DOM nodes without browser choking?
I have a webpage on my site that displays a table, reloads the XML source data every 10 seconds (with an XmlHttpRequest), and then updates the table to show the user any additions or removals of the data. To do this, the JavaScript function first clears out all elements from the table and then adds a new row for each unit of data.
Recently, I battled through a number of memory leaks in Internet Explorer caused by this DOM destroy-and-create code (most of them having to do with circular references between JavaScript objects and DOM objects, and with the JavaScript library we are using quietly keeping a reference to every JS object created with new Element(...) until the page is unloaded).
With the memory problems solved, we've now uncovered a CPU-based problem: when the user has a large amount of data to view (100+ units of data, which equals 100 <tr> nodes to create, plus all of the table cells for each column), the process ties up the CPU until Internet Explorer prompts the user with:
Stop running this script?
A script on this page is causing Internet Explorer to run slowly. If it continues to run, your computer may become unresponsive.
It seems that running the row-and-cell-creation code for 100+ pieces of data is what causes the CPU usage to spike and the function to take "too long" (from IE's perspective) to run, which makes IE show this warning to the user. I've also noticed that while the "update screen" function runs for the 100 rows, IE does not re-render the table contents until the function completes (presumably because the JS interpreter is using 100% of the CPU for that period).
So my question is: Is there any way in JavaScript to tell the browser to pause JS execution and re-render the DOM? If not, are there any strategies for handling creating large amounts of DOM nodes and not having the browser choke?
One method I can think of would be to handle the "update table" logic asynchronously; that is, once the Ajax method to reload the XML data is complete, put the data into some sort of array, and then set a function (using setInterval()) to run which handles one element of the array at a time. However, this feels a little like re-creating threading in a JavaScript environment, which could get very complicated (e.g., what if another Ajax data request fires while I'm still re-creating the table's DOM nodes?).
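As a rough illustration of that idea, here is a minimal sketch of batch-at-a-time rendering. Everything here is hypothetical (the function names, the chunk size, and the token-based guard against a newer Ajax response arriving mid-rebuild are my own); the scheduler is injectable only so the sketch is easy to test outside a browser — in a page you would just let it default to setTimeout.

```javascript
var latestToken = 0;

// Render rows a few at a time, yielding back to the browser between
// batches so the page can repaint and the "slow script" dialog never fires.
function renderInChunks(rows, renderRow, chunkSize, schedule) {
  // Default scheduler: run the next batch in its own timer callback.
  schedule = schedule || function (fn) { setTimeout(fn, 0); };
  var token = ++latestToken; // identifies this rendering pass
  var index = 0;

  function step() {
    if (token !== latestToken) return; // newer data arrived: abandon this pass
    var end = Math.min(index + chunkSize, rows.length);
    for (; index < end; index++) {
      renderRow(rows[index]);
    }
    if (index < rows.length) schedule(step);
  }
  step();
}
```

The token check is one simple answer to the "what if another Ajax request fires mid-rebuild?" worry: each new call invalidates any pass still in flight, so only the newest data ever finishes rendering.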
Update: Just wanted to explain why I'm accepting RoBurg's answer. In doing some testing, I've found that the new Element() method in my framework (I'm using mootools) is about 2x as slow as the traditional document.createElement() in IE7. I ran a test that creates 1000 <span> elements and adds them to a <div>: using new Element() takes about 1800ms on IE7 (running on Virtual PC), while the traditional method takes about 800ms.
My test also revealed an even quicker method, at least for a simple test such as mine: using DocumentFragments as described by John Resig. Running the same test on the same machine with IE7 took 247ms, a 9x improvement from my original method!
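For reference, a minimal sketch of that DocumentFragment approach: build every row on a detached fragment, then append the fragment once, so the live table is only touched a single time. The function name and the single-column row shape are my own; doc is passed in purely so the sketch can run outside a browser — in a page you would call it with document.

```javascript
// Build all <tr> nodes on a detached DocumentFragment, then append the
// fragment to the live table in one operation. Appending a fragment moves
// its children, so the table receives all rows with a single insertion.
function buildRows(doc, table, units) {
  var fragment = doc.createDocumentFragment();
  for (var i = 0; i < units.length; i++) {
    var tr = doc.createElement('tr');
    var td = doc.createElement('td');
    td.appendChild(doc.createTextNode(units[i].name));
    tr.appendChild(td);
    fragment.appendChild(tr);
  }
  table.appendChild(fragment); // one reflow instead of units.length reflows
}
```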
100 <tr>'s isn't really that much... are you still using that framework's new Element()? That might be the cause of it.
You should test the speed of new Element() vs document.createElement() vs .innerHTML.
Also try building the DOM tree "in memory", then appending it to the document at the end.
Finally, watch that you're not reading .length too often, or other bits and bobs like that.
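To make the .length point concrete, here is a small sketch (the function and helper names are my own). Against a live DOM collection such as table.rows or childNodes, old IE recomputes .length on every access, so read it once before the loop:

```javascript
// Iterate a (possibly live) collection while reading .length only once.
function collectRowText(rows, getText) {
  var texts = [];
  for (var i = 0, len = rows.length; i < len; i++) { // .length cached in len
    texts.push(getText(rows[i]));
  }
  return texts;
}
```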
I have experienced similar problems at around 3000 table rows of complex data, so there is something not entirely right with your code. How does it run in Firefox? Can you check in several different browsers?
Are you binding to onPropertyChange anywhere? This is a really dangerous IE event that has caused me severe IE-specific headaches before. Are you using CSS selectors anywhere? These are notoriously slow in IE.
You can create a string representation and add it as the node's innerHTML.
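A minimal sketch of that string-building approach (function name and row shape are my own). Pieces are collected in an array and joined once, which was a common optimization for old IE's slow string concatenation; note that real data would need HTML-escaping first, and that IE does not allow assigning innerHTML directly on a <table> or <tbody>, so in practice the string is assigned to a wrapper element:

```javascript
// Build the markup for all rows as one string, to be assigned to
// innerHTML in a single operation instead of many DOM insertions.
function buildTableHtml(units) {
  var html = [];
  for (var i = 0; i < units.length; i++) {
    html.push('<tr><td>', units[i].name, '</td></tr>');
  }
  return html.join('');
}

// In the page (wrapper element because of IE's table innerHTML restriction):
// document.getElementById('tableWrapper').innerHTML =
//   '<table>' + buildTableHtml(units) + '</table>';
```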