I have a Playlist object which contains many PlaylistItem children. If I have 10,000 children, the UI is blocked from rendering until all 10,000 children have been processed.

To avoid this, I've created a recursive function wrapped in a setTimeout. This lets a chunk of the playlist render, gives the UI a chance to update during the setTimeout pause, and then continues until the queue of chunks is empty.

Is this a good implementation? Any critiques? I know I should still implement pagination in the long run, but this is for the short term.
    render: function () {
        this.$el.html(this.template(
            _.extend(this.model.toJSON(), {
                // Mix in chrome to reference internationalize.
                'chrome.i18n': chrome.i18n
            })
        ));

        // Group playlistItems into chunks of 200 to render incrementally to prevent long-running operations.
        var chunkSize = 200;
        var playlistItemChunks = _.toArray(this.model.get('items').groupBy(function (playlistItem, index) {
            return Math.floor(index / chunkSize);
        }));

        var self = this;
        this.incrementalRender(playlistItemChunks, function () {
            self.$el.find('img.lazy').lazyload({
                container: self.$el,
                event: 'scroll manualShow'
            });
        });

        return this;
    },

    incrementalRender: function (playlistItemChunks, onRenderComplete) {
        // Render a chunk:
        if (playlistItemChunks.length > 0) {
            var playlistItemChunk = playlistItemChunks.shift();

            // Build up the views for each playlistItem.
            var items = _.map(playlistItemChunk, function (playlistItem) {
                var playlistItemView = new PlaylistItemView({
                    model: playlistItem
                });
                return playlistItemView.render().el;
            });

            // Do this all in one DOM insertion to prevent lag in large playlists.
            this.$el.append(items);

            var self = this;
            setTimeout(function () {
                self.incrementalRender(playlistItemChunks, onRenderComplete);
            });
        } else {
            onRenderComplete();
        }
    },
1 Answer
Your code looks good; I can only offer a few suggestions:
- I find it more convenient to use ...bind(this) instead of var self = this; in this case the extra variable is not necessary and the code reads better.
- Maybe it is better to track an index into the array instead of calling shift(); that way the remaining elements are not moved around on every call. Of course, JS engines are highly optimized internally, but it is good to help them a little. (A sketch combining these first two points follows below.)
- Rendering 10,000 items is still a very CPU-consuming process (especially on mobile devices), so maybe use an approach similar to the one you already use for images and build the next chunk only when the user has scrolled to the end of the list? (See the second sketch below.)
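To make the first two suggestions concrete, here is a minimal sketch (mine, not the asker's code) of how incrementalRender might look with .bind(this) and a chunk index instead of shift(). It assumes the same playlistItemChunks array and PlaylistItemView as in the question, and is an illustration rather than a drop-in replacement:

    incrementalRender: function (playlistItemChunks, onRenderComplete, chunkIndex) {
        // Default to the first chunk on the initial call.
        chunkIndex = chunkIndex || 0;

        if (chunkIndex < playlistItemChunks.length) {
            // Build the views for every playlistItem in the current chunk.
            var items = _.map(playlistItemChunks[chunkIndex], function (playlistItem) {
                return new PlaylistItemView({ model: playlistItem }).render().el;
            });

            // One DOM insertion per chunk, as in the original code.
            this.$el.append(items);

            // Yield so the UI can update, then continue with the next chunk.
            // bind() replaces the "var self = this" pattern.
            setTimeout(this.incrementalRender.bind(this, playlistItemChunks, onRenderComplete, chunkIndex + 1));
        } else {
            onRenderComplete();
        }
    },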
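The last suggestion could look roughly like the sketch below: render one chunk up front, then render the next one whenever the user scrolls near the bottom of the list. The method names (onScroll, renderNextChunk) and the 100-pixel threshold are made up for illustration, and chunk creation is assumed to happen in render() exactly as in the question:

    initialize: function () {
        this.nextChunkIndex = 0;
        this.playlistItemChunks = [];
    },

    render: function () {
        // ...template rendering and chunk creation as in the question,
        // storing the chunks on this.playlistItemChunks...
        this.renderNextChunk();

        // Render another chunk whenever the user scrolls near the bottom.
        this.$el.on('scroll', this.onScroll.bind(this));
        return this;
    },

    onScroll: function () {
        var el = this.el;
        var nearBottom = el.scrollTop + el.clientHeight >= el.scrollHeight - 100;
        if (nearBottom) {
            this.renderNextChunk();
        }
    },

    renderNextChunk: function () {
        var chunk = this.playlistItemChunks[this.nextChunkIndex];
        if (!chunk) {
            return;
        }
        this.nextChunkIndex += 1;

        var items = _.map(chunk, function (playlistItem) {
            return new PlaylistItemView({ model: playlistItem }).render().el;
        });
        this.$el.append(items);
    }

This should also pair naturally with the lazyload setup you already have, since both react to the same scroll event on the view's element.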
Thanks for responding! I definitely should be using bind -- I just haven't used it enough to be comfortable with it -- but now is a good time. I'll also strongly consider using an indexer in the array instead of breaking into chunks to prevent copying. This is a valid point. And yes, finally, I do need to support pagination -- the logic is just more complex and I was trying to create a temporary fix until I'm able to fully grasp what I need to do to implement that while reading data from a server. :) Cheers -- Sean Anderson, Nov 7, 2013
Using setTimeout to control the amount you render is the right way to go here. Your problem is that you're trying to render 10,000 elements at once; that's way too much information for a human to process. Instead, why not try paging and/or infinite scrolling?