I wrote an infinite scrolling plugin for an app I'm developing. When requesting the second 'page' from the server, I loop through each image and give it a very basic onload function.
// I will have the result in a variable called data
// Fetch each image from result
$images = $( data.content ).find('img');
// Cache length for use in for loop
imagesLength = $images.length;
for ( var i = 0; i < imagesLength; i++ ) {
    // Create new Image object
    img = new Image();
    // Match src of image in the result
    img.src = ( $images[i].src );
    // Add onload listener
    img.onload = function () {
        // Increment iterator, and check if we've loaded all the images
        if ( ++imageLoadingIterator == images.length )
            _continue(); // Run _continue() if this is the case
    }
}
I feel like this isn't the best way to do this. Is there a way I can improve this function to run faster? For example, I'm thinking maybe creating a new Image object for each img tag is too much overhead.
- Why do you think it can be sped up? Because as far as I can tell, the js can't be made much faster. If it seems slow, it's probably the actual image fetching that's taking time. – Flambino, Mar 8, 2013 at 0:26
1 Answer
This is not really about performance, but more about clarity of the code:

- Attach the onload handlers first, for readability. Assigning the src is what starts the loading process, so code that kicks off a load and is expected to react to it, but has no handler attached yet, reads a bit confusingly.
- Declare one handler function in a persistent scope and have each onload refer to it, rather than creating a new handler for every image, or recreating the function every time the whole routine executes.
- Instead of find(), use the jQuery function's second parameter, the context. With it, jQuery looks for elements matching the selector inside the object passed as that second parameter.
- If the images are direct children of data.content, use children() instead of find() to avoid deep DOM traversal. Also, the lower you start in the DOM, the less deep find() has to dive to locate your elements. (See the short comparison after this list.)
- Use strict comparison as much as possible.
- Don't forget to declare the variables with var. You'd be declaring globals if you didn't.
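As a minimal sketch of the selector points, here are the three forms side by side, assuming data.content holds the markup returned for the new page and jQuery is already loaded:

// Deep traversal: searches every descendant of the wrapped content
var $deep = $( data.content ).find('img');
// Context form: equivalent result when data.content is a DOM node or jQuery object
var $ctx = $('img', data.content);
// Shallow traversal: only direct children, cheaper if the imgs sit right under the root
var $shallow = $( data.content ).children('img');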
And so:
var $images = $('img', data.content),
    imagesLength = $images.length,
    imageLoadingIterator = 0,
    img, i;

function onImageLoad() {
    if (++imageLoadingIterator === imagesLength) _continue();
}

for (i = 0; i < imagesLength; i++) {
    img = new Image();
    img.onload = onImageLoad;
    img.src = $images[i].src;
}
However, these suggestions may yield only a negligible performance gain, and they might be unnecessary, since the JavaScript engine may already be optimizing these patterns in some way.
- Thanks for the tips! I defined the variables as you recommended already, but I forgot to paste it into the question. It was behind some other (irrelevant) code so I missed it. – xyhhx, Mar 8, 2013 at 17:21
- Also, wrap your code in a (function($){ ... })(jQuery); to avoid problems with jQuery.noConflict(); and to keep those vars as local variables instead of globals. That should massively reduce any risk of outside manipulation and should increase the performance a bit more (since it doesn't have to access the over-crowded window object for your variables). – Ismael Miguel, Jan 4, 2016 at 21:41
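A sketch of that suggestion, wrapping the answer's snippet in an immediately invoked function expression; data and _continue are assumed to be available from the surrounding plugin code, as in the question:

(function ($) {
    // Everything declared here stays local to this closure,
    // and $ is guaranteed to be jQuery even after jQuery.noConflict()
    var $images = $('img', data.content),
        imagesLength = $images.length,
        imageLoadingIterator = 0,
        img, i;

    function onImageLoad() {
        if (++imageLoadingIterator === imagesLength) _continue();
    }

    for (i = 0; i < imagesLength; i++) {
        img = new Image();
        img.onload = onImageLoad;
        img.src = $images[i].src;
    }
})(jQuery);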