Be Careful When Including Images In jQuery Auto-Suggest
Yesterday at work, we ran into a very interesting problem involving a jQuery auto-suggest feature on one of our client sites. We had implemented auto-suggest on this particular site several times before and it had always proved to be very zippy and responsive. This time, however, the "suggest" page requests were taking 5, 6, sometimes 8 seconds to complete; this kind of delay clearly defeats the purpose of auto-suggest. But, more than that, I was getting extremely frustrated with this seemingly nonsensical delay that none of the other auto-suggest instances were exhibiting.
After checking the HTML, the jQuery, and the SQL query powering the auto-suggest, I was just about at my wit's end; then, as I stared blankly at Firebug's network activity in disbelief, it finally dawned on me! This particular auto-suggest feature's result set included thumbnail images of products (it was an e-commerce website). That was the problem: there are only so many parallel requests that the browser can make to the same domain at the same time. The product thumbnails were hogging all of the available HTTP connections; as such, when the auto-suggest went to make a subsequent request, it got queued behind the pending image requests.
As you can see in the video, when the auto-suggest feature launches a second request, it gets queued behind all of the pending image requests. My particular browser (all browsers are different and can be custom configured) can handle 6 parallel requests to the same domain. As such, you can see in Firebug that not even all of the image thumbnail requests can be made at the same time; of the 10, only the first 6 actually process, leaving 4 pending. Only when the first 6 thumbnails load can the last 4 be processed. At that point, since only 4 concurrent requests remain, the second auto-suggest request can finally be processed.
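If you want to watch this queuing happen yourself, here is a rough sketch (using the demo's img.cfm and results.cfm pages, listed below) that you could run from the Firebug console:

// Rough sketch: fire ten slow image requests, then one AJAX request,
// and log when each one completes. With a limit of 6 parallel
// connections per domain, the AJAX response cannot come back until
// enough image requests have finished to free up a connection.
var start = (new Date()).getTime();

for (var i = 0 ; i < 10 ; i++){
	var img = new Image();
	img.onload = function(){
		console.log( "Image done at:", ((new Date()).getTime() - start), "ms" );
	};
	// randRange-style cache-busting, as in the demo below.
	img.src = "img.cfm?id=" + Math.floor( Math.random() * 99999 );
}

jQuery.get(
	"results.cfm",
	{ query: "a" },
	function(){
		console.log( "AJAX done at:", ((new Date()).getTime() - start), "ms" );
	}
);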
The solution is not to remove images from the auto-suggest results - remember, we want to create a rich, interactive experience for our users. The thumbnails themselves aren't the issue; the per-domain limit on parallel requests is. Since the browser limits how many parallel requests it can make to any one domain, rather than removing the images, we "simply" have to pull them from a different domain. Once we do that, the browser has no problem processing more than 6 parallel requests.
As you can see in this video, once the thumbnails are being pulled from a different domain than the auto-suggest results, the browser no longer needs to queue the auto-suggest requests. As such, the suggestions come back quickly even when there are many pending images still loading.
To be honest, multi-domain strategies are something that I understand in theory but have never really put into effect. Things like loading binary assets off of Amazon S3 or off of sub-domains make perfect sense - I just haven't implemented them all that much in production. After seeing this problem, however, it definitely feels like something that I need to start adopting as a best practice. Whether or not I need it for a particular site, I certainly don't ever want to run into a situation where domain use prevents me from delivering the most compelling user experience possible.
While the code I was using is not really the point of this post, I will list it below for reference. Here is the main HTML page. Please keep in mind that this was not supposed to be an accurate auto-suggest feature - I just needed something that "looked" like auto-suggest to test the feature:
test.htm
<!DOCTYPE HTML>
<html>
<head>
	<title>jQuery Auto-Suggest And Images</title>
	<style type="text/css">

		#results {
			background: #F0F0F0 ;
			border: 1px solid #999999 ;
			display: none ;
			font-family: verdana ;
			font-size: 11px ;
			position: absolute ;
		}

		#results a {
			display: block ;
			overflow: hidden ;
			padding: 5px 5px 5px 5px ;
			height: 47px ;
			zoom: 1 ;
		}

		#results a:hover {
			background-color: #E0E0E0 ;
		}

		#results img {
			background-color: #D5D5D5 ;
			border: 1px solid #CCCCCC ;
			float: left ;
			height: 45px ;
			margin-right: 8px ;
			width: 68px ;
		}

		#results strong {
			display: block ;
			margin-bottom: 3px ;
		}

	</style>
	<script type="text/javascript" src="../jquery-1.4.2.js"></script>
	<script type="text/javascript">

		// When the DOM is ready, initialize the script.
		jQuery(function( $ ){

			// Get references to our main DOM items.
			var criteria = $( "#criteria" );
			var results = $( "#results" );

			// Resize and position the results element to be below
			// the criteria input (for this demo, we're not going
			// to worry about any window resizing issues).
			results.css({
				left: (criteria.offset().left + "px"),
				top: (criteria.offset().top + criteria.outerHeight() + "px"),
				width: (criteria.outerWidth() + "px")
			});

			// Turn off auto-complete on the criteria input.
			criteria.attr( "autocomplete", "off" );

			// This will keep track of the AJAX request object
			// so that we can abort it if necessary.
			var xhr = null;

			// This will keep track of the key-up timer so that
			// we don't launch too many requests to the server.
			var resultsTimer = null;

			// Get the results from the server for the auto-suggest.
			var getResults = function( query ){

				// Get the results.
				xhr = $.ajax({
					type: "get",
					url: "results.cfm",
					data: {
						query: query
					},
					dataType: "html",
					success: function( response ){

						// Populate the results.
						results.html( response );

						// Show or hide the results as necessary.
						if (response.length){

							// Show the results.
							results.show();

						} else {

							// Hide the results.
							results.hide();

						}
					}
				});

			};

			// Bind the key-up event on the input.
			criteria.keyup(
				function( event ){

					// Clear any existing timer.
					clearTimeout( resultsTimer );

					// Abort any pending request.
					if (xhr){
						xhr.abort();
					}

					// Get the current query value.
					var query = criteria.val();

					// Set a timer to get the next set of results.
					resultsTimer = setTimeout(
						function(){
							getResults( query );
						},
						150
					);

				}
			);

		});

	</script>
</head>
<body>

	<form>

		<input id="criteria" type="text" size="40" />

		<div id="results">
			<!-- To be populated dynamically. -->
		</div>

	</form>

</body>
</html>
Here is the code that gathers the auto-suggest results. Notice that it includes code for using either of two different domains in the IMG SRC attribute.
results.cfm
<!--- Param the query value. --->
<cfparam name="url.query" type="string" />

<!---
	Make sure the query never goes more than 10 characters -
	this is ONLY for demo purposes.
--->
<cfset url.query = left( url.query, 10 ) />

<!---
	Store the content of the response. For demo purposes, we
	are simply going to make the number of results (10) minus
	the number of letters in the query.
--->
<cfsavecontent variable="results">
	<cfoutput>

		<!--- Local domain. --->
		<cfset domain = "./" />

		<!--- Different domain. --->
		<!---
		<cfset domain = "http://localhost:8501/jquery/autosuggest/" />
		--->

		<cfloop
			index="index"
			from="#len( url.query )#"
			to="10"
			step="1">

			<!---
				When we make the result line item, notice that
				we are setting the SRC of the IMG tag to be a
				ColdFusion page - this will come into play with
				the simulated delay.

				NOTE: I am using the randRange() function to
				prevent any caching attempts by the browser.
			--->
			<a href="##">
				<img src="#domain#img.cfm?id=#randRange( 1, 99999 )#" />
				<strong>Tanya Hyde</strong>
				One of the hottest female bodybuilders.
			</a>

		</cfloop>

	</cfoutput>
</cfsavecontent>

<!--- Convert the response to binary for streaming. --->
<cfset binaryResponse = toBinary( toBase64( trim( results ) ) ) />

<!---
	Set the content length so the browser knows how much
	content to expect.
--->
<cfheader
	name="content-length"
	value="#arrayLen( binaryResponse )#"
	/>

<!--- Stream the content back. --->
<cfcontent
	type="text/html"
	variable="#binaryResponse#"
	/>
And, here is the page that loaded the image thumbnail. The only reason this had to be a ColdFusion page, rather than a direct link to the image file, was to allow ColdFusion to simulate network latency; I had to sleep the requests so that they didn't complete immediately.
img.cfm
<!---
	Sleep the thread for ten seconds to simulate network
	delays and request latency.
--->
<cfthread
	action="sleep"
	duration="#(10 * 1000)#"
	/>

<!--- Simply stream the image back to the client. --->
<cfcontent
	type="image/*"
	file="#expandPath( './thumbnail.jpg' )#"
	/>
Anyway, this was just an interesting problem that had me stumped for a good thirty minutes. It definitely gives me a lot of food for thought on what I want to consider "best practices" for asset delivery going forward.
Want to use code from this post? Check out the license.
Reader Comments
Could you not have used a dynamic sprite image so you would only have had the one image request?
@Redsquare,
It's funny you mention that - I actually considered that idea. My concern with that was that I would have to create a new sprite for every search results set; I wasn't sure if ColdFusion could resize and paste so many images without seeing a speed issue.
But, it would be something cool to play with; perhaps I'll explore that in a follow-up post. Thanks!
Using data URLs for the images would eliminate the extra requests in non-IE browsers, as the image data would simply be base64-encoded and returned in the same response as the suggestion data itself. You would still need to use a separate domain to serve the images for IE, though. I like being able to serve my static files from a separate domain anyway, for this very reason.
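For example, a thumbnail could arrive already embedded - a rough sketch, with the base64 payload truncated:

// Sketch: a thumbnail delivered as a base64 data URI (payload
// truncated here). The image bytes travel inside the AJAX response
// itself, so no separate HTTP request - and no connection slot -
// is needed for the thumbnail.
var img = new Image();
img.src = "data:image/jpeg;base64,/9j/4AAQSkZJRg...";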
Following Ryan's suggestion and using base64-encoded images may also allow you to dequeue pending thumbnails by aborting the XHRs.
Thanks for explaining this so thoroughly, Ben. It seems like a very useful technique.
@Ryan, @jdbartlett,
That's an interesting idea. I looked into base64 images a while back (for something unrelated) and found that they weren't so cross-browser friendly. Not sure if it was just IE6 or all IE (as @Ryan seems to be implying). I'll do some more exploration into that.
data: URIs don't work in IE7 and prior, but are supported in IE8. I can't test it right now, so I'm not sure if earlier versions of IE degrade relatively gracefully (by showing nothing), or if they display a broken-image icon.
One of the things that can be done is to make use of the way browsers cache images. A lot of the time, people link directly to a script in the format you have in your code: <img src="Domain/img.cfm?id=#id#" />. The problem with that is that the browser often won't cache it, because it treats the GET params as unique data. The way around this is to use URL rewriting - for example, <img src="Domain/img/#id#/img.cfm" />. The browser then sees this as a unique, singular file and will cache it, decreasing the load time for other images that are the same.
@CJM,
Ah, good to know. I can understand not supporting IE6; but, supporting IE7 is probably still necessary.
@Kathryn,
You make a really good point. In my example, I happened to be doing everything I could to prevent caching (since I wanted to be creating parallel request threads); but, if I wanted to leverage browser caching, building the unique URLs would definitely be a solid strategy.
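In other words, the difference is just the shape of the URL - a quick sketch (productID here is hypothetical):

var img = new Image();
var productID = 123; // Hypothetical product ID.

// Cache-busting URL, as in the demo - every request looks unique,
// so the browser re-downloads each time:
img.src = "img.cfm?id=" + Math.floor( Math.random() * 99999 );

// Cache-friendly, rewritten-style URL - stable, so the browser can
// cache it across result sets:
img.src = "img/" + productID + "/img.cfm";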
I don't know about IE, but for Mozilla there are actually four settings buried in the config (my current values):
network.http.max-connections
network.http.max-connections-per-server
network.http.max-persistent-connections-per-proxy
network.http.max-persistent-connections-per-server
You're addressing the second one here, but not the first (comes out of the box as 2x the second).
And of course you never really know what Joe User's done to his browser ("Do you know what you're doing?" prompt aside), so the guy with those prefs set to 6/3 is gonna bog down pretty fast no matter what ;-).
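For reference, those prefs can also be pinned in a user.js file; a sketch with purely illustrative values:

// Hypothetical user.js overrides - the values here are
// illustrative, not recommendations:
user_pref("network.http.max-connections", 30);
user_pref("network.http.max-connections-per-server", 15);
user_pref("network.http.max-persistent-connections-per-proxy", 8);
user_pref("network.http.max-persistent-connections-per-server", 6);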
@Jim,
I've only ever messed with the config in FireFox. Not sure what's available in the other browsers.
We do something similar when loading pages with a lot of images.
I noticed the browser would slow down on image-heavy pages even though the bandwidth usage was minimal.
We have several sub-domains that are just for this purpose, e.g. //imgs0.domain.com, //imgs1.domain.com.
Then, for each image, a call is made to grab a different host.
This really speeds up the load time; our JavaScript and CSS are also on their own domains, which speeds things up considerably as well.
<cfset thisImageHost = 0 />
<cfset imgHosts = [ "imgs0", "imgs1", "imgs2", "imgs3" ] />
<cfset thisImageUrl = ".domain.com/imagedb/products/180x210/" />

<cffunction name="getImageHost">

	<!---
		Cycle through the hosts. NOTE: ColdFusion arrays are
		1-based, so we wrap back around to 1, not 0.
	--->
	<cfif thisImageHost GTE arrayLen( imgHosts )>
		<cfset thisImageHost = 1 />
	<cfelse>
		<cfset thisImageHost = thisImageHost + 1 />
	</cfif>

	<cfreturn ("http://" & imgHosts[ thisImageHost ] & thisImageUrl) />
</cffunction>
@Kevin,
Yeah, this looks like a pretty good practice. I have to look into setting up a "*" sub-domain with my registrar so I can start handling things like that on my server.
The only thing that would be frustrating about this approach is that you need a full URL, not just a relative URL (meaning, you need the server name). This seems like just another thing to worry about when going from development to production servers.
This is the same method that Google uses for their Google Maps requests-- they use 4 sub-domains just for their image requests.
We had to follow suit because we show a lot of thumbs on each request-- especially on our wholesale site and our art archives (not public).
Managing the separate configurations is always an issue.
I have been using ANT to compile my builds now.
I can do a programmatic configuration using a local server to generate a ColdSpring XML list map.
In the XML, I include the list map bean to create the array of sub-domains, so my local build can have a local static relative URL, and my generated build can have the array of sub-domains to use when generating the HTML.
With JavaScript it would be even easier, since the URLs can be dynamic in the requests.
It could be done just as easily by providing a configuration variable with an array of sub-domains, or just a single relative path, depending on the environment.
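For example (host names and path here are just placeholders):

// Cycle image requests across several sub-domains.
var imgHosts = [ "imgs0", "imgs1", "imgs2", "imgs3" ];
var hostIndex = 0;

function getImageUrl( fileName ){
	hostIndex = ((hostIndex + 1) % imgHosts.length);
	return( "http://" + imgHosts[ hostIndex ] + ".domain.com/imagedb/products/180x210/" + fileName );
}

// Usage:
// imageNode.src = getImageUrl( "12345.jpg" );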
The environmentConfig.cfc on RIAForge could handle this quite well.
We used the wildcard method at first, but we switched to the static URLs because we wanted a bit more control over it, and it used fewer resources overall (the wildcard lookup adds a bit of overhead).
This is a good article, though-- I have had situations like this using AJAX where I could not figure out why it was so slow; this is a common issue across the board.
I'm sure most people's solution to this would be to nuke the images; I like the point you made that you wanted to keep them to "create a rich, interactive experience," which is the whole point of AJAX. Persistence is key here. You didn't take a step backwards and remove the images; you found a solution. Good stuff man. I love it.
@Kevin,
I think my desire to create a "*" sub-domain is simply a response to the "emotional" feeling that somehow I am gonna need to do a LOT of work. This, of course, is simply not true; it's really just a one-time setup if I want to create several sub-domains. So, as you are saying, going static could be good.
As far as the number of sub-domains used, any advice on how many to use? If Google uses 4, I have to assume that's a really good number :) I would also assume that at some point there is simply a diminishing return on the number of parallel threads (as there is probably a limit to how much the browser can even take advantage of this, no matter how many sub-domains you add).
Here is a great reference from Google on parallelizing downloads across hostnames:
http://code.google.com/speed/page-speed/docs/rtt.html#ParallelizeDownloads
@Kevin,
Thanks for the link. I keep wanting to try this out on something, but I don't know what. Maybe this blog! I kind of wish I had more stuff linked.
Would it be possible to cancel "old" image requests using this snippet of code from Stack Overflow?
http://stackoverflow.com/questions/930237/javascript-cancel-stop-image-requests
It essentially "clicks" the stop button in your browser. So, could you use it just before each AJAX call to halt the loading of previous images?
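For reference, the core of that snippet is roughly:

// What the linked snippet boils down to: invoke the browser's Stop
// action, which cancels any in-flight image downloads.
function stopImageRequests(){
	if ( window.stop ){
		// Most browsers.
		window.stop();
	} else if ( document.execCommand ){
		// Older versions of IE.
		document.execCommand( "Stop" );
	}
}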