Compressed or uncompressed?

gregp
Guest
What's the diff, apart from the file size? Does the uncompressed version have more functionality? I would have thought so, but I've seen posts about other lightboxes where certain things only work with the compressed library. Is there a list somewhere of pros and cons, and when to use each?

Greg
Administrator
Registered: Aug 2008
Posts: 3382
There should be absolutely no difference other than the file size. If there are any differences in behaviour then the compressor is buggy. I use Dean Edwards' JavaScript compressor (Packer), which has a reputation for reliability and correctness, and which allows me to keep the config section uncompressed while squishing the rest.

I include compressed versions because some people like them, but I don't use the compressed version on my own site.

The difference in file size is about 40k. In 56k modem days this was a big deal and would add about 10 seconds to the page download time. Now I assume just about everyone is working with broadband, and if you've got 2 Mbps service that extra 40k takes about a fifth of a second to load.
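(Back-of-the-envelope: 40 KB is roughly 320 kilobits, and 320 kb / 2,000 kbps ≈ 0.16 seconds on a 2 Mbps line; over a 56 kbps modem the same 320 kb works out to about 6 seconds at the nominal rate, and closer to 10 at real-world modem throughput.)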

I've never measured it, but I bet it takes about the same amount of time to decompress the script. And the script needs to decompress every time it's run, whereas the download happens only once thanks to browser caching. Taking that into consideration, I bet that on a fast pipe the compression actually slightly increases average load times.

Hmmm, now that I've written it all out, I guess I have to conclude that compression is a BAD THING.
gregp
Guest
I guess I could have answered my own question somewhat by looking deeper into the compressed version :roll: I just opened the file and looked at the start and it looked the same so I presumed bits were pulled out further down. Thanks. I think I'll stick with the uncompressed version - if I ever want to see how something's done I think I'll be hard pushed to understand anything in the compressed one! :lol:
i960
Guest
@admin - you are right about it taking just as long to decompress the script on the fly, but that's only because you are using Packer. There are other compressors out there that don't base62-encode the files, so their output can just run on the client end without any decompression step. YUI Compressor is my favorite. While it doesn't compress quite as much as Packer does, once you take gzip compression into account the difference is negligible. Unless I need it done on the fly, I always use YUI. Here's a good comparison tool:

http://compressorrater.thruhere.net/
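For reference, YUI Compressor is a command-line Java tool. A typical invocation looks something like this (the jar name is generic here; use whatever version you've downloaded):

java -jar yuicompressor.jar floatbox.js -o floatbox-min.js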
i960
Guest
I went ahead and ran Floatbox through that tool just to give you an idea. I ran the entire script through just for simplicity's sake, but you will obviously want to leave the copyright and configuration section in place. Here are the results:

Without gzip:

Packer - 31620 bytes
YUI - 50910 bytes

Obviously in this case Packer does a better job compressing, but it has to decompress on the fly. Now look what happens when we add gzip.

With gzip:

Packer - 14470 bytes
YUI - 15105 bytes

That's only a difference of 635 bytes. From a download time perspective, they are identical. Now just for completeness' sake, here is the result using only gzip and nothing else:

Gzip only: 18089 bytes.

That's only 2984 bytes saved by using YUI. That could also be considered close enough to not matter, but I like to squeeze every byte I can with as few negative side effects as possible. Plus, not everyone is going to be using gzip. So if you are going to offer a compressed version of your script, I think YUI would be a better choice. Just my two cents.
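If you want to reproduce byte counts like these locally, command-line gzip makes it easy (floatbox-min.js here stands in for whatever the YUI output is named, and sizes will vary a little with gzip level and version):

gzip -9 -c floatbox.js | wc -c
gzip -9 -c floatbox-min.js | wc -c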
Greg
Administrator
Registered: Aug 2008
Posts: 3382
It's my turn to ask for assistance. 8)
I have tried hard, and failed, to get js (and css) files to load using gzip on my site. I've got floatbox.js and floatbox.js.gz in the same folder and have tried variations on the following .htaccess file to get them served out. Whatever I try, my host's Apache server fails with a 500 error. Can anyone offer suggestions as to what the problem might be and how to make this work???
Thanks in advance...

Options -MultiViews

<FilesMatch "\.js\.gz$">
ForceType text/javascript
Header set Content-Encoding: gzip
</FilesMatch>

<FilesMatch "\.css\.gz$">
ForceType text/css
Header set Content-Encoding: gzip
</Files>

<IfModule mod_rewrite.c>
RewriteEngine on
RewriteBase /
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteCond %{HTTP:Accept-encoding} gzip
RewriteCond %{HTTP_USER_AGENT} !Safari
RewriteCond %{HTTP_USER_AGENT} !Konqueror
RewriteRule ^(.+)$ $1.gz [QSA,L]
</IfModule>
i960
Guest
Do you have access to your Apache configuration file? Sounds to me like mod_headers is not enabled.
Greg
Administrator
Registered: Aug 2008
Posts: 3382
Doh!

Notice my mismatched
<FilesMatch "\.css\.gz$">...
</Files> :oops:

It's funny how stupid little things like that can take hours to find.
Serving the gzipped files is working fine now. I'll include a little blurb in the package for how to use them with Apache.
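For anyone else who hits the same 500 error, the fix is simply to close the block with the matching tag:

<FilesMatch "\.css\.gz$">
ForceType text/css
Header set Content-Encoding: gzip
</FilesMatch>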
i960
Guest
Good eye. I didn't even notice that. :P

This is how I am doing gzip compression:


<Location />
# Insert filter
SetOutputFilter DEFLATE

# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html

# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip

# MSIE masquerades as Netscape, but it is fine
# BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
# the above regex won't work. You can use the following
# workaround to get the desired effect:
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html

# Don't compress images
SetEnvIfNoCase Request_URI \
\.(?:gif|jpe?g|png)$ no-gzip dont-vary

# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary
</Location>


It does it all on the fly, no need for separate gzipped files. I'm pretty sure it caches the compressed versions too.

The reason I mentioned mod_headers is that it is not enabled by default. When I first tried to use the code above I was getting 500 errors too, because of the Header directive.
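If you do have access to the server config, turning it on is a one-liner (the module path varies by install):

LoadModule headers_module modules/mod_headers.so

On Debian-style Apache installs, running a2enmod headers does the same thing.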
Greg
Administrator
Registered: Aug 2008
Posts: 3382
I'm using a hosting company, 1and1.com, which doesn't turn on mod_deflate. I think a lot of hosting companies turn this off due to the server computational load it adds. Under these circumstances, we're pretty much stuck with providing our own pre-gzipped files.
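Pre-gzipping at deploy time is straightforward with command-line gzip:

# -c writes to stdout, so the uncompressed originals stay in place as the fallback
gzip -9 -c floatbox.js > floatbox.js.gz
gzip -9 -c floatbox.css > floatbox.css.gz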

Now that it's working for me, I did some load-time measurements, being careful to keep caching out of the picture. Here's what Firebug reports:

Compressed:
floatbox.js 19k 200ms
floatbox.css 3k 212ms

Uncompressed:
floatbox.js 88k 203ms
floatbox.css 14k 210ms

Hmmmm....
i960
Guest
Interesting... when I get a chance I'll set up a test install on my server and see if I get similar results. It could be that Floatbox is just too small to benefit much from compression, assuming a broadband connection. I'm guessing on a dialup connection the difference would be much larger. Also, what level of compression did you use for gzip?
i960
Guest
How are you getting your numbers? Here is my test page:

http://www.industrialtechware.com/fbtest/

And the results from YSlow:

Compressed: [YSlow screenshot]

Uncompressed: [YSlow screenshot]

I used my method of gzipping on the fly, with a slightly different version of the config I posted above. That one was straight out of the Apache docs and gave me a 500 error, likely because <Location> blocks aren't allowed in .htaccess. Here is exactly what I have in mine:


<IfModule mod_deflate.c>

# Insert filter
SetOutputFilter DEFLATE

# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html

# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip

# MSIE masquerades as Netscape, but it is fine
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# Don't compress images
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

# Make sure proxies don&#039;t deliver the wrong content
Header append Vary User-Agent env=!dont-vary

</IfModule>


I would think you would see even better results serving up pre-compressed files. No idea what's going on there.
Greg
Administrator
Registered: Aug 2008
Posts: 3382
Your numbers match expectations better than mine, but I consistently get very similar timings between gzipped and plain files.

I'm getting my numbers from the Net panel in Firebug. I added YSlow and it shows the exact same numbers as the Net panel. I'm using max compression (-c9).

Perhaps latency between me and my hosting servers is swamping the raw transfer time.

Further on the YUI compressor: Floatbox compressed with YUI breaks in IE. Other browsers are fine. The breakage occurs with any combination of YUI compressor options. I pulled those compressed (minified) files out of the download package and replaced the .gz ones with new ones that did not first go through the compressor.

This evening I'll try to put up a couple of test pages, one with gzip, one without, and invite you to take a couple of measurements to see what you see.

Cheers...
i960
Guest
admin wrote
I'm using max compression (-c9).


That might be part of the problem. Typically max compression does not yield any significant size gains, but it takes quite a bit longer to compress. I'm not sure how it affects the decompression side, but it may affect that as well. Try a much lower compression level (like 1) and see if that helps.

Regarding YUI, I'm surprised it's breaking Floatbox. Usually YUI is very safe, probably the safest one I know of. I've seen JSMin break scripts but not YUI.
Greg
Administrator
Registered: Aug 2008
Posts: 3382
Yes, the YUI compressor has a good reputation for not breaking things. The usual suspect when compression breaks code is a script that is not well formed and is missing some needed semicolons. That is not the case here: I ran the code through JSLint first to confirm that all the semicolons were in place.

There are IE conditional compilation statements in my code. The YUI compressor handles these, but it is screwing up the script structure in some manner. When my code comes back from the compressor it has a couple of extra curly braces right after the IE conditional compilation.
Pre:
/*@end @*/
fb_prevOnload = ...

Post:
/*@end @*/
}}fb_prevOnload=...

Clearly a compressor bug.
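For anyone unfamiliar with the feature: conditional compilation hides IE-only code inside comments that only IE's JScript engine compiles, along these lines (a generic example, not Floatbox's actual code):

var isIE = false;
/*@cc_on
isIE = true; // only IE's JScript engine compiles this line
@*/

A minifier has to treat those comments as significant code, which is presumably where YUI's brace tracking goes wrong.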

Max compression is the right way to go when pre-compressing the files; min compression is the right way to go when dynamically compressing at each request. I've lost the links, but I have seen a couple of pages of timings for gzip compression and decompression at each compression level. Both agreed that level 9 takes about 6 times as long to zip as level 1, but that there is no meaningful difference in unzip times. I confirmed this by serving both a -c1 and a -c9 file off my site; they had the same timings.
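The zip-time difference is easy enough to check yourself with command-line gzip and time:

time gzip -1 -c floatbox.js > /dev/null
time gzip -9 -c floatbox.js > /dev/null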

My previous timings, showing no difference in load times between my gzipped files and the uncompressed ones, are confirmed by a second batch of testing. Here's what I measured (Firebug Net panel):
demo_gzip.html 22KB - 158, 165 & 163 ms
demo_nogzip.html 88KB - 172, 159 & 162 ms

I got very similar timings when I took the html page load with all its graphics out of the equation and just loaded the .js file directly.
floatbox.js (compressed) -c9 22KB - 169, 160 & 151 ms
floatbox.js (compressed) -c1 27KB - 161, 154 & 161 ms
floatbox.js (uncompressed) 88KB - 166, 155 & 158 ms

I've been very glad for your assistance and suggestions on this topic. I know much more about compression than I did two weeks ago. Bottom line for me is that I don't think it is really worth the effort in a broadband world. The acid test would be whether an end user could detect the difference between compressed and uncompressed delivery. I don't think you will find such a user. However, there is one other good advantage to compression. If your hosting provider charges by bytes delivered, or limits the allowed transfer volume, you can drastically reduce the amount of data flowing out from your server if a large portion of it is compressed.
i960
Guest
I get similar timings on my end with your test pages. That is very odd to me. Decompression of the file should happen very quickly and not add much to the response time, and download time should go down as the file gets smaller, not stay the same, even taking broadband into account. I'm wondering if it has something to do with your host. In my case I own my server and have it co-located in downtown Los Angeles; I'm actually in the same datacenter as YouTube. :P I'm curious what kind of results you would get with a huge javascript file... something on the order of 10 times larger. If you still get the same timings between normal and compressed then something is definitely wrong.
