Maybe I just didn't port that bug fix into my copy of Noisen/Wedge.org...? I mean, originally Noisen runs on SMF 2.0 Beta 1 (the September 2007 release), and I applied all subsequent updates manually...
Therein lies the issue. I don't really expect to do much maintenance, even less than with BB.
Then, you may easily compress the files.
Anyone can re-download them from the relevant website (getid3 or libsecsomething) and replace their minified files with the originals...
The comments aren't needed for phpseclib; keeping only the header that retains the copyright and so on, I spent 15 minutes on it and got it down to 280KB or so. Then I just spent time faffing about cleaning the code -- it didn't really change the size, but it did make it tidier. It'll zip fairly well, I think; haven't tested it.
Oh, speaking of zipping... I ran a few tests on a Sep. 15 revision of Wedge.
I didn't test with WinRK (it's not free), but I think it wouldn't be (much) better than PAQ...
Also, please remember that the Wedge package is currently 50% bigger than SMF's -- add 1.5MB for AeMe, 500KB for /other/stupid-sub-folders/*, and the rest is mainly new features, new code etc...
ZIP: 3653KB (by comparison, SMF = 2610KB)
ZIP (best): 3642KB
tar.gz (best): 3132KB (85% of the zip file) (by comparison, SMF = 1941KB, i.e. 74% of the zip file)
RAR (best): 3006KB
tar.bz2 (best): 2734KB (75% of the zip file) (by comparison, SMF = 1598KB, i.e. 61% of the zip file)
7z SFX LZMA (ultra): 2688KB
RAR (solid best): 2464KB
7z SFX PPMd (ultra): 2450KB (67% of the zip file)
ZPAQ: 1983KB
PAQ8o: 1828KB (50% of the zip file)
So... PAQ does twice as well as ZIP, but it's not very realistic to use it (it takes too long to compress and needs too much RAM...), so 7z SFX PPMd (which doesn't require a separate decompression program) would be a good alternative for offering a download to non-techies. (Okay, Mac users may be non-techies as well, but I'm only catering to Windows users for now...)
I'm not exactly sure, though, why bz2 is less efficient on Wedge than on SMF... Do we have that many extra image files in it, for instance...?
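That hunch is easy to sanity-check: already-compressed payloads (PNG/JPEG images, fonts) barely shrink under any algorithm, so the more of them a package contains, the smaller bz2's edge over zip/deflate becomes. A quick Python sketch (the sample "code" string and sizes are just illustrative, not Wedge's actual files):

```python
import bz2
import lzma
import os
import zlib

# Text-like data -- what most of a codebase looks like -- is highly
# redundant and compresses very well with every algorithm.
text = b"function noi_resize() { /* resize the viewport */ }\n" * 200

# Random bytes stand in for already-compressed files such as images:
# no general-purpose compressor can shrink them further.
binary = os.urandom(len(text))

for label, data in (("text", text), ("images", binary)):
    for name, compress in (("zlib", zlib.compress),
                           ("bz2", bz2.compress),
                           ("lzma", lzma.compress)):
        print(f"{label:6} {name:4} {len(compress(data)):6} / {len(data)} bytes")
```

On the text sample all three compressors achieve huge ratios; on the random sample all three produce output at least as large as the input. So a package with proportionally more images flattens the differences between formats.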
Oh, and where are you now, regarding the SMF 2.0.1 patch?
How important is it to keep the download size small vs readable? I'm not actually that fussed myself.
Anything we maintain should be readable. Anything else we use, the smaller the better. We don't want a big download size, not because it's "big", but because it costs money to cover the bandwidth -- although we can probably share the files via Mediafire and so on... Or just extra mirrors... And I'm not sure people would like to download and re-upload a big file 10 times a year just because we release early and often. I know I wouldn't...
(Then maybe we should have some kind of system that automatically sends the updated files to the server.)
- doing on-the-fly minification (i.e. not cached...) on inline JS and CSS. Do you think it's realistic? I'm not sure myself, but I could cook up a simplified minification function, although I wouldn't know how to deal with strings. Hmm. Still, just as an example, if I manually minify the noi_resize() code, I save something like 60 to 80 bytes off the page, and a reasonable ~20 bytes when gzipped. Considering it's found on every single page...
If it's on every single page, why is it inlined? Or, at least, not minified by hand?
It's inlined because I need to execute it ASAP. Waiting for jQuery to run is out of the question. It would save probably a hundred bytes, but would create a very noticeable delay when loading a page that doesn't match the expected viewport width. Then again, onresize() doesn't execute when using the Back button (at least in Opera), so it's not exactly a 'perfect' solution anyway. Doing my best...
Can be compressed manually, yeah. But I figured I'd rather have it done logically -- if one part of the inline code is compressed, then all of it should be compressed. And I'm not sure what to use to minify JS when it comes to something that has to be done on-the-fly for everyone, on every page, instead of done once and cached...
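For what it's worth, the "how to deal with strings" part is a small state machine: copy string literals verbatim, and only strip comments and collapse whitespace outside of them. A Python sketch of the idea (`naive_js_minify` is a hypothetical name; a real Wedge version would be PHP, and this ignores regex literals and semicolon insertion entirely):

```python
def naive_js_minify(code):
    """Very naive JS minifier sketch: strips // and /* */ comments and
    collapses runs of whitespace, but copies the contents of string
    literals untouched. Not production-safe."""
    out = []
    i, n = 0, len(code)
    while i < n:
        c = code[i]
        if c in ('"', "'"):                 # string literal: copy verbatim
            quote = c
            out.append(c)
            i += 1
            while i < n:
                out.append(code[i])
                if code[i] == '\\':         # keep escaped char as-is
                    i += 1
                    if i < n:
                        out.append(code[i])
                elif code[i] == quote:      # closing quote: done
                    i += 1
                    break
                i += 1
            continue
        if code.startswith('//', i):        # line comment: skip to EOL
            while i < n and code[i] != '\n':
                i += 1
            continue
        if code.startswith('/*', i):        # block comment: skip past */
            end = code.find('*/', i + 2)
            i = n if end == -1 else end + 2
            continue
        if c.isspace():                     # collapse whitespace to one space
            while i < n and code[i].isspace():
                i += 1
            if out and out[-1] != ' ':
                out.append(' ')
            continue
        out.append(c)
        i += 1
    return ''.join(out).strip()
```

It won't shave spaces around operators, but it safely removes comments and newlines without mangling `"hello  world"`-style strings, which covers most of the gain on small inline snippets.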
Aren't the timestamps included in the filename? They are for avatars.
No, I don't think they are (for media files). If you update the file with the same file, it gets the same filename.
Which is a bit of a bummer...
Oh, and I'd love us to consider showing preview & full sizes in plain view as well. As long as the filenames can't be 'found out' by analyzing the thumbnail URL...
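The avatar-style scheme is basically just a timestamp suffix: re-uploading a file -- even with identical content -- yields a new filename, so stale browser caches are busted. A sketch (`versioned_name` is a hypothetical helper, not Wedge's actual code):

```python
import time

def versioned_name(filename, uploaded_at=None):
    """Suffix the upload timestamp onto the filename, before the
    extension, so every (re)upload produces a fresh URL."""
    stamp = int(uploaded_at if uploaded_at is not None else time.time())
    stem, dot, ext = filename.rpartition('.')
    if not dot:                      # no extension: just append the stamp
        return f"{filename}_{stamp}"
    return f"{stem}_{stamp}.{ext}"
```

A side benefit: if the full-size filename also gets a component that isn't derivable from the thumbnail URL (a random token rather than a plain timestamp), it can't be 'found out' by analyzing the thumbnail's name.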
I wouldn't worry about YepNope, because you're still doing more work that doesn't -- ultimately -- benefit the user. It just fools a score.
Yeah, I suppose so... And even YSlow doesn't care about jQuery being there. Only Google PageSpeed does. And it would only save at BEST 2% because I get a maximum score of 98%, even with jQuery loaded normally.
- Considering using "FileETag none" in all htaccess files. I'm not a specialist though... But as soon as we have an expiration date, ETags aren't useful at all. I'm not sure if it should be done for all files.
If you have an expiration date, the ETag is still good until that date.
But technically it's not used at all because the browser will no longer request the resource (as long as it caches it, of course.)
Hmm, seems like YSlow doesn't recognize my FileETag directive. Maybe I should also do 'Header unset ETag'... And any removed header means more space for pure data...
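For reference, the combination being described would look something like this in .htaccess (assuming mod_headers and mod_expires are available; the one-year expiry is just an illustrative value):

```apache
# Stop Apache from generating ETags for these files...
FileETag None
# ...and strip any ETag header already set upstream.
Header unset ETag
# With a far-future expiry, the browser won't revalidate at all,
# so the ETag round-trip is redundant anyway.
ExpiresActive On
ExpiresDefault "access plus 1 year"
```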
Does Chrome have an extension to help view request headers?
It's embedded into the HTTP request, which is only sent once for the request,
Phew...
even if the header is split across multiple packets because of lots of extra headers. It's still worth keeping it as small as possible (though, frankly, I don't see how else we could!) simply in the hope of keeping it to as few packets as possible.
Yeah, well, I suppose it's not as scary if the cookie is only sent once. The first packet will be shortened a lot, but not the rest.
Ah, I forgot about that ticket...
Then I guess we agree on this, eheh.
Do you want to implement the change?