I'm really not that enthusiastic about putting a jQuery widget in for something like that. For uploading multiple plugins, maybe, but anything more than that is IMO unnecessary, as this is something I don't want to spend vast amounts of time on, knowing that most users won't upload multiple plugins at once, and probably won't even upload plugins all that often, since downloading straight from a plugin server should handle it better...
Sure.
OTOH, the only argument I have against this -- the plugin server would see a significant bandwidth increase if it can't serve gzipped content...
Well... plugins would still be compressed, and you have to get to a largish package before it makes a big difference. SD 2.0, for example, is 553K as a ZIP vs 404K as a .tar.gz, and mostly that's because it has a lot of files in it, most of which won't compress all that well.
The other thing is preference. The majority of plugin authors are likely to be on Windows, where making a .zip file is easier.
The alternative is forcing .tar.gz across the board, which is generally smaller. Our plugin server could unpack the zip and repack it as .tar.gz on upload. Those running custom servers would just have to deal with it, I guess.
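Something like this would do for the repack step -- a minimal sketch, assuming the ZipArchive and Phar extensions are available on the server (the function name and paths are just illustrative):

[code]
<?php
// Minimal sketch: unpack an uploaded .zip and repack it as .tar.gz.
// Assumes ZipArchive and PharData; names and paths are illustrative.
function repack_zip_as_targz($zip_path, $targz_path)
{
	$work_dir = sys_get_temp_dir() . '/repack_' . uniqid();
	mkdir($work_dir);

	// Unpack the uploaded .zip into a temporary directory.
	$zip = new ZipArchive();
	if ($zip->open($zip_path) !== true)
		return false;
	$zip->extractTo($work_dir);
	$zip->close();

	// Build an uncompressed tar from the tree, then gzip it.
	$tar_path = $targz_path . '.tar'; // PharData wants a .tar name first
	$tar = new PharData($tar_path);
	$tar->buildFromDirectory($work_dir);
	$tar->compress(Phar::GZ); // writes $tar_path . '.gz' alongside
	rename($tar_path . '.gz', $targz_path);
	unlink($tar_path);

	// (Cleaning up $work_dir is omitted for brevity.)
	return true;
}
[/code]

PharData can write tars even with phar.readonly on, so it should work on a stock setup.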
I'd tend to put everything into the same page, but really that's because it's already the case in SMF.
Other than that... I don't really mind either way.
Well, if it remains as one page, it desperately needs renaming. I don't really have a problem with renaming it, just that my first instinct was to make it a separate page, but I'm increasingly agreeing with the view that it doesn't need to be.
It would have to be seen, first, whether anyone creates 'independent' plugin servers that host plugins by multiple users.
Other than that, it'd just be simpler for us to provide RSS feeds on the Wedge plugin server with an author filter, i.e. people could be notified when their favorite authors release new plugins or new versions. Totally missing from the SMF customization site, eh...
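The feed itself wouldn't be much work either -- a minimal sketch, assuming an existing PDO connection in $db and a hypothetical plugins table with author/name/version/updated columns:

[code]
<?php
// Minimal sketch of an author-filtered feed. $db is assumed to be an
// existing PDO connection; the 'plugins' table layout is hypothetical.
header('Content-Type: application/rss+xml; charset=UTF-8');

$author = $_GET['author'];
$stmt = $db->prepare('
	SELECT name, version, updated
	FROM plugins
	WHERE author = ?
	ORDER BY updated DESC
	LIMIT 20');
$stmt->execute(array($author));

echo '<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"><channel>
<title>Plugins by ', htmlspecialchars($author), '</title>';

// One item per release, newest first.
foreach ($stmt as $row)
	echo '
<item>
	<title>', htmlspecialchars($row['name'] . ' ' . $row['version']), '</title>
	<pubDate>', date(DATE_RSS, strtotime($row['updated'])), '</pubDate>
</item>';

echo '
</channel></rss>';
[/code]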
Well, this is where it gets into the technicalities of providing support. I suggested doing it REST-style, but I don't think that's viable in the long run. In fact, I'm sensing it would pretty much have to be a SOAP-type request to the server to cope with everything.
What that ultimately means is that the requester sends an HTTP POST with the body being a block of XML. (Or POST vars. Doesn't really matter. The key point is that you make a POST with one or more variables attached.) Then you get a block of XML back in some form.
SOAP formalises this process, and I don't think we need to go quite that far, but it certainly needs more than a simple URL request, since the process has to account for various kinds of filtering.
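In practice the exchange might look something like this -- a minimal sketch where the URL, the variable names and the shape of the XML response are all made up for illustration:

[code]
<?php
// Minimal sketch of the exchange: POST a few variables, get XML back.
// The URL, variable names and response layout are made up for illustration.
$query = http_build_query(array(
	'action' => 'search',
	'filter_author' => 'some_author',
	'min_version' => '0.1',
));

$context = stream_context_create(array(
	'http' => array(
		'method' => 'POST',
		'header' => 'Content-Type: application/x-www-form-urlencoded',
		'content' => $query,
	),
));

$response = file_get_contents('http://plugins.example.org/query', false, $context);

// The reply is a block of XML describing the matching plugins.
$xml = simplexml_load_string($response);
foreach ($xml->plugin as $plugin)
	echo $plugin->name, ' ', $plugin->version, "\n";
[/code]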
We could add a 'noob-friendly' option to do everything automatically.
If it fails --> manual processing.
I wasn't really going to do anything else. It's not like it's that hard to automate that process, especially since the user will likely have had to provide their FTP/SFTP details at some point. (We could even ask for them ahead of time, so the details are known before even trying to download.)
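The automated path could be about this simple -- a minimal sketch using PHP's FTP extension, with the function name, the $ftp details array and the remote Plugins path all hypothetical:

[code]
<?php
// Minimal sketch of the automatic fallback, using PHP's FTP extension.
// The function name, $ftp details array and remote path are hypothetical.
function install_plugin_via_ftp($local_dir, $ftp)
{
	$conn = ftp_connect($ftp['host'], $ftp['port']);
	if (!$conn || !ftp_login($conn, $ftp['user'], $ftp['pass']))
		return false;

	// Upload every extracted file into the forum's Plugins directory.
	$files = new RecursiveIteratorIterator(
		new RecursiveDirectoryIterator($local_dir, FilesystemIterator::SKIP_DOTS));
	foreach ($files as $file)
	{
		$remote = $ftp['base_path'] . '/' . substr($file->getPathname(), strlen($local_dir) + 1);
		@ftp_mkdir($conn, dirname($remote)); // ignore 'already exists' errors
		ftp_put($conn, $remote, $file->getPathname(), FTP_BINARY);
	}

	ftp_close($conn);
	return true;
}
[/code]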