Old 02-11-2010, 03:24 PM  
Yil
pion: I think I've mentioned this more than once: I don't plan on adding that feature, though others are of course free to do so. Beyond your example of very large .rar files there isn't any reason to allow it, and there are lots of reasons you wouldn't want it. In fact, v7 made such a change less likely because it now protects a newly uploaded file from being downloaded until the zipscript has finished processing it. That protection allows the zipscript to add or strip .nfo files from zips without corruption, and to verify files (for example against the .sfv). Then there is the problem of slow uploaders: you'd be stuck with a very slow transfer from a fast FTP because you're essentially limited by the upload speed of the other user. Don't forget the possibility that the upload is interrupted, leaving the file incomplete. Accepting all of those problems just for the ability to start the transfer sooner doesn't seem like a good trade-off to me.
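To illustrate the slow-uploader point, here is a hypothetical sketch (not ioFTPD code; all names and sizes are invented) of a downloader "tailing" a file that is still being uploaded. No matter how fast the reader's own link is, it stalls whenever it catches up with the writer, so the whole transfer is paced by the uploader:

```python
import os
import tempfile
import threading
import time

CHUNK = 4096
TOTAL_CHUNKS = 5

def slow_upload(path, done):
    """Simulated slow uploader: writes a chunk, then pauses."""
    with open(path, "wb") as f:
        for _ in range(TOTAL_CHUNKS):
            f.write(b"x" * CHUNK)
            f.flush()
            time.sleep(0.05)  # the uploader's link is the bottleneck
    done.set()

def tail_download(path, done):
    """Poll the growing file; count how often the reader is starved."""
    received = 0
    waits = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(CHUNK)
            if data:
                received += len(data)
            elif done.is_set():
                break  # upload finished and we hit EOF
            else:
                waits += 1
                time.sleep(0.01)  # fast reader idling, waiting on uploader
    return received, waits

fd, path = tempfile.mkstemp()
os.close(fd)
done = threading.Event()
t = threading.Thread(target=slow_upload, args=(path, done))
t.start()
time.sleep(0.01)  # let the writer get started
received, waits = tail_download(path, done)
t.join()
os.remove(path)
```

The reader ends up with the complete file, but only after repeatedly stalling on the writer, which is exactly the "limited by the upload speed of the other user" problem described above.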

The final straw, though, is the disk-fragmentation issue. I anticipate changing the upload logic to buffer multiple megabytes at a time so that disk writes are very large, in an attempt to reduce file fragmentation. That would add more complexity, because a downloader would have to check both the file on disk and the write buffer before deciding whether there was new data to download... It's just not worth it.
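The complication can be sketched as follows. This is a hypothetical illustration, not ioFTPD's implementation: the class name, threshold, and fields are all invented. Incoming data is held in a large in-memory buffer and flushed to disk only in big writes (so the filesystem can allocate large contiguous runs), which means a "how many bytes are downloadable" check would have to count both the bytes on disk and the bytes still sitting in the buffer:

```python
import os
import tempfile

FLUSH_THRESHOLD = 4 * 1024 * 1024  # flush in ~4 MB disk writes (illustrative)

class BufferedUploadWriter:
    """Hypothetical upload writer that batches small writes into large ones."""

    def __init__(self, path, threshold=FLUSH_THRESHOLD):
        self.f = open(path, "wb")
        self.buf = bytearray()
        self.threshold = threshold
        self.on_disk = 0  # bytes actually written to the file so far

    def write(self, data):
        self.buf += data
        if len(self.buf) >= self.threshold:
            self.flush()

    def flush(self):
        if self.buf:
            self.f.write(self.buf)  # one large write, fewer fragments
            self.on_disk += len(self.buf)
            self.buf.clear()

    def available(self):
        # The extra complexity: a concurrent downloader can't just stat the
        # file; it must also count the unflushed bytes in the buffer.
        return self.on_disk + len(self.buf)

    def close(self):
        self.flush()
        self.f.close()

# Usage: with a tiny threshold, the first write stays buffered (on disk: 0
# bytes) while available() already reports it; the second write crosses the
# threshold and both are flushed in a single large write.
fd, path = tempfile.mkstemp()
os.close(fd)
w = BufferedUploadWriter(path, threshold=8)
w.write(b"hello")                       # 5 < 8, stays in the buffer
buffered_state = (w.on_disk, w.available())
w.write(b"world!")                      # 11 >= 8, flushed as one write
w.close()
final_size = os.path.getsize(path)
os.remove(path)
```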