This is a problem I've seen in many places, and it seems like good practice (or at least it must have seemed like one at some point) - but to me it causes more harm than good.
Many programs (Internet Explorer for file downloads, certain ZIP utilities, etc.) perform file operations in a temporary folder, then copy the results of the operation to the ultimate destination.
I can see why this makes sense. Depending on the operation, allowing the user to muck with the file during processing can cause data loss, errors, mini-black holes, etc. But when the resulting file is very, very large, the subsequent copy can take forever. Worse, if there isn't enough space at the destination, then instead of failing fast and reporting the allocation error up front, we have to wait until the end of a potentially expensive operation (like unzipping a 12-gig file from an archive) before we find out. In fact, this practice makes such a failure more likely, because the space requirements are doubled just to accommodate the copy.
I'm really just ranting here rather than offering a fix - the only fix I can suggest is to avoid doing this at all. If you must, at least make it an option the user can override, letting them take responsibility for their own file system.
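For what it's worth, there is a middle ground that keeps the safety without the second copy: stage the output in a temp file *in the destination directory* (so it's on the same filesystem), then finish with an atomic rename. The rename costs nothing regardless of file size, and an out-of-space error shows up during the write itself rather than at the end. Here's a rough sketch in Python - `extract_to` and `enough_space` are names I made up for illustration, not any real utility's API:

```python
import os
import shutil
import tempfile

def extract_to(dest_path, chunks):
    """Stage output next to its final destination, then rename.

    Because the temp file lives in the same directory (and thus the
    same filesystem) as the destination, os.replace() is an atomic
    rename rather than a second full copy, and running out of disk
    space fails during the write instead of after the whole job.
    """
    dest_dir = os.path.dirname(os.path.abspath(dest_path))
    fd, tmp_path = tempfile.mkstemp(dir=dest_dir, suffix=".part")
    try:
        with os.fdopen(fd, "wb") as tmp:
            for chunk in chunks:          # chunks: any iterable of bytes
                tmp.write(chunk)
        os.replace(tmp_path, dest_path)   # atomic on the same filesystem
    except BaseException:
        os.unlink(tmp_path)               # don't leave a partial file behind
        raise

def enough_space(dest_dir, needed_bytes):
    """Optional fail-fast check before starting an expensive operation."""
    return shutil.disk_usage(dest_dir).free >= needed_bytes
```

The user still can't see a half-written file under the final name, but the space requirement isn't doubled and a full disk aborts the job early instead of at the finish line.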
*ahem* STOP PROTECTING ME FROM MYSELF.