Do you think that big files (like system images, large archives, and so on) should be fully defragmented on a standard run?
When a big file is split into, say, two large fragments, isn't it sometimes better to leave the file as it is instead of moving one part to join the other? Usually in these cases the file is rarely accessed (e.g., only during restoration), so my guess is that it can be left alone, no?
Please give me your thoughts on this.
And if you agree, couldn't there be a configuration parameter that sets a size threshold above which large files are processed differently?
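To illustrate what I mean, here is a minimal sketch of such a policy. All names and default values are hypothetical, just to make the idea concrete; a real defragmenter would read these from its configuration:

```python
# Hypothetical size-threshold policy: skip big files that are only
# split into a few large pieces, since relocating gigabytes of data
# buys little for files that are rarely read sequentially.
def should_defragment(file_size: int, fragment_count: int,
                      size_threshold: int = 1 * 1024**3,   # 1 GiB (made-up default)
                      max_fragments_for_big_files: int = 4) -> bool:
    if file_size < size_threshold:
        return True  # small files: defragment as usual
    # Big file: only bother when it is badly fragmented.
    return fragment_count > max_fragments_for_big_files

# A 4 GiB image split into just 2 pieces would be left as-is,
# while the same file in 10 pieces would still be processed.
print(should_defragment(4 * 1024**3, 2))   # False
print(should_defragment(4 * 1024**3, 10))  # True
```

The threshold and the fragment-count cutoff could both be exposed as configuration parameters, so users can tune how aggressive the standard run is on large files.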