Continue partial downloads

From MLDonkey

There is some overlap between this page and recover_temp. //Pango

A local partial file can be completed from p2p sources when the corresponding elink is known (or can be guessed). This is most useful when a very large file has been truncated, which is the most common case, but damaged files can also be completed without redownloading the whole file, as long as the corruption has not shifted the data chunks. This may be the case if you downloaded split files from Megaupload, Rapidshare or similar services as a free user and hit the limits they impose if you don't pay their fees.

The main steps to continue a partial download are as follows:

This works fine with the Linux version without any further action.


Extract the file

If the partial file is compressed and the target available on the p2p network is the uncompressed one, the first step is to extract as much data as possible. The extraction process varies from application to application, but basically all of them have an option to keep the partially extracted data instead of deleting the broken file.
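With rar archives, for instance, this step might be sketched as follows (the -kb switch tells unrar to keep broken extracted files; the archive name here is just a placeholder):

```shell
# Extract whatever can be extracted; -kb keeps the broken (partially
# extracted) file instead of deleting it when the archive is truncated.
unrar x -kb backup.part01.rar
```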

Locate the exact elink

When the source of the incomplete file is a compressed archive like rar or zip, it can be useful to note the CRC32 sum and the uncompressed file size, to compare against any potential source and select the best candidate elinks. If the size doesn't match, it is almost certain that you are looking at the wrong elink; if the size matches, there is no guarantee that it is the same file, but it is worth a try. Some elinks also contain CRC32 sums, which is helpful: if the CRC32 matches, you can be almost sure you have found the right file.
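For the comparison, the CRC32 of a local file can be computed with a small helper like this (a sketch using python3's zlib module, since a dedicated crc32 utility is not installed everywhere; crc32_of is just an illustrative name):

```shell
# Print the zip-style CRC32 of a file as 8 uppercase hex digits,
# suitable for comparing against the CRC reported by a rar/zip listing
# or embedded in an elink.
crc32_of() {
    python3 -c '
import sys, zlib
crc = 0
with open(sys.argv[1], "rb") as f:
    for block in iter(lambda: f.read(1 << 16), b""):
        crc = zlib.crc32(block, crc)
print("%08X" % (crc & 0xFFFFFFFF))
' "$1"
}
```

Then `crc32_of movie.avi` prints the sum to compare with the candidate's CRC32.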

Send the elink to the core to create the temp file and file info. Once a source has been found, pause the download to avoid downloading any data before the process is complete.
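In the console this step might look as follows (the elink and the download number 3 are placeholders; vd lists the current downloads and their numbers):

```
dllink ed2k://|file|name.avi|732954624|0123456789ABCDEF0123456789ABCDEF|/
vd
pause 3
```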

Move to the temp directory

Once the potential elink of the partial file has been located, move the partial file (or better, copy it) to the temp directory, overwriting the urn_ed2k_<hash> file created when the elink was sent to the core. With a truncated file the new temp file will be smaller than the original one, but this is not a problem at all.

Check that the new temp file has adequate permissions, and set them if necessary.
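Assuming the core keeps its temp directory under ~/.mldonkey/temp (the path, file names and hash below are all placeholders for your setup), the copy and permission steps might be sketched as:

```shell
TEMP=~/.mldonkey/temp
HASH=0123456789ABCDEF0123456789ABCDEF    # the ed2k hash from the elink
cp recovered.partial.avi "$TEMP/urn_ed2k_$HASH"   # overwrite the stub temp file
chmod 600 "$TEMP/urn_ed2k_$HASH"                  # readable/writable by the core's user
# chown mldonkey: "$TEMP/urn_ed2k_$HASH"          # only if the core runs as another user
```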

Recovering the data

In the console, issue a verify_chunks <num> command, where <num> is the download number reported by vd, and wait for a reply from the core. If the reply is almost instantaneous, there is probably nothing to verify; check that the partial hashes were downloaded from a source before trying again, and if they weren't, unpause the file so it can search for sources and retrieve the needed file info. (Pausing the file is not mandatory, it just avoids potential problems.)
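For example, if vd reports the file as download number 3 (a placeholder):

```
vd
verify_chunks 3
```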

Now the completed chunks that match the partial hashes are verified, ready to upload and to count towards the full file once it is fully downloaded. To recover part of an incomplete chunk, such as the one at the end of a truncated file, the recover_bytes <num> command searches for a gap of 16 (configurable) 0x00 bytes and resumes downloading from that point; however, such a gap may be due to the file being truncated, or the file may genuinely contain those bytes, as avi, iso and many other formats do. If you issue recover_bytes <num> before verify_chunks <num> on such a file, some chunks at the start of the file won't be verified and will be marked as such, but after redownloading those small pieces (generally a few KB) from a p2p source the chunks will verify without problems.
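For such files the safe order is therefore (3 again being a placeholder download number):

```
verify_chunks 3
recover_bytes 3
```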

At this point almost all the data should be reported as already downloaded and verified. If none of it verified, or only a very small part did, there is a high probability that the elink is not the right one, and you should try your luck with another one.
