LordSqueak wrote:it seems to me that the TTH checking could be optimized.
It's been adjusted a couple of times, and I honestly don't notice it on my slowest computer, a P3/866. I suspect that's rather low-end hardware compared to the computers of some of the people complaining, so the aforementioned FAQ and general computer optimization are probably worthwhile for any user.
LordSqueak wrote:A good answer.
but does dc++ as it is now , actualy have any use for this?
does dc++ realy check the "portions" of the file it downloads?
This seems mostly like a feature intended for multisource downloads.
Yes, the leaves and HashData.dat are used. DC++ exchanges the leaves between 0.402+ clients and uses them to check the file. It also serves the leaves up to any client that asks, including ReverseConnect and its multisource clients, which use them to make multisource downloading safe. The CVS version goes a bit further with its use of the leaves.
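To make "checking portions of the file" concrete, here's a minimal sketch of a THEX-style hash tree, of which TTH is one instance: the file is split into blocks, each block gets a leaf hash, and leaves are paired up into a single root. Python's standard library has no Tiger hash, so SHA-256 stands in; the 0x00 leaf / 0x01 node prefixes and the 1024-byte base block follow the THEX convention, but this is an illustration, not DC++'s actual implementation.

```python
import hashlib

BLOCK_SIZE = 1024  # THEX base block size; real trees prune to fewer leaves for big files

def leaf_hash(block: bytes) -> bytes:
    # THEX prefixes leaf data with 0x00 before hashing
    return hashlib.sha256(b"\x00" + block).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    # internal nodes are prefixed with 0x01
    return hashlib.sha256(b"\x01" + left + right).digest()

def hash_file_blocks(data: bytes):
    """One leaf hash per BLOCK_SIZE chunk of the file."""
    return [leaf_hash(data[i:i + BLOCK_SIZE])
            for i in range(0, len(data), BLOCK_SIZE)]

def tree_root(leaves):
    """Combine leaves pairwise until a single root remains."""
    level = list(leaves)
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            if i + 1 < len(level):
                nxt.append(node_hash(level[i], level[i + 1]))
            else:
                nxt.append(level[i])  # odd node is promoted unchanged
        level = nxt
    return level[0]
```

The point of exchanging the leaves, rather than just the root, is that a peer can verify each downloaded block against its leaf as it arrives, instead of discovering corruption only after the whole file is complete.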
LordSqueak wrote:again , a good answer.
I would suggest a feature in dc++ to rename files/dirs , that way dc++ would know its the same file without having to rehash.
That's a fine suggestion. Once there is a "My library" window, that will certainly be an option, and should let DC++ avoid rehashing the files - assuming they weren't moved between drives (which is possible even if the drive letter doesn't change, thanks to NTFS junctions and mount points).
LordSqueak wrote:its not like it going to destroy the community if i went and downloaded , for example ,,, 2 mp3's with different tags.
infact , the "community" isnt bothered by this at all , atleast not until i decide to share the file.
It was my impression that this problem was largely about video files. MP3 tags can drastically change the offsets of the data inside the file, especially ID3v2 tags - they sit at the beginning of the file, and many programs change the amount of padding.
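As an illustration of why re-tagged MP3s hash differently even when the audio is identical: the audio data starts right after the ID3v2 tag, whose size is stored as a 28-bit "synchsafe" integer (7 bits per byte) in the 10-byte tag header. A sketch, assuming a plain v2.3-style header (the function name is mine; extended headers, footers, and trailing ID3v1 tags are ignored here):

```python
def id3v2_data_offset(header: bytes) -> int:
    """Return the byte offset where audio data starts, or 0 if no ID3v2 tag.

    The tag size in bytes 6-9 is synchsafe: the high bit of each byte is
    clear, so only 7 bits per byte carry the value.
    """
    if len(header) < 10 or header[:3] != b"ID3":
        return 0
    size = 0
    for b in header[6:10]:
        size = (size << 7) | (b & 0x7F)
    return 10 + size  # 10-byte header + tag body (padding included)
```

Change the padding, and every block of audio shifts to a different offset, so every block hash - and the TTH root - changes with it.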
Video files, on the other hand, are nearly always identical in length, but can be corrupted anywhere. If you look through the forum archives, there have been a number of threads about intelligent file repair - which is real work, as I've said, but may eventually be implemented.
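The leaves already contain everything such a repair feature would need: compare the local file's per-block hashes against a trusted set of leaves and re-download only the blocks that disagree. A rough sketch, under my own assumptions (SHA-256 in place of Tiger, an arbitrary 64 KiB block size, and trusted leaves computed at that same granularity - none of this is DC++'s actual code):

```python
import hashlib

CHUNK = 65536  # verification granularity; an assumption, not DC++'s real value

def chunk_hashes(data: bytes):
    """Hash each CHUNK-sized piece of the file (SHA-256 stands in for Tiger)."""
    return [hashlib.sha256(data[i:i + CHUNK]).digest()
            for i in range(0, len(data), CHUNK)]

def corrupted_chunks(local: bytes, trusted):
    """Indices of chunks whose hash disagrees with the trusted leaf hashes."""
    return [i for i, h in enumerate(chunk_hashes(local)) if h != trusted[i]]
```

A client could then request just those byte ranges from its sources instead of discarding the whole file.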
LordSqueak wrote:How about an setting in configuration that lets you disable the tth blocking.
It is true that TTH checking "reduces" (under a rather loose definition) the number of sources you can have for a file - but only because those extra sources aren't actually the same file. Disabling it wouldn't help file integrity, and file integrity is pretty important: without quality files, what user wants to use any file-sharing network (see Kazaa)?
LordSqueak wrote:( suggesting that someone should read the hashdata.dat is like suggesting that someone should read the phonecatalog when they lament that they are calling a busy number.)
I think Ullner suggested you look at it (and HashData.xml) so you could form your own picture of how it works. He's curious, and likes trying to discover things on his own. It was a helpful suggestion, not an "OMG, here is answer, STFU idiot" answer.
LordSqueak wrote:So please , if the fanboys doesnt have anything useful to say , dont say anything at all. dont ruin this thread.
Er... yeah. Well, I don't think there has been any "fanboy"ism in this thread. If you keep the discussion highbrow enough, they don't jump in, and the very same people post thoughtful responses. Trying to ward them off this way is just rude.