Reverse Connect 0.305b Released




GargoyleMT
DC++ Contributor
Posts: 3212
Joined: 2003-01-07 21:46
Location: .pa.us

Post by GargoyleMT » 2003-12-17 11:18

It'll be interesting to see how the ED2K hashes compete with the more widespread BCDC TTH hashes.

Twink
Posts: 436
Joined: 2003-03-31 23:31
Location: New Zealand

Post by Twink » 2003-12-18 05:17

Especially since cologic likes to point out flaws in the ED2K hashes.

GargoyleMT
DC++ Contributor
Posts: 3212
Joined: 2003-01-07 21:46
Location: .pa.us

Post by GargoyleMT » 2003-12-22 13:16

Twink wrote:Especially since cologic likes to point out flaws in the ED2K hashes.


True, there are known weaknesses in MD4, with published examples of exploits. Cologic was looking into whether those examples could be used to create intentional hash collisions.

And the DC network really needs only one hash standard, and I think TTH is a good choice.

(Plus, the 9 MB segments that the eDonkey hash uses are horribly inefficient should a segment be corrupted. With BCDC's hash system, I think the maximum block size to re-get after an incremental hash failure should be 256 kilobytes.)
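As a rough back-of-the-envelope sketch (Python, not from any client's code) of what that difference means when a single segment fails verification, using the approximate ~9 MB figure from the post above:

```python
# Bytes that must be re-downloaded when one hash segment fails verification:
# eDonkey's ~9 MB parts vs. a 256 KB tree-hash block.
# The 9 MB value is the round figure used in the post, not ed2k's exact part size.
ED2K_PART = 9 * 1024 * 1024   # eDonkey2000 part size (approximate)
TTH_BLOCK = 256 * 1024        # BCDC's minimum tree-hash block

# Even a single corrupted byte forces re-fetching the whole segment,
# so the re-download cost ratio is simply the size ratio:
print(ED2K_PART // TTH_BLOCK)  # 36: an ed2k part costs 36x more to re-fetch
```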

liny
Posts: 30
Joined: 2003-11-01 09:18

Post by liny » 2003-12-22 22:17

GargoyleMT wrote:With BCDC's hash system, I think the maximum block size to re-get after an incremental hash failure should be 256 kilobytes.


How do you know that?
I'm completely new to TTH, and I can't read BCDC's code very well.
TTH = Tiger hash, right?
Could you give me a tutorial link about it?
BTW, how many files have been hashed with TTH? And is there a verified-link site for TTH, like ShareReactor?
It isn't very hard to switch from one hash to another, and I'd like to use the best one.

(enable bbcode if you want quotes to work - GargoyleMT)

GargoyleMT
DC++ Contributor
Posts: 3212
Joined: 2003-01-07 21:46
Location: .pa.us

Post by GargoyleMT » 2003-12-22 23:44

liny wrote:I'm completely new to TTH, and I can't read BCDC's code very well.
TTH = Tiger hash, right?

Well, this document explains a lot:
http://www.open-content.net/specs/draft ... ex-02.html
(I believe it includes test vectors / files for verifying your implementation)
Tiger hashing code is also in Bitzi.com's bitcollider:
http://sourceforge.net/projects/bitcollider/

BTW, how many files have been hashed with TTH? And is there a verified-link site for TTH, like ShareReactor?
It isn't very hard to switch from one hash to another, and I'd like to use the best one.


No sites have raw TTH links. However, magnet links can include a bitprint, which is a SHA-1 hash, a period (.), and then a TTH root hash. I'm not sure what site, if any, exports these.
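For illustration, a bitprint magnet link has the following general shape. The digest strings below are made-up placeholders of the right base32 flavor, not real hashes of any file:

```python
# Sketch of the bitprint form of a magnet link: a base32 SHA-1 digest and a
# base32 TTH root joined by a period inside the "xt" (exact topic) parameter.
# Both values here are fabricated placeholders, not real digests.
sha1_b32 = "QLFYWY2RI5WZCTEO6QMLYM6N3E4FVAVT"        # placeholder SHA-1 (32 chars)
tth_b32 = "PZMRYHGY6LTBEH63ZWAHDORHSYTLO4LEFUIKHWY"  # placeholder TTH root (39 chars)

magnet = f"magnet:?xt=urn:bitprint:{sha1_b32}.{tth_b32}"
print(magnet)
```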

Cologic, sandos, and sedulus have a lot of expertise in this area; they're usually in the Dev hub (and have accounts here), so you can pick their brains about hashing in BCDC, external site integration, hash exchange, etc.

Anything to move away from ed2k hashes... and to make sure that file searches remain as unpolluted as possible. (There really should be only one type of hash searched for on DC... searching by ed2k hash, SHA-1, TTH, sig2dat, etc. would be horrible.)

liny
Posts: 30
Joined: 2003-11-01 09:18

Post by liny » 2003-12-23 04:17

Thank you, I'll study it.

GargoyleMT
DC++ Contributor
Posts: 3212
Joined: 2003-01-07 21:46
Location: .pa.us

Post by GargoyleMT » 2003-12-23 08:35

TTH hashes are also used in the Gnutella network, FYI.

Share & Enjoy!

sandos
Posts: 186
Joined: 2003-01-05 10:16
Contact:

Post by sandos » 2003-12-25 20:03

GargoyleMT wrote:With BCDC's hash system, I think the maximum block size to re-get after an incremental hash failure should be 256 kilobytes.


The 256 KB segment size is actually the minimum segment size BCDC currently uses.

The segment size also depends on the file size, to avoid overly large trees for large files. Nine levels (used in BCDC) give at most 512 leaf hashes, i.e. a 512 MB file will have 1 MB blocks.

Remember, though, that both limits are at the *uploading* client's discretion; BCDC will use finer-grained control if it can get hold of such a hash tree.
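A minimal sketch of the block-size choice sandos describes, assuming the limits given in this thread (256 KB minimum segment, at most 2^9 = 512 leaf hashes, block sizes doubling from the minimum). This is an illustration, not BCDC's actual code:

```python
# Pick a hash-tree block size for a file: start at the 256 KB minimum and
# double it until the whole file fits into at most 512 leaf blocks.
MIN_SEGMENT = 256 * 1024   # BCDC's minimum segment size (from this thread)
MAX_LEAVES = 512           # 9 tree levels below the root: 2**9 leaves

def block_size(file_size: int) -> int:
    size = MIN_SEGMENT
    while size * MAX_LEAVES < file_size:
        size *= 2  # keep blocks at power-of-two multiples of the minimum
    return size

# A 512 MB file ends up with 1 MB blocks, matching sandos' example:
print(block_size(512 * 1024 * 1024) // 1024)  # 1024 (KB)
```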
