[p2p-hackers] HTTP design flawed due to lack of understanding of TCP

Gregory P. Smith greg at electricrain.com
Wed Jan 10 20:51:46 EST 2007


> BitTorrent makes the process of where to put files far more
> cumbersome than should be necessary, though.  There's really no reason to
> create "torrents" apart from plain old files.  All you need is a hash of the
> file and a service for looking up duplicates.  This is again an issue of
> protocol layering.

You need a secure hash of each individual chunk of the file that you
intend to download as a unit from a given source (hashes over smaller
regions that combine to cover the same data work just as well).  That
is effectively all a .torrent is.
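
A rough sketch of that in Python (the piece size here is made up, and
a real .torrent also carries tracker URLs, file metadata and bencoding
that I'm leaving out):

    import hashlib

    PIECE_SIZE = 256 * 1024  # hypothetical; real torrents pick their own

    def piece_hashes(path):
        """SHA-1 digest of each fixed-size piece of the file.

        Concatenated, these are essentially the 'pieces' field of a
        .torrent's info dictionary.
        """
        hashes = []
        with open(path, 'rb') as f:
            while True:
                piece = f.read(PIECE_SIZE)
                if not piece:
                    break
                hashes.append(hashlib.sha1(piece).digest())
        return hashes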

Why?  Because computers suck and that will -never- change.  Most have
non-ECC RAM and many still use parallel ATA hard drives, both of which
are prone to passing on undetected bit errors.  On top of that, crappy
consumer network equipment can corrupt data even further.  This is why,
if you run BitTorrent, you'll see it receive pieces of the file that
fail the hash check and redownload them from somewhere else.
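
Roughly what that receive path looks like, sketched in Python --
fetch_piece is a made-up stand-in for the actual wire protocol:

    import hashlib

    def fetch_verified_piece(index, expected_sha1, peers, fetch_piece):
        """Try peers until a piece arrives whose SHA-1 matches.

        A bad RAM stick, disk or NIC anywhere along the path then
        costs one retry from a different peer instead of silently
        corrupting the download.
        """
        for peer in peers:
            data = fetch_piece(peer, index)  # hypothetical transport call
            if hashlib.sha1(data).digest() == expected_sha1:
                return data
            # hash check failed -- throw it away, redownload elsewhere
        raise IOError("no peer supplied a good copy of piece %d" % index)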

A distributed download protocol that does not do integrity checking at
or near the transferred chunk size is doomed to failure.  If all you
have is the overall file hash, how do you determine which part of the
thing you received from 50,000 peers is bad?  rsync ain't gonna cut
it.
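
To make the point concrete, a toy Python comparison of the two
verification models (the names are mine, not any real client's):

    import hashlib

    def bad_pieces(pieces, expected_hashes):
        """Per-piece hashes localize damage to specific pieces."""
        return [i for i, (piece, h) in enumerate(zip(pieces, expected_hashes))
                if hashlib.sha1(piece).digest() != h]

    def file_is_good(pieces, expected_file_sha1):
        """A whole-file hash gives one yes/no for the entire download.

        A mismatch tells you nothing about which piece -- or which of
        the 50,000 peers -- handed you the bad bytes.
        """
        whole = hashlib.sha1()
        for piece in pieces:
            whole.update(piece)
        return whole.digest() == expected_file_sha1

With the first you re-fetch only the pieces bad_pieces() flags; with
the second your only recourse is to start over.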

-greg

