minimum download speed
Moderator: Moderators
-
- Posts: 2
- Joined: 2003-06-15 20:27
minimum download speed
i think it would be nice to have a feature that if one of my downloads goes below a certain amount (say 5 kB/s), the program removes that user from my queue and starts downloading from another user.
this, paired with the auto-search-for-alternates feature already implemented, would be really nice. : )
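The rule being suggested can be sketched in a few lines. This is just the idea in Python, not DC++ code; the data model and names are hypothetical.

```python
# Minimal sketch of the suggested rule: drop any source whose transfer
# speed falls below a user-set threshold, so the client can search for
# an alternate. Purely illustrative -- not DC++'s actual queue code.

MIN_SPEED = 5 * 1024  # bytes/sec, i.e. the "say 5 kB/s" threshold

def prune_slow_sources(downloads):
    """downloads: list of (source_nick, speed_bytes_per_sec).
    Returns (kept, dropped); dropped sources would trigger the
    existing auto-search-for-alternates feature."""
    kept, dropped = [], []
    for nick, speed in downloads:
        (dropped if speed < MIN_SPEED else kept).append(nick)
    return kept, dropped
```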
I agree with the suggestion, this feature would be very useful for me too.
I wonder if it would be feasible to add a timer to that, so that if a download stays below a certain speed for a certain number of minutes, the source I am downloading from is removed from my queue.
Depending on how stable the connection is, I could specify a longer or shorter duration for the timer, and depending on my connection speed I could specify the minimum download speed I am comfortable with.
I would also like to congratulate the developers for this amazing piece of software. I really like DC++, its features, and the upgrades that were included in the new versions. You are all doing a great job.
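The grace-timer idea above can be sketched like this: only drop a source once its speed has stayed below the threshold for a continuous stretch of time. Class and parameter names are illustrative, not DC++ internals.

```python
# Sketch of the grace-timer idea: a dip below the minimum speed only
# counts once it has lasted the whole grace period; any recovery
# resets the clock. Hypothetical names, not DC++ code.

class SlowSourceTimer:
    def __init__(self, min_speed, grace_seconds):
        self.min_speed = min_speed   # bytes/sec the user is comfortable with
        self.grace = grace_seconds   # how long a flaky connection is tolerated
        self.below_since = None      # when the speed first dipped, if at all

    def update(self, speed, now):
        """Feed in a speed sample; returns True when the source should be dropped."""
        if speed >= self.min_speed:
            self.below_since = None  # speed recovered: reset the timer
            return False
        if self.below_since is None:
            self.below_since = now   # dip just started
        return (now - self.below_since) >= self.grace
```

A user on an unstable connection would pick a longer grace period; a user on a fast line might pick a higher minimum speed.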
My apologies, I replied to the wrong thread.
My reply should have gone in this thread instead:
http://dcplusplus.sourceforge.net/forum ... php?t=3204
-
- Forum Moderator
- Posts: 1420
- Joined: 2003-04-22 14:37
Re: minimum download speed
beyonder334 wrote: i think it would be nice to have a feature that if one of my downloads goes below a certain amount (say 5 kB/s) then the program removes that user from my queue and starts downloading from another user.
Might be risky. Any time you close a connection (and slot), you risk not being able to get another slot for a long time.
If it is a rare file with only one or two sources, this could be a big problem.
At the moment I would prefer to manually decide what connection should stay and what should go. Actually, most of the time I don't mind if the connection is slow, as I have DC++ running in the background and will get the file eventually, regardless of the speed.
-
- Posts: 2
- Joined: 2003-06-15 20:27
-
- Forum Moderator
- Posts: 1420
- Joined: 2003-04-22 14:37
beyonder334 wrote: it would be an optional option.. like most of those download settings are now. ;p
I got that. I meant it might be risky when it is turned on.
beyonder334 wrote: could possibly make it so it only does it IF there are some other people in the queue for the file.
I guess so. But what if all the sources you have are below your threshold? This feature, plus rollback consistency checking, means you will probably never get the file.
There is a way to avoid the risk, but it's complicated and very hard to implement. The risk is that you can't find another source, or that the new one will be slower or the same speed, so you've just wasted time in the search that could have been spent downloading. However, it is possible to eliminate the risk completely.
What you'd have to do is let the client start a second download of the same file in the background (into memory, perhaps, or at least a temporary file) and let it run a minute or so to see if it's actually going to hold its speed (most of mine start at one end of the range and usually end up at the other by the end). When the client sees that the other download looks likely to hold a faster speed, it would stop the current download, roll the file back to the previous position, then append the cached download data where it belongs.
It sounds crazy, but such a system can work; it would just require more power as the speed goes up (though I'm thinking that with any connection short of a REAL T3 -- not just one of those that routinely upload at <500 B/s through DC -- it would be OK). And of course such a feature would be not only fully optional, but off by default. It might also be wise to reserve a certain amount of disk space or memory, depending on where the cache is kept. Rather than trying to figure out exactly how much needs to be reserved, simply ask the user what they are willing to reserve, with perhaps a default of something like 50 MB.
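The decision step in the scheme above (ignoring the caching and rollback machinery) could look something like this. Everything here is an assumption for illustration: the settling period, the sampling, and the function name are all made up.

```python
# Rough sketch of the background-test decision: run a trial download of
# the same file from a candidate source, skip the initial burst, and
# switch only if the sustained rate beats the current connection.
# Hypothetical, not DC++ code.

def should_switch(current_speed, candidate_samples, settle=3):
    """candidate_samples: per-second speed readings (bytes/sec) from the
    trial download. Ignore the first `settle` samples (the initial burst
    where downloads start fast), then compare the sustained average
    against the current connection's speed."""
    sustained = candidate_samples[settle:]
    if not sustained:
        return False  # not enough data yet to be sure
    return sum(sustained) / len(sustained) > current_speed
```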
-
- Forum Moderator
- Posts: 1420
- Joined: 2003-04-22 14:37
Would these "tests" take up the source's upload slots?
How much data would you have to download before deciding whether a source is "good"? Is it measured by the byte? The second? Some other measure? Multiply this value by the number of online sources (easily more than ten) and you get a lot of wasted bandwidth and skewed results.
Imagine I am downloading a file from a slow source. I find another source which is fast but I don't know that yet. I "test" their connection. But ten (or twenty or more ...) other people are also testing the connection. The bandwidth is divided between all the users and I end up thinking it's another slow source so I ignore it.
Testing connection speeds seems pointless to me. The only way they would be any good indication is if you had exclusive access to the source.
Of course they would take up a slot. They would also not take up a slot about a minute later.
And it's not the number of bytes downloaded that decides; you'd have to give it a certain amount of time to see whether the speed is going to go up or down. A person who can download 20 KB/s will get more bytes in the same amount of time than a person who can only download 3 KB/s, but in both cases one source might still send faster than another over the same period; the person getting 3 KB/s will simply see lower speeds across the board. Speeds vary over time, not over bytes. Also, perhaps a whole minute isn't necessary, but it would have to allow at least enough time to be sure, since there is almost always an initial burst where the download starts really fast and then slows down considerably.
You decide a source is good if it's uploading faster.
You assume that people will be granted a free slot to test connections when you say that 20 other people testing the same connection would make the source look unreliable. That's not true; such a thing screams "please abuse me and make a client that pretends to be testing but gets the whole file for free!" The test would take a normal slot and have to compete for it just like everything and everyone else. If the person HAS 20 open slots and 20 people test, then you may as well assume the speed you end up with as one of those 20 is what you'll get anyway, since it's only a matter of time before those slots fill up. In fact, the only way 20 people testing at once COULD make a difference is if the test used a ping; then someone would be hit by 20 pings at once, slowing their response to each. This method is basically no different from the workaround that I and most others I know use, which is to download multiple copies of the same file separately, wasting space and bandwidth. THAT costs more slots and bandwidth overall than a test method that discards a connection as soon as enough time has passed to be relatively sure.
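The point that speed varies over time rather than over bytes suggests measuring with a sliding time window instead of a byte count. A toy version of such a meter (all names and the window length are made up for illustration):

```python
# Toy sliding-window speed meter: averages bytes received over the last
# N seconds, so the initial burst ages out of the measurement instead
# of inflating it forever. Not DC++ code.

from collections import deque

class SpeedMeter:
    def __init__(self, window_seconds=10):
        self.window = window_seconds
        self.samples = deque()  # (timestamp, bytes_received_since_last_sample)

    def add(self, now, nbytes):
        self.samples.append((now, nbytes))
        # Drop samples older than the window.
        while self.samples and now - self.samples[0][0] > self.window:
            self.samples.popleft()

    def speed(self):
        """Average bytes/sec over the current window."""
        if len(self.samples) < 2:
            return 0.0
        span = self.samples[-1][0] - self.samples[0][0]
        total = sum(n for _, n in self.samples)
        return total / span if span > 0 else 0.0
```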
-
- Posts: 19
- Joined: 2003-05-06 22:00
well i got yelled at for making a new post because i couldn't find this one, but my idea is a checkbox next to each file with a speed box in kB/s for each file. that way you can decide whether you want the feature on, and you'd be able to turn it off for rare files. basically a per-file minimum speed before a remove-from-queue.
Build a Man a Fire, and he will be Warm for a day.
Set a Man on Fire, and he will be Warm for the Rest of his Life.
-
- DC++ Contributor
- Posts: 3212
- Joined: 2003-01-07 21:46
- Location: .pa.us
It's on the official feature request list:
[ 669593 ] Download from the best, change when below a certain speed
-
- Forum Moderator
- Posts: 1420
- Joined: 2003-04-22 14:37
-
- Posts: 19
- Joined: 2003-05-06 22:00
a minimum upload speed cutoff doesn't work, because if you have a very rare file that someone really wants and you're sending it at 0.5 kB/s, they're willing to wait for it; if your upload minimum cuts them off, they're going to get hella pissed off really fast. just leave your upload slots alone and let other people decide how slow is too slow for their files.
Build a Man a Fire, and he will be Warm for a day.
Set a Man on Fire, and he will be Warm for the Rest of his Life.
Don't agree. If there are a lot of people with slow speeds, they cannot reshare the file quickly to others. Better to upload to people who share fast with others; the slow downloaders can get it from someone else later. They are like pedestrians on a racetrack: only in the way. Another idea is to have a queue and a round-robin for users who grab a lot. For example, after 1 GB they would be kicked if there are people in the queue. That way people share the parts they have already downloaded.
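The round-robin idea above could be sketched as follows. The 1 GB quota is the figure from the suggestion; the function and its arguments are hypothetical.

```python
# Sketch of the round-robin slot rotation: once the active uploader has
# received their quota, they go to the back of the line if anyone else
# is waiting. Purely illustrative, not DC++ code.

QUOTA = 1 * 1024**3  # 1 GB per turn, as suggested above

def maybe_rotate(active_user, uploaded_bytes, waiting_queue):
    """Returns the user who should hold the upload slot next.
    If the active user has used up their quota and others are waiting,
    they are appended to the back of the queue and the head takes over."""
    if uploaded_bytes >= QUOTA and waiting_queue:
        waiting_queue.append(active_user)
        return waiting_queue.pop(0)
    return active_user
```

This also gives partial downloaders an incentive to stay connected and reshare the chunks they already have.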
-
- DC++ Contributor
- Posts: 3212
- Joined: 2003-01-07 21:46
- Location: .pa.us
Smirnof100 wrote: ...they're willing to wait for it and your upload minimum is cutting them off, they're going to get hella pissed off really fast.
Agreed, and this is why cutting off slow uploads will never happen in DC++. (That's what the auto-open new slot is for.)
As for disconnecting slow downloads, only a stupid scheme would disconnect a file for which there is only one online source.
-
- Posts: 1
- Joined: 2004-01-17 17:11
- Contact:
maybe we can add a new priority level that gives the option to either disconnect and try to find new connections (for common files) or keep the slow speed (for rare and hard-to-find files). i'm not sure how much this might lag the computer, but it's worth a try.
and please make a feature that disconnects downloads stuck at 0 kB/s; sometimes a download just sits there, waiting for me to disconnect it manually.
QUALITY!!!! not QUANTITY
With the new system, the client downloads filelists for the autosearch feature. During that process, an upload speed for that user is established. The faster uploaders could have a higher priority...and the slower ones, well, you know.
The problem with this is that some filelists are so small that the speed is deceiving.
The reason I suggest this is because I do this manually, and it works very well. I am ok with the manual operation, but a similar method could be used for those who long for a telepathic interface.
Hehe.