Search found 10 matches

by dpm
2004-04-30 13:07
Forum: Hubs and scripts
Topic: need help with direct connect hub versions.
Replies: 4
Views: 2577

I myself use the original hub version, but only because of the antiquated scripts I use. I'd say the original hub is much easier to find and write scripts for, but it leaks memory like crazy (it's currently using 35 MB of RAM and 144 MB virtual). Both should work fine. For your IP address, put in your IP ad...
by dpm
2004-04-29 03:21
Forum: Feature Discussion (Archived)
Topic: Some ideas for next dc++ ver if possible
Replies: 8
Views: 4089

Bandwidth limiting can be abused greatly. There is no way I would ever support this feature, especially if implemented by the client that most people use on my hub.
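
For reference, client-side bandwidth limiting is typically a token-bucket throttle on the upload path. A minimal sketch of the idea (hypothetical code, not anything from DC++ itself):

[code]
#include <algorithm>
#include <chrono>
#include <cstddef>

// Token-bucket throttle: refills at `rate` bytes/sec up to `burst` bytes,
// and tells the caller how many bytes it may send right now.
class TokenBucket {
public:
    using Clock = std::chrono::steady_clock;

    TokenBucket(double rateBytesPerSec, double burstBytes)
        : rate_(rateBytesPerSec), burst_(burstBytes),
          tokens_(burstBytes), last_(Clock::now()) {}

    // Returns how many of `want` bytes the caller may send; the socket
    // loop sleeps and retries when this comes back small.
    std::size_t take(std::size_t want) {
        auto now = Clock::now();
        tokens_ = std::min(burst_, tokens_ +
                  std::chrono::duration<double>(now - last_).count() * rate_);
        last_ = now;
        std::size_t grant = std::min(want, static_cast<std::size_t>(tokens_));
        tokens_ -= static_cast<double>(grant);
        return grant;
    }

private:
    double rate_, burst_, tokens_;
    Clock::time_point last_;
};
[/code]

The abuse dpm is worried about falls straight out of the design: set rate_ to a crawl and you still hold an upload slot while giving back almost nothing.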
by dpm
2004-04-27 20:15
Forum: Feature Discussion (Archived)
Topic: Suggestion: Confirm when deleting the whole download queue
Replies: 6
Views: 2846

I wouldn't use it though...

Joakim made a comment in the thread: there'd be one dialog per delete action, not per file. Plus, the dialog could be coded like those in SmartFTP, with a "never ask again" checkbox. Then if users disabled the prompt and later deleted their queue, at least they'd been warned. As long as there is an "un-...
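
The suggested flow is just a gated confirmation. A rough sketch of the logic, with a console prompt standing in for the real dialog (hypothetical names, not DC++'s actual settings API):

[code]
#include <cstddef>
#include <iostream>
#include <string>

// Hypothetical persisted setting; true until the user ticks "never ask again".
struct Settings {
    bool confirmQueueRemoval = true;
};

// One prompt per delete action, not per file, as Joakim suggested.
// Returns true if the whole download queue should be removed.
bool confirmRemoveQueue(Settings& s, std::size_t fileCount) {
    if (!s.confirmQueueRemoval)
        return true;  // user opted out of the warning earlier

    std::cout << "Remove all " << fileCount << " queued files? [y/N] "
                 "(append '!' for never ask again): ";
    std::string answer;
    std::getline(std::cin, answer);

    if (!answer.empty() && answer.back() == '!') {
        s.confirmQueueRemoval = false;  // the "never ask again" checkbox
        answer.pop_back();
    }
    return answer == "y" || answer == "Y";
}
[/code]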
by dpm
2004-04-27 12:38
Forum: Feature Discussion (Archived)
Topic: Speed I really can provide
Replies: 9
Views: 4111

That's a pretty big if...

This would never work. I can provide speeds of 800k/s+ to users on the hub that I run on my local university network. IF I were to connect to other hubs, CIS has this nice 'feature' called 'packet shaping,' so I'd be uploading/downloading to public hubs at a measly 2k/s. Maybe we need an option that s...
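
(For scale, assuming those figures: at 2k/s a single 700 MB file takes 700 × 1024 / 2 ≈ 358,000 seconds, roughly four days, versus about 15 minutes at 800k/s.)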
by dpm
2004-03-31 02:19
Forum: Feature Discussion (Archived)
Topic: Hashing slow down computer too much
Replies: 37
Views: 14582

Ziisas wrote: DPM, you have a whopping 80 processes running and you're complaining about high CPU usage? I don't think I could even find 80 tasks to run on my system... (unless I'd just randomly start up programs unnecessarily)
It's easier than you think. There are about 20-30 default Windows processes. DC...
by dpm
2004-03-31 02:05
Forum: Feature Discussion (Archived)
Topic: Hashing slow down computer too much
Replies: 37
Views: 14582

dpm, hashing might be going very slow for you because you have all those other processes taking up CPU time, and since the hashing thread priority for DC++ is set to low, all other processes running on the system get priority over it. See what happens if you close everything else and then run DC++.
It's n...
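
For reference, dropping a worker thread's priority the way that quote describes is a one-liner on Win32; a minimal generic sketch (not DC++'s actual hasher):

[code]
#include <windows.h>

// Stand-in for the CPU-heavy hashing loop.
DWORD WINAPI hashWorker(LPVOID) {
    // ... read files and hash them ...
    return 0;
}

int main() {
    HANDLE h = CreateThread(nullptr, 0, hashWorker, nullptr, 0, nullptr);
    if (h != nullptr) {
        // At below-normal (or idle) priority, every other runnable thread
        // on the box gets the CPU first: exactly the behavior described.
        SetThreadPriority(h, THREAD_PRIORITY_BELOW_NORMAL);
        WaitForSingleObject(h, INFINITE);
        CloseHandle(h);
    }
    return 0;
}
[/code]

That is also why closing everything else speeds hashing up: a low-priority thread only gets whatever CPU time is left over.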
by dpm
2004-03-31 01:35
Forum: Feature Discussion (Archived)
Topic: Hashing slow down computer too much
Replies: 37
Views: 14582

So far, since 9pm, it appears to have hashed all of 21 GB.

This is assuming it hashes alphabetically through my share, recursively, which would lead it to start at Download/250GB/Anime/Chobits/

And ack, there seems to be no way to upload a file.

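(If the timestamps line up, 21:00 to the 01:35 post time is about 4.6 hours, so 21 GB works out to roughly 4.6 GB hashed per hour.)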
by dpm
2004-03-31 01:01
Forum: Feature Discussion (Archived)
Topic: Hashing slow down computer too much
Replies: 37
Views: 14582

cologic wrote: I run a DC hub, Apache/mod_python, MySQL, all the normal programs one uses, etc.

And, oh, wait, it works perfectly.
I don't call 8 GB an hour, consuming 30% of the CPU (about 1 GHz of processing power), "fine".
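
(Worked out: 8 GB an hour is about 8192 MB / 3600 s ≈ 2.3 MB/s of hashing throughput, so 1 GHz of CPU spent on it comes to over 400 cycles per byte.)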
by dpm
2004-03-31 00:29
Forum: Feature Discussion (Archived)
Topic: Hashing slow down computer too much
Replies: 37
Views: 14582

Works great for me: 39,081 files, 438.95 GB. I'm not a huge sharer, but I am a dev. Hashing my share takes overnight, but even when it's not timed well, I can use my computer just fine.
So I take it you too are running about 80 or so processes normally? (This is far from the problem. I run two direct conne...
by dpm
2004-03-30 23:22
Forum: Feature Discussion (Archived)
Topic: Hashing slow down computer too much
Replies: 37
Views: 14582

TheParanoidOne wrote: Um, no. Hashing is not doing anything wild and wacky to your disk. It is reading the contents of a file, doing a calculation and then moving on to the next file.
I'm sure this is fine for all of you users who share fewer than 19,982 files with a total size of 271 GB.
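
For reference, the read-calculate-advance loop TheParanoidOne describes has this shape. The sketch below uses a placeholder FNV-1a digest, whereas DC++ actually computes Tiger tree hashes:

[code]
#include <cstdint>
#include <fstream>
#include <iostream>
#include <vector>

// Placeholder digest (FNV-1a) standing in for the real hash; the loop
// shape (read a chunk, fold it into the state, repeat) is the same.
std::uint64_t hashFile(const char* path) {
    std::ifstream in(path, std::ios::binary);
    std::uint64_t h = 1469598103934665603ull;   // FNV offset basis
    std::vector<char> buf(64 * 1024);           // 64 KB read chunks
    while (in.read(buf.data(), static_cast<std::streamsize>(buf.size()))
           || in.gcount() > 0) {
        for (std::streamsize i = 0; i < in.gcount(); ++i) {
            h ^= static_cast<unsigned char>(buf[i]);
            h *= 1099511628211ull;              // FNV prime
        }
    }
    return h;
}

int main(int argc, char** argv) {
    // "Read the contents of a file, do a calculation, move on to the next."
    for (int i = 1; i < argc; ++i)
        std::cout << argv[i] << " -> " << std::hex << hashFile(argv[i]) << '\n';
}
[/code]

The loop itself is as tame as TheParanoidOne says; the complaints in this thread are about volume, since hundreds of GB of reads plus per-byte hash work adds up to hours of sustained CPU and disk load.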