Displaying dupes plzzz
Why doesn't DC++ display the dupes that it removes from my sharing list?
3.14 times smaller than his shadow ...
The purpose of displaying these files is obvious! I have lots of files, and displaying dupes would be a good way to tidy my drives.
And I don't think it would be difficult to add this function to DC++: since it removes the dupes, the filenames must be known somewhere, so adding them to a list shouldn't be hard, should it?
I don't like having the same files in different directories. I don't want a "fake" share; even if the size is subtracted, the files still exist ...
Please understand me. I've probably made some mistakes; I'm still learning English ...
BSOD2600 wrote: Why do you want to see them? What useful purpose would it serve? The only thing I would find useful is some notification somewhere that says how many dups it found and removed.
I think that is what he means: he wants DC++ to tell him where the files are so he can manually remove them. I don't think he wants to show them in his share.
-
- Posts: 184
- Joined: 2003-05-26 11:29
- Location: UK
-
- DC++ Contributor
- Posts: 3212
- Joined: 2003-01-07 21:46
- Location: .pa.us
There are some changes in 0.307 that make it a little easier to add "system log"-like messages. I'll code this and submit it with one of my normal patches to arne.
(In the future, please don't use "easy" in a request for a feature... Essentially calling someone lazy [whether rightfully or wrongfully] is a poor way to motivate them to do something for you for free.)
It would be a bad idea to automatically remove dupes.
Let me give some examples.
Say you have a game, like a car racing game, and you share the whole install directory as it appears on the install CD. The CD has multiple directories with wave files that are needed during the game, and a wave file may well appear more than once in different directories. If DC++ deletes all the dupes, the game can end up corrupted because it is missing files.
Another example: you have a lot of complete MP3 albums, but you have also made some personal compilations. Chances are DC++ first finds an MP3 in a compilation folder and then deletes the same MP3 from every separate album it appears on. That way you get a lot of incomplete albums (which is very annoying for a user who finds out a track is missing from an album while the user he downloaded it from is already offline).
-
- Forum Moderator
- Posts: 1420
- Joined: 2003-04-22 14:37
TheParanoidOne wrote: Cool. Generic logging system = good. Can you please point out what part of the code it's in?
The basic functionality is LogManager::getInstance()->message(string);
In BCDC's code, I extended it to keep 10 of the messages in the tooltip... I may reduce that and add a SystemLog window (that only remembers messages while it's open)... I'm not sure yet. I also inserted a simple logging line right before the fire() in message(), dependent upon a boolean setting.
System Log: Good Idea !
The logging is a good idea!
Back to my question: a solution would be to put the filenames of the removed dupes into the system log, instead of creating a new window just for this feature ...
Re: System Log: Good Idea !
PI wrote: The logging is a good idea! Back to my question: a solution would be to put the filenames of the removed dupes into the system log instead of creating a new window just for this feature ...
Hashing is included in the log, as are starting/ending a file list refresh and disconnecting users who leave the hub.
Code:
C:\Program Files\DC++\Logs>tail system.log
2004-02-22 07:52:18: File list refresh finished
2004-02-22 08:53:07: File list refresh initiated
2004-02-22 08:53:18: File list refresh finished
2004-02-22 09:46:46: Disconnected user leaving the hub: Ichitaka_Seto
2004-02-22 09:54:07: File list refresh initiated
2004-02-22 09:54:30: File list refresh finished
2004-02-22 10:55:07: File list refresh initiated
2004-02-22 10:55:25: File list refresh finished
2004-02-22 11:56:07: File list refresh initiated
Ok, the addition of knowing what dupes are being removed is nice, but it still could be improved a little.
That's what I get in the log file. It sure would be nice if it told which other file it's a dup of, and the full path of each.
Code:
2004-03-12 10:48:45: Duplicate file will not be shared: Acdc - Problem child.mp3 (size: 5484853 bytes)
2004-03-12 10:48:45: Duplicate file will not be shared: Acdc - Let there be rock.mp3 (size: 5852721 bytes)
2004-03-12 10:48:45: Duplicate file will not be shared: Acdc - Hell ain't a bad place to be.mp3 (size: 4008696 bytes)
2004-03-12 10:48:45: Duplicate file will not be shared: Acdc - Bad boy boogie.mp3 (size: 4311703 bytes)
2004-03-12 10:48:45: Duplicate file will not be shared: folder.jpg (size: 18033 bytes)
2004-03-12 10:48:45: Duplicate file will not be shared: folder.jpg (size: 18033 bytes)
2004-03-12 10:48:46: Duplicate file will not be shared: Maynard Ferguson - 05 - Fiesta.mp3 (size: 7742754 bytes)
2004-03-12 10:48:46: Duplicate file will not be shared: Maynard Ferguson - 01 - Primal Scream.MP3 (size: 10255116 bytes)
2004-03-12 10:48:46: Duplicate file will not be shared: 03 - Maná - Cachito.mp3 (size: 4616280 bytes)
BSOD2600 wrote: That's what I get in the log file. It sure would be nice if it told which other file it's a dup of and the full path of each.
Full path? Ok, that can be done (though I haven't written the code). The loop is fairly simple currently, so I'm not sure how you can print "the original" file just once if you have multiple duplicates of it shared...
It just saves me the hassle of using the crappy Windows search to try to find a file with the same name. Does DC++ think files are dups if they are the exact same byte size, or does it look at the filenames too?
Removing the file sort of creates a problem for those who are downloading file sets (ok, an album in my case), since DC++ has now removed one track from it. Yes, they can do a search for that specific missing track, but that creates more hassle.
BSOD2600 wrote: It just saves me the hassle of using crappy windows search to try and find the same named file. Removing the file sorta creates a problem for those who are downloading file sets (ok, an album in my case) since DC++ has now removed one track from it. Yes, they can do a search for that specific missing track, but that creates more hassle.
Maybe there should be an option to leave the file in the list but just not count it towards your share total?
Twink wrote: Maybe there should be an option to leave the file in the list but just not count it towards your share total?
That's tongue in cheek, right?
Code:
SETTINGS_REMOVE_DUPES, // "Remove dupes completely from your share (otherwise, only their size is subtracted, but the files can be seen by others)"
GargoyleMT wrote: The dupe check works on exact name and filesize, though hopefully that can be eliminated in favor of hash-based duplicates. =)
Could this be extended so that downloading a file which I already have would give a duplicate error:
"Error downloading: You already have this file".
The comparison would be based on the hash, of course.
YaRi wrote: "Error downloading: You already have this file".
Agreed - check out this RFE for that:
[ 896494 ] Eradicate duplicate downloads
Thanks
Thank you all for your help!
I'm looking forward to seeing these changes ...
+