[Feature Request] - Duplicate functionality to support a primary folder location, rather than be based on age #418
If I understand this correctly you may look into the rules:

```yaml
rules:
  - locations:
      - ~/Downloads
      - ~/Desktop
    subfolders: true
    filters:
      - duplicate:
          detect_original_by: last_seen
    actions:
      - echo: "Found dup: {duplicate}"
```

Now if you have a duplicate in your Downloads and your Desktop, the Desktop file is assumed to be the original.
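How location order interacts with `detect_original_by` can be pictured with a simplified Python sketch. This is an illustration of the idea only, not organize's actual code; `find_duplicates` and its return shape are assumptions made for this example:

```python
import hashlib
from pathlib import Path


def file_hash(path: Path) -> str:
    """Content hash used to group files as duplicates of each other."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def find_duplicates(locations, detect_original_by="first_seen"):
    """Walk locations in the given order and collect (duplicate, original) pairs.

    With "first_seen" the first file holding a given content hash stays the
    original; with "last_seen" the most recently walked file wins, so files in
    later locations are treated as originals.
    """
    seen = {}        # content hash -> current original
    duplicates = []  # (duplicate, original) pairs
    for location in locations:
        for path in sorted(Path(location).rglob("*")):
            if not path.is_file():
                continue
            digest = file_hash(path)
            if digest not in seen:
                seen[digest] = path
            elif detect_original_by == "last_seen":
                # The earlier file is demoted to duplicate.
                duplicates.append((seen[digest], path))
                seen[digest] = path
            else:  # first_seen
                duplicates.append((path, seen[digest]))
    return duplicates
```

With `["~/Downloads", "~/Desktop"]` and `last_seen`, a copy on the Desktop is walked last and therefore kept as the original, matching the behaviour described above.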
Hi @tfeldmann - thanks for responding. Unfortunately that approach only works in specific scenarios, and it makes a key assumption that whichever file came first is the one you want to keep. What I was hoping for was something location based, not time based, so that people can move files into a location where they want them preserved, and have any other instance actioned - e.g. identified, moved, deleted, etc.
"Which came first" is not time based but based on the order of the locations you provide. Organize walks through the locations in the given order, so files in the first location are always seen first.
Am I missing something here? Can you post an imaginary config to show how this should be specified?
I think the following is meant:

```yaml
rules:
  - locations:
      - ~/Desktop
      - ~/Downloads
    subfolders: true
    filters:
      - duplicate:
          detect_original_by: first_seen
    actions:
      - echo: "Found dup: {duplicate}"
```

Unfortunately, it doesn't quite solve the problem. I had hoped to be able to solve this simply using the following:

```yaml
rules:
  - name: Remove all files that we have in the archive
    locations:
      - ./
      - ~/.archive
    subfolders: true
    filter_mode: all
    filters:
      - duplicate
      - regex: '^\./.*$'
    actions:
      - delete
```

So it would be nice if you could enable filters like:

```yaml
rules:
  - name: Remove all files that we have in the archive
    locations:
      - ./
      - ~/.archive
    subfolders: true
    filter_mode: all
    filter_string: path
    filters:
      - duplicate
      - regex: '^\./.*$'
    actions:
      - delete
```

For example, you could use the
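Outside of organize, the master-location behaviour being asked for can be approximated with a small script: every file under a designated master folder is protected, and any content-identical copy found elsewhere is reported (and optionally deleted). This is a hedged sketch of the requested feature, not anything organize currently provides; `dedupe_against_master` and its parameters are names invented for this illustration:

```python
import hashlib
from pathlib import Path


def dedupe_against_master(master, other_locations, delete=False):
    """Treat every file under `master` as the protected original.

    Any file in `other_locations` whose content matches a protected file is
    considered a duplicate; it is returned, and removed when delete=True.
    Files under `master` are never touched.
    """
    protected = {
        hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(master).rglob("*")
        if p.is_file()
    }
    duplicates = []
    for location in other_locations:
        for path in sorted(Path(location).rglob("*")):
            if not path.is_file():
                continue
            if hashlib.sha256(path.read_bytes()).hexdigest() in protected:
                duplicates.append(path)
                if delete:
                    path.unlink()
    return duplicates
```

Because the master folder is only ever read, this avoids the "which came first" question entirely: priority is decided purely by storage location.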
Is your feature request related to a problem? Please describe.
Duplicates are a constant problem for me when people upload files, and I’ve tried to implement a process where there is always a primary folder containing the copy we should use. (Determined not by age, but by storage location.)
Describe the solution you'd like
To have the ability, when using duplicates, to specify a folder (including the hierarchy below it) to be the primary master storage location.
Describe alternatives you've considered
I looked at standalone duplicate tools like fdupes and jdupes, but they do not seem to have a facility to assign a location as the master (and therefore protect it from having duplicates deleted).
Additional context
Nothing more to add, I don’t think; it would be a great feature if possible.