Updated: 2022-10-19 12:50:54
Is there a way to recursively find duplicate files in an Amazon S3 bucket? In a normal file system, I would simply use:
fdupes -r /my/directory
There is no "find duplicates" command in Amazon S3.
However, you could do the following: list all objects in the bucket, then compare each object's ETag (checksum) and Size. Any objects that share both the same ETag and the same Size would (extremely likely) be duplicate objects.
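As a sketch, the approach above could look like this in Python with boto3 (the AWS SDK for Python). The grouping logic is kept separate from the S3 listing so it can be tested without AWS credentials; `find_duplicates_in_bucket` and `group_duplicates` are illustrative names, not AWS APIs. Note that for multipart uploads the ETag is not a plain MD5, but two objects sharing both ETag and Size are still extremely likely to be identical.

```python
from collections import defaultdict

def group_duplicates(objects):
    """Group S3 object records by (ETag, Size).

    `objects` is an iterable of dicts shaped like the entries of a
    ListObjectsV2 "Contents" list. Returns the key lists of every
    group containing more than one object -- the likely duplicates.
    """
    groups = defaultdict(list)
    for obj in objects:
        groups[(obj["ETag"], obj["Size"])].append(obj["Key"])
    return [keys for keys in groups.values() if len(keys) > 1]

def find_duplicates_in_bucket(bucket_name):
    """List every object in the bucket (paginated) and report duplicates."""
    import boto3  # AWS SDK for Python; requires configured credentials
    s3 = boto3.client("s3")
    contents = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket_name):
        contents.extend(page.get("Contents", []))
    return group_duplicates(contents)
```

Unlike `fdupes`, this never downloads or hashes the files themselves; it relies entirely on the metadata S3 already stores, so it is fast and free of data-transfer cost.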