Hamilton Laboratories | Hamilton C shell 2012 | User guide | Samples




#  Look for duplicate files anywhere in a directory tree.

#  Copyright (c) 1990-2012 by Hamilton Laboratories.  All rights reserved.

#  It works by first constructing the list of all the path names, one per line,
#  of everything in the tree using the -r (recursive) option of ls.  The
#  ``...`` part is command substitution to paste the results into the foreach
#  loop, each line of output taken as a single word.  (This way, it works even
#  on filenames containing spaces and special characters.)  The :gt operator
#  means globally edit the list to trim each pathname down to just the tail
#  part; e.g., given "x\y\z.c", the tail is just "z.c".  (There are other
#  pathname editing operators for grabbing just the containing directory,
#  everything except the extension, the fully-qualified name for a relative
#  pathname, etc.)
#  The foreach loop writes each name out to the pipe, one per line.  (I've used
#  a calc statement rather than an echo in case a filename includes special
#  characters.)  The sort puts all the lines into alphabetical order and the
#  uniq -d command gives just the duplicates.

proc duplicat(startdir)
   local i
   foreach i (``ls -1r $startdir``:gt) calc i; end | sort | uniq -d
end

duplicat $argv
