
large files



I have a user that really likes to create files.  Then they don't clean them 
up.  We have already put a quota* on them, but unfortunately, their directory 
is so large and convoluted that they can't even figure out where all the 
disk space has gone.  Is there a sane way to generate a report showing the 
disk usage from a certain point on down, sorted by size?  Here's kinda what I 
mean:  for a standard user, I would just run 
'du /u/foo | sort -n | tail -20' and tell them to clean up whatever is there.  
However, I've let a du | sort -n run on this directory for over four hours 
before giving up in disgust.  It is almost 100 gigs of files, with at least 
four or five directories that have 20K to 30K+ files each (plus hundreds of 
other subdirs).  *And* it's on a filer, so there are .snapshot directories 
that du thinks it has to plow through, quintupling the amount of work.   I'd 
also like to make this into a weekly report, so that they can make it part of 
their Friday routine (let's go delete 10 gigs of data! Woohoo!).
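Roughly, this is the kind of report I'm after, sketched with GNU du
(--exclude and --max-depth are GNU extensions, and the paths here are
just placeholders for my actual tree):

```shell
#!/bin/sh
# Weekly "top offenders" disk-usage report that skips the filer's
# .snapshot trees.  A sketch assuming GNU du; TOP and OUT are examples.
TOP=${TOP:-/u/foo}
OUT=${OUT:-/tmp/du-report.txt}

# --exclude='.snapshot' keeps du out of the snapshot directories
# entirely, which is where most of the wasted traversal goes.
# --max-depth=2 trims the *report* to two directory levels so the
# sorted list stays readable; du still totals every file underneath.
du -k --exclude='.snapshot' --max-depth=2 "$TOP" 2>/dev/null \
    | sort -rn \
    | head -20 > "$OUT"
```

Dropped into cron on Friday mornings (something like
`0 6 * * 5 /usr/local/bin/du-report.sh`) and mailed to the user, that
would cover the weekly-routine part, if the traversal time is bearable.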

Ideas?  Other than killing them, of course, no matter how tempting that is...

*100 gigs!
-- 
MuMlutlitithtrhreeaadededd s siigngnatatuurere
D.A.Bishop


