- Path: sparky!uunet!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!destroyer!ncar!vexcel!copper!slate!mbarkah
- From: mbarkah@slate.mines.colorado.edu (Ade Barkah)
- Newsgroups: comp.unix.admin
- Subject: Re: problems with a find command
- Message-ID: <1992Nov23.175601.54551@slate.mines.colorado.edu>
- Date: 23 Nov 92 17:56:01 GMT
- References: <EJH.92Nov23085733@khonshu.colorado.edu>
- Organization: Colorado School of Mines
- Lines: 37
-
- ejh@khonshu.colorado.edu (Edward J. Hartnett) writes:
- : I put a find command in my crontab file which I hoped would compress
- : files larger than 10 megabytes which haven't been accessed for 14
- : days. But here's what I got in my mail about it:
- : Your "cron" job
- :
- : /usr/bin/find / -size 1000000c -atime +14 -exec compress {}\;
- : ...
- : It looks like a good command to me! Can anyone help?
-
- You need a space between {} and \;. Do:
-
- /usr/bin/find / -size +10000000c -atime +14 -exec compress {} \;
-
- or
-
- /usr/bin/find / -size +10000000c -atime +14 -print | xargs compress
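-
- For reference, the whole crontab line would then look something like this
- (the schedule shown is only an example; keep whatever times you use now):
-
- 0 3 * * 0 /usr/bin/find / -size +10000000c -atime +14 -exec compress {} \;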
-
- Also make sure the size is 10 MB (10000000c), not the 1 MB you have now,
- and put a + in front of the number so find matches files *larger than*
- that size instead of files of exactly that size. Otherwise you'd be in
- for a big surprise.
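-
- A quick way to convince yourself, before letting cron loose on /, is to
- run both forms by hand on a scratch directory (the path here is only an
- example):
-
- /usr/bin/find /var/tmp -size 10000000c -print      (exactly 10000000 characters)
- /usr/bin/find /var/tmp -size +10000000c -print     (anything larger than that)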
-
- Also, I'm not sure running it recursively from the root directory
- is a good idea. You might accidentally compress some system
- file. In fact, it might not be a good idea to do anything of the
- sort.
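-
- If you do keep it automatic, a somewhat safer variant stays out of the
- system areas and skips files compress has already handled. The directory
- below is only an example; point it at wherever your users' home
- directories actually live:
-
- /usr/bin/find /home -type f ! -name '*.Z' -size +10000000c -atime +14 -exec compress {} \;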
-
- An alternative would be to print out the user files matching the above
- specifications, then ask each user to either remove or compress them
- if possible. Automatic compression might just zap a poor user's data
- file (say, if he has something which updates the data file monthly,
- and has been doing so for the last five years...). You be the judge.
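-
- A report-only version might be as simple as this (the recipient, the
- subject line, and the /home path are only placeholders; check that your
- mail program takes -s):
-
- /usr/bin/find /home -type f -size +10000000c -atime +14 -print | mail -s "large idle files" root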
-
-
- -Ade.
- --
- Internet : mbarkah@slate.mines.colorado.edu (NeXT Mailable)
- CompuServe: 74160,3404
-