
Re: Saving logfiles



At 02:15 PM 6/10/97 +0200, Karsten wrote:
>Hello,
>
>    Debian comes with an automated saving and rotating of logfiles.
>    This seems to be nice in most cases. We need to save logfiles
>    for a long time. Therefore we can't rotate them.
>
>    My question is if somebody has adapted cron and savelog scripts
>    to save actual logfiles with date extensions and move old
>    logfiles to a special directory.

I have done this, but not with savelog.  It's just as easy to write your
own scripts for this task.  Here is one that works with the Apache web
server [note: these are all custom setups, not the default Debian one].  We
run this daily at the stroke of midnight:

#!/bin/sh
# Name: rollover
#
# Description: Roll over log files for the Web server.

ROOTDIR=/infosys/www/danenet
LOGDIR=$ROOTDIR/logs
ARCHIVEDIR=$ROOTDIR/logs
LOGS="httpd error"

if [ "$1" = "yesterday" ] ; then
  DATE="yesterday"
else
  DATE="today"
fi
TIMESTAMP=`date +%Y-%m-%d -d "$DATE"`

"$ROOTDIR/bin/stop"
for logfile in $LOGS ; do
  mv "$LOGDIR/$logfile.log" "$ARCHIVEDIR/${logfile}${TIMESTAMP}.log"
done
"$ROOTDIR/bin/start"
for logfile in $LOGS ; do
  gzip -f "$ARCHIVEDIR/${logfile}${TIMESTAMP}.log"
done

======
Note that Apache has more than one log file, so I just designate a LOG
directory and consider every file with a .log extension to be a logfile.
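As an aside, the daily midnight run could be driven by a crontab entry along
these lines (the script path is hypothetical; passing "yesterday" stamps the
file with the day the entries actually belong to):

```shell
# min hour dom mon dow  command
0 0 * * * /infosys/www/danenet/bin/rollover yesterday
```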

This process gzips the logs.  Although we are using the same LOG directory
for storing the gzipped files, there is no reason you couldn't use another.
With gunzip -c logfile.gz | you can create a data stream for a report
generator that reads stdin; this works well with Analog.  And you can
easily select log files by date: a single day, a week, or a month are all
easily done with gunzip -c logfile$DATE*.gz.
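Here is a runnable sketch of that date-selection idea.  A temp directory
stands in for the real LOG directory, and wc -l stands in for Analog (or any
other generator that reads stdin):

```shell
# Build a couple of sample day files, the way the rollover script would.
LOGDIR=$(mktemp -d)
printf 'hit1\nhit2\n' > "$LOGDIR/httpd1997-06-09.log"
printf 'hit3\n'       > "$LOGDIR/httpd1997-06-10.log"
gzip "$LOGDIR"/httpd1997-06-*.log

# One glob selects the whole month; gunzip -c concatenates the days
# into a single stream on stdout, ready to pipe into a report tool.
hits=$(gunzip -c "$LOGDIR"/httpd1997-06-*.log.gz | wc -l)
echo "$hits"
rm -rf "$LOGDIR"
```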

Naturally, your report generators have to know where to find your gzipped
logs, as well as being able to read data from stdin, for this to work.
Analog, as noted, does this.  If you wanted to use this scheme for user
accounting, say with sac, you'd have to create a frontend that gunzips to a
temp file and then runs sac -f.
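Such a frontend might look like the sketch below.  This is my own wrapper,
not from the original setup; the sac invocation in the comment is only an
example of the kind of command you would hand it.

```shell
# Hypothetical wrapper: decompress one gzipped log to a temp file,
# run the given command on that file, then clean up.
run_on_gzipped() {
  gzfile=$1
  shift
  tmpfile=$(mktemp /tmp/acct.XXXXXX) || return 1
  gunzip -c "$gzfile" > "$tmpfile" || { rm -f "$tmpfile"; return 1; }
  "$@" "$tmpfile"          # e.g. sac -f "$tmpfile"
  status=$?
  rm -f "$tmpfile"
  return $status
}

# Usage (hypothetical path): run_on_gzipped /var/log/wtmp1997-03-10.gz sac -f
```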

> I'm also interested in a
>    mechanism to transmit old logfiles from a number of machines to
>    a centralized backup facility.

That would be a matter of doing 

	cp logfile$DATE*.gz $DESTINATION
	rm logfile$DATE*.gz

Obviously you'd have to play around with this according to your archival
policy.  Say you were going to do this every 3 months, archiving the files
more than 3 months old.  You can get at this date with GNU date (standard
on Linux) by

	date --date '-3 months'
 	Mon Mar 10 08:04:15 CST 1997

or use the datestamp I've shown above

	date +%Y-%m-%d --date '-3 months'
	1997-03-10

or perhaps you just want the year-month part

	date +%Y-%m --date '-3 months'
	1997-03

and so on.  The key here is that the date command does a lot of the hard
work for you; there's no need to deal with tricky calendar date conventions.
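Putting the pieces together, selecting the files older than that cutoff can
be a plain string comparison, since the YYYY-MM-DD stamps sort
lexicographically.  This helper is my own sketch, not part of the original
setup, and it assumes the httpdYYYY-MM-DD.log.gz names produced above:

```shell
# Hypothetical helper: print the archived logs whose embedded
# YYYY-MM-DD stamp falls before the given cutoff date.
older_than() {
  cutoff=$1; shift
  for f in "$@"; do
    stamp=${f#httpd}          # strip the log-name prefix
    stamp=${stamp%.log.gz}    # strip the suffix, leaving YYYY-MM-DD
    if [ "$stamp" \< "$cutoff" ]; then
      echo "$f"
    fi
  done
}

# e.g. older_than "$(date +%Y-%m-%d --date '-3 months')" httpd*.log.gz
```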


--
Dirk Herr-Hoyman <hoymand@danenet.wicip.org>
DANEnet, Connecting Dane County's Communities
http://danenet.wicip.org



