But what if you need to keep a history of more than 99 files, which is the maximum file count for WAS? There is also a maximum file size of 50 MB (personally I would recommend much smaller logs, around 20 MB). So there are two problems: the number of files and their size. First let's deal with the number of files. To get around the limit we can simply use cron to move the files to another directory every 5 minutes. This simple command will do the trick:
mv trace_* archive
It assumes that you have a separate directory /ibm/logs/Srv01/archive - ideally on another file system. WAS will then keep producing fresh trace_ files in the main log directory.
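Wrapped up as a standalone script it could look like this (a minimal sketch; the paths are the ones assumed above, and the function wrapper is only there so you can point it at a different directory):

```shell
#!/bin/sh
# Sketch of the 5-minute mover: shift WAS's rotated trace files into archive/.
# /ibm/logs/Srv01 is the directory assumed in the text; adjust for your server.
move_traces() {
    cd "${1:-/ibm/logs/Srv01}" || return 1
    # Only the rotated trace_* files are moved; the archive/ directory
    # is assumed to exist already (ideally on another file system).
    mv trace_* archive/
}
# Cron would simply call: move_traces /ibm/logs/Srv01
```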
OK, so we managed to get past the 99 files - now what about their size? If we want to handle more than 99 files of about 20 MB each per day, that gives us roughly 2 GB (99 × 20 MB) per application server, every single day! We have to compress them. It would also be nice to keep one archive per day, so if we want to inspect what happened on a given day we only have to copy a single bundle. The good news is that text files compress very nicely.
To do this we will also use a script, but a slightly more complex one. Here's what I did:
#!/bin/sh
# This script will compress all trace logs from yesterday in its working directory
cd /ibm/logs/Srv01/archive
echo '########## Started on ' $(date +%Y%m%d) >> logger.log
# Setting TZ 24 hours west of UTC makes date print yesterday's date
YESTERDAY=trace_`TZ=aaa24 date +%y.%m.%d`*
if ls $YESTERDAY > /dev/null 2>&1
then
    tar cvf - $YESTERDAY | gzip > trace_`TZ=aaa24 date +%Y%m%d`.tar.gz
    echo 'Logs successfully compressed, removing those files:' >> logger.log
    ls $YESTERDAY >> logger.log
    rm $YESTERDAY
else
    echo 'Failed compressing the logs!' >> logger.log
fi
OK, a quick explanation of this script: it enters the archive directory, then prepares a $YESTERDAY variable holding yesterday's date (the script will be triggered every day for the previous day's logs). Note that I also added a * sign, so this simple wildcard lets me list all files from a given day (trace_11.05.09*). You could shorten the script, but I wrote it this way to give you a step-by-step view of what it does. Now that we know which files we want to archive, we simply try to list them with the ls command; if that succeeds, it means there are files to be archived, so the script continues to tar them and gzip them into a single file. The last part is the deletion of the compressed files. You can also add another if instruction around the tar command to be sure the plain log files are removed only when the compression succeeds.
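That extra safety check could be sketched like this (a hypothetical variant with the logging lines dropped for brevity; the function wrapper just makes it easy to point at a directory):

```shell
#!/bin/sh
# Variant: remove the plain logs only if the tar | gzip pipeline succeeded.
compress_yesterday() {
    cd "${1:?archive directory}" || return 1
    YESTERDAY="trace_$(TZ=aaa24 date +%y.%m.%d)*"
    ARCHIVE="trace_$(TZ=aaa24 date +%Y%m%d).tar.gz"
    # Note: plain sh has no pipefail, so this 'if' only sees gzip's exit code.
    # (tar's verbose listing goes to stderr, silenced here.)
    if tar cvf - $YESTERDAY 2>/dev/null | gzip > "$ARCHIVE"
    then
        rm $YESTERDAY
    else
        # Compression failed: keep the plain logs, drop the broken archive.
        rm -f "$ARCHIVE"
        return 1
    fi
}
```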
The resulting archive will be named like this: trace_20110507.tar.gz
The script also writes to a logger.log file, to keep a history of what happened in case you ever want to check it - remember, this will run automatically!
The last task is to add two cron lines. The first one triggers the first script every 5 minutes; the second one triggers the second script at 2 AM every day (night time is usually the best time to put the servers to work).
0,5,10,15,20,25,30,35,40,45,50,55 * * * * /first_script.sh
0 2 * * * /second_script.sh
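The entries can be loaded from a file with the crontab command (a sketch; classic AIX cron has no */5 step syntax, which is why the minutes are spelled out):

```shell
# Write both entries to a file, then load it with: crontab was_log_cron
cat > was_log_cron <<'EOF'
0,5,10,15,20,25,30,35,40,45,50,55 * * * * /first_script.sh
0 2 * * * /second_script.sh
EOF
```

Keep in mind that `crontab was_log_cron` replaces the user's whole crontab, so merge these lines into any entries you already have.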
Hope you'll find it useful. I created those scripts on AIX, but they should also run on most Linux systems with little or no change.