shell script to clean old oracle trace and log files

This code cleans up old trace files, log files, core dumps, etc. It is designed to be run from cron. It takes a somewhat brutal approach, deleting files after just 7 days; that is fine for dev/test servers, but in production you would probably want to modify this to keep files for longer.

For this example all oracle files of interest were under the directory "/ora"; that would need to be changed to suit other sites.
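For example, to run it nightly at 2:30am from the oracle user's crontab, an entry like the one below would do. The script path /home/oracle/bin/clean_ora_logs.sh is just an assumed example name here; adjust it to wherever you save the script.

# run the oracle trace/log cleanup every night at 02:30, keeping the output for troubleshooting
30 2 * * * /home/oracle/bin/clean_ora_logs.sh > /tmp/clean_ora_logs.out 2>&1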

# 1) Remove old oracle-owned aud, trc, trw, core files
#    Also remove old recv error files (these have format ".err_[0-9]")
#    Also remove access_log., error_log., emoms.log. files - which are generated in large numbers
#    Only core files named "core.<number>" are removed, because many other files with "core" in their name are not core dumps but are files used by oracle.

echo "*** REMOVE OLD ORACLE AUDIT FILES and OLD RECV Error Files and OLD TRACE FILES ***"
find /ora -mtime +7 -user oracle \( \
  -name '*.aud' -o \
  -name '*.trc*' -o \
  -name '*.trw' -o \
  -name 'core.[0-9]*' -o \
  -name '*.err_[0-9]*' -o \
  -name 'access_log.[0-9]*' -o \
  -name 'error_log.[0-9]*' -o \
  -name 'emoms.log.[0-9]*' \) -exec rm {} \;

# 2) Cut down alert logs and listener logs, and also access_log and event_log (webcache/portal/etc.)
#    Old log files are ignored - only log files modified in the last 7 days are worked on.
#    Small log files are ignored - only files bigger than 3 MB (= 6144 blocks of 512 bytes) are worked on. 3 MB is approximately 30,000 lines of text.
#    Each file is trimmed in place (copy to a .tmp file, then tail the last 10000 lines back into the original) so the inode is preserved
#    and any process that still has the log open keeps writing to the same file.

echo "*** CUT DOWN THE ALERT LOGS and ORACLE LISTENER.LOG FILE ***"
for FILE in `find /ora -mtime -7 -size +6144 -user oracle \( \
  -name 'alert_*.log' -o \
  -name 'listener*.log' -o \
  -name 'access_log' -o \
  -name 'event_log' -o \
  -name 'http-web-access.log' -o \
  -name 'server.log' \)`
do
  echo "*** cutting $FILE ***"
  cp "$FILE" "$FILE.tmp"
  tail -n 10000 "$FILE.tmp" > "$FILE"
  rm "$FILE.tmp"
done

echo "*** CUT DOWN MESSAGES and WARN FILES  ***"
for FILE in `find /var/log/ -mtime -7 -size +6144 \( \
  -name 'messages' -o \
  -name 'warn'  \)`
do
  echo "*** cutting $FILE ***"
  cp "$FILE" "$FILE.tmp"
  tail -n 10000 "$FILE.tmp" > "$FILE"
  rm "$FILE.tmp"
done

#End of file.
February 26, 2010

  • At this time I am supporting Oracle 9i while data is being cleaned up before doing an upgrade to Oracle 11gR2. We are testing Oracle 11gR2. This is nice.
    This is what I was looking for!

    Thank you!

  • Is there a configuration parameter you can set in Oracle to only keep these files for a given number of days?

  • Thanks for the script, it's very useful.

    One little tip: find … -exec rm {} is slow, because it spawns a new rm process for every file. Also, the filenames can contain special characters (especially spaces) which make rm fail (rm a b.txt tries to remove "a" and "b.txt", not "a b.txt"). It's faster and safer to combine find and rm this way: find … -print0 | xargs -0 --no-run-if-empty rm.
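    Applied to the first find in the script above, that would look something like this (a sketch assuming GNU find and xargs, which support -print0, -0 and --no-run-if-empty):

    find /ora -mtime +7 -user oracle \( \
      -name '*.aud' -o \
      -name '*.trc*' -o \
      -name '*.trw' -o \
      -name 'core.[0-9]*' -o \
      -name '*.err_[0-9]*' -o \
      -name 'access_log.[0-9]*' -o \
      -name 'error_log.[0-9]*' -o \
      -name 'emoms.log.[0-9]*' \) -print0 | xargs -0 --no-run-if-empty rm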
