Update – 02-MAY-2012
I have had cause to use this script again at work so thought I’d develop it further. This post has been updated to reflect the new version.
Recently at work I was engaged by one of our application support teams because application calls were failing with "ORA-00020: maximum number of processes (300) exceeded". We quickly identified that these errors were a symptom of another problem rather than a concern in themselves: the real issue was slow database calls caused by an infrastructure problem, and the ORA-00020 errors arose because the application was spawning new connections to try to work around the slow ones. Now, I don't know much about application servers, but I know this behaviour is not uncommon. The application team, however, could not find any indication of it in their logs, so it was up to me to prove it.
As far as I am aware there are two options for retrospectively proving this at a granular level (please share any others I've missed):
1) Query the database audit trail – DBA_AUDIT_SESSION
2) Mine the listener log
On this database no auditing was enabled so I was left with the listener log as my data source.
I knocked up a quick KSH/AWK script to get my data and then massaged it by various means. The output was quite pleasing, so over the recent long UK Easter holiday I Googled some Perl commands and had some fun with it. This post shares the script and an example chart from Microsoft Excel based on the incident described above.
The script can be downloaded from – llog-miner.pl
Below is an example of how to call the script; it works on *nix and Windows. The "-f" parameter is used to filter log entries on the date; this example matches any minute from "10:00" to "19:59" (the regexp matches any hour beginning with "1"). I've included a full usage map further down this post.
C:\>perl llog-miner.pl -excel -f "04-APR-2012 1" listener.log > llog.csv
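The core of what the script does can be sketched in a few lines of Python. This is not the Perl implementation, just an illustration of the technique, and it assumes the classic text-format listener log where each connection line starts with a "DD-MON-YYYY HH24:MI:SS" timestamp and contains "* establish *"; the function and sample lines are mine:

```python
import re
from collections import Counter

def connections_per_minute(lines, date_filter=""):
    """Count connection ("establish") attempts per minute in listener log lines,
    keeping only lines whose timestamp starts with date_filter."""
    counts = Counter()
    for line in lines:
        if not line.startswith(date_filter):
            continue
        # Capture the timestamp down to the minute, dropping the seconds,
        # and require an "* establish *" entry (a connection attempt).
        m = re.match(r"(\d{2}-[A-Z]{3}-\d{4} \d{2}:\d{2}):\d{2}.*\* establish \*", line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Hypothetical sample lines in the text listener log format.
sample = [
    "04-APR-2012 12:56:01 * (CONNECT_DATA=...) * establish * orcl * 0",
    "04-APR-2012 12:56:02 * (CONNECT_DATA=...) * establish * orcl * 0",
    "04-APR-2012 13:01:30 * service_update * orcl * 0",
]
print(connections_per_minute(sample, "04-APR-2012 1"))
# Only the two "establish" lines are counted, keyed on minute 12:56.
```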
This is the chart showing the logon spike: you can see the spike of over 250 connection attempts at 12:56.
The script offers a few other options, such as letting listener logs be "piped" into it so that multiple files can be processed in one command. Usage notes below.
C:\>perl llog-miner.pl -help
Usage: llog-miner.pl [-help] [-filter <string>] [-[no]excel] [-[no]host] <listener_log_path|->
 : -excel|noexcel      : Output dates in Excel friendly format (default is noexcel)
 : -help               : Print this help
 : -host|nohost        : Include host on program (default is to include)
 : -filter <string>    : A regexp to match beginning of log line. e.g. "23-DEC-2012 0" or "..-MAR-2013"
 : listener_log_path|- : Input either a valid listener log name or "-" to read STDIN
Examples:
 # Process listener.log looking for entries on the 21st and 22nd of Sept and format dates for Excel
 C:\>perl llog-miner.pl -filter "2-SEP-2011" -excel listener.log
 # Process two listener log files looking for entries on Jan 3rd between 20:00 and 21:59 and exclude the hostname
 $ cat listener.log listener_db.log | llog-miner.pl -filter "03-JAN-2012 2" -nohost -
An improvement I may make in the future is to expand the output to include minutes in which there were no connection attempts. I have drafted some code to fill in the gaps but have kept it to myself for now as it's too ugly to share (at the moment).
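For the curious, the gap-filling idea amounts to walking minute by minute from the first observed timestamp to the last and inserting zero counts. This Python sketch is my own illustration of that approach, not the drafted code mentioned above, and it assumes per-minute keys in the "DD-MON-YYYY HH24:MI" format:

```python
from datetime import datetime, timedelta

def fill_minute_gaps(counts):
    """Return a dict covering every minute between the first and last
    observed minute, filling unseen minutes with a zero count so a chart
    shows flat periods instead of jumping over them."""
    fmt = "%d-%b-%Y %H:%M"
    minutes = sorted(datetime.strptime(k, fmt) for k in counts)
    filled = {}
    t = minutes[0]
    while t <= minutes[-1]:
        # strftime gives "Apr"; upper() restores the listener log's "APR" style.
        key = t.strftime(fmt).upper()
        filled[key] = counts.get(key, 0)
        t += timedelta(minutes=1)
    return filled

print(fill_minute_gaps({"04-APR-2012 12:55": 3, "04-APR-2012 12:58": 1}))
# 12:56 and 12:57 appear with a count of 0.
```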
I hope someone finds this useful and, if not, well, I found it entertaining if nothing else 🙂