WebSphere Monitoring: wasstats.py to parse offline PerfServletApp XML output
Use the following process when you have offline XML files generated by the PerfServlet (perfServletApp application).
Objective: I use the perfServletApp application to retrieve WebSphere performance metrics. The data is returned in XML format (sample.xml), and I needed something to analyse that raw data.
While searching I stumbled upon the wasstats script at https://code.google.com/p/imakerobots/wiki/wasstats
For a Windows WebSphere Statistic (wsstat) version, visit the article and scroll down to "Figure 4. The WebSphere Statistic tool", then download the Windows batch script.
“Currently, this script only processes JVM Runtime, Session Manager, JDBC Connection Pool and Thread Pool statistics ”
I created a sample.xml file using
"http://%Server_Name%:9080/wasPerfTool/servlet/perfservlet?node=Server1_Node01&server=server_member1&module=connectionPoolModule+jvmRuntimeModule+servletSessionsModule+systemModule+threadPoolModule"
sample.xml: the XML file used for this analysis.
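To reproduce this, here is a minimal Python sketch that fetches the servlet output and saves it as sample.xml. The hostname and output path are assumptions, and it assumes administrative security is off so the servlet needs no credentials:

import urllib2  # Python 2, matching /usr/bin/python on this RHEL host

host = 'Server1'  # assumption: replace with your server's hostname
url = ('http://' + host + ':9080/wasPerfTool/servlet/perfservlet'
       '?node=Server1_Node01&server=server_member1'
       '&module=connectionPoolModule+jvmRuntimeModule+servletSessionsModule'
       '+systemModule+threadPoolModule')

# fetch the PerfServlet XML and write it to disk for offline analysis
response = urllib2.urlopen(url)
out = open('sample.xml', 'w')
out.write(response.read())
out.close()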
When run against a WAS 6.1 ND deployment on RHEL with wasstats.py, I came across some errors:
[was61@Server1 permon]$ /usr/bin/python wasstats.py sample.xml
Processing: sample.xml
INFO|server_member1:Server1_Node01:SuperSnoop#SuperSnoopWeb.war:RangeStatistic:0:0.0:1283788583899:0:0.0:LiveCount:1283528346720:N/A:0
INFO|server_member1:Server1_Node01:perfServletApp#perfServletApp.war:RangeStatistic:0:0.0:1283788583899:0:0.0:LiveCount:1283528346768:N/A:0
INFO|server_member1:Server1_Node01:prsysmgmt_war#prsysmgmt.war:RangeStatistic:0:0.0:1283788583899:0:0.0:LiveCount:1283528346376:N/A:0
CHECK|OK:HeapSize:Server1_Node01:server_member1:JVM Runtime:1048576/2097152:(50%)|1048576:0.0:1283788583896:1048576:1048576:0.0:1283528313727:2097152:1048576:KILOBYTE
INFO|server_member1:Server1_Node01:JVM Runtime:BoundedRangeStatistic:1048576:0.0:1283788583896:1048576:1048576:0.0:HeapSize:1283528313727:KILOBYTE:2097152:1048576
INFO|server_member1:Server1_Node01:JVM Runtime:CountStatistic:688987:1283788583896:UsedMemory:1283528313726:KILOBYTE
INFO|server_member1:Server1_Node01:JVM Runtime:CountStatistic:260270:1283788583896:UpTime:1283528313726:SECOND
INFO|server_member1:Server1_Node01:JVM Runtime:CountStatistic:0:1283788583899:ProcessCpuUsage:1283528313723:N/A
Traceback (most recent call last):
  File "wasstats.py", line 600, in ?
    analyze( xmldoc );
  File "wasstats.py", line 202, in analyze
    name = pool.getAttribute( 'name' );
UnboundLocalError: local variable 'pool' referenced before assignment
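The traceback occurs because pool is only assigned inside a loop over the matching statistic nodes; for a module with no statistics (such as an unused JDBC provider), the loop body never executes and the later pool.getAttribute('name') call fails. A minimal sketch of the kind of guard used in the revision (the function and element names are illustrative, not the exact wasstats.py code):

def find_pool_stats(module_element):
    pool = None
    # walk the module's child Stat elements; element name assumed from typical PerfServlet output
    for stat in module_element.getElementsByTagName('Stat'):
        pool = stat
        # ... collect statistics from this node ...
    if pool is None:
        # nothing to analyse for this module; report and skip instead of raising UnboundLocalError
        print('No statistics found: "%s"' % module_element.getAttribute('name'))
        return None
    name = pool.getAttribute('name')
    # ... continue analysis using name ...
    return name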
Here is a revised wasstats.py script, tested on a WAS 6.1 ND deployment on RHEL 5.5:
[was61@Server1 permon]$ /usr/bin/python wasstats.py sample.xml
Processing: sample.xml
INFO|server_member1:Server1_Node01:SuperSnoop#SuperSnoopWeb.war:RangeStatistic:0:0.0:1283788583899:0:0.0:LiveCount:1283528346720:N/A:0
INFO|server_member1:Server1_Node01:perfServletApp#perfServletApp.war:RangeStatistic:0:0.0:1283788583899:0:0.0:LiveCount:1283528346768:N/A:0
CHECK|OK:HeapSize:Server1_Node01:server_member1:JVM Runtime:1048576/2097152:(50%)|1048576:0.0:1283788583896:1048576:1048576:0.0:1283528313727:2097152:1048576:KILOBYTE
INFO|server_member1:Server1_Node01:JVM Runtime:BoundedRangeStatistic:1048576:0.0:1283788583896:1048576:1048576:0.0:HeapSize:1283528313727:KILOBYTE:2097152:1048576
INFO|server_member1:Server1_Node01:JVM Runtime:CountStatistic:688987:1283788583896:UsedMemory:1283528313726:KILOBYTE
INFO|server_member1:Server1_Node01:JVM Runtime:CountStatistic:260270:1283788583896:UpTime:1283528313726:SECOND
INFO|server_member1:Server1_Node01:JVM Runtime:CountStatistic:0:1283788583899:ProcessCpuUsage:1283528313723:N/A
CHECK|OK:PoolSize:Server1_Node01:server_member1:Default:5/100:(5%)|10:1.303999271E9:1283788583899:1:5:5.010175044932727:1283528313697:100:5:N/A
INFO|server_member1:Server1_Node01:Default:BoundedRangeStatistic:10:1.303999271E9:1283788583899:1:5:5.010175044932727:PoolSize:1283528313697:N/A:100:5
CHECK|CRITICAL:PoolSize:Server1_Node01:server_member1:HAManager.thread.pool:2/2:(100%)|2:5.2047867E8:1283788583899:1:2:1.9997628080374719:1283528313697:2:2:N/A
INFO|server_member1:Server1_Node01:HAManager.thread.pool:BoundedRangeStatistic:2:5.2047867E8:1283788583899:1:2:1.9997628080374719:PoolSize:1283528313697:N/A:2:2
CHECK|OK:PoolSize:Server1_Node01:server_member1:Message Listener:0/50:(0%)|10:0.0:1283788583899:10:10:0.0:1283528313697:50:0:N/A
INFO|server_member1:Server1_Node01:Message Listener:BoundedRangeStatistic:10:0.0:1283788583899:10:10:0.0:PoolSize:1283528313697:N/A:50:0
CHECK|OK:PoolSize:Server1_Node01:server_member1:Object Request Broker:0/50:(0%)|10:0.0:1283788583899:10:10:0.0:1283528313697:50:0:N/A
INFO|server_member1:Server1_Node01:Object Request Broker:BoundedRangeStatistic:10:0.0:1283788583899:10:10:0.0:PoolSize:1283528313697:N/A:50:0
CHECK|OK:PoolSize:Server1_Node01:server_member1:ProcessDiscovery:1/2:(50%)|1:2.5928954E8:1283788583899:1:1:0.9966172837551133:1283528414279:2:1:N/A
INFO|server_member1:Server1_Node01:ProcessDiscovery:BoundedRangeStatistic:1:2.5928954E8:1283788583899:1:1:0.9966172837551133:PoolSize:1283528414279:N/A:2:1
CHECK|OK:PoolSize:Server1_Node01:server_member1:SIBFAPInboundThreadPool:2/50:(4%)|4:5.20254968E8:1283788583899:1:4:1.9989522093993581:1283528320064:50:2:N/A
INFO|server_member1:Server1_Node01:SIBFAPInboundThreadPool:BoundedRangeStatistic:4:5.20254968E8:1283788583899:1:4:1.9989522093993581:PoolSize:1283528320064:N/A:50:2
CHECK|OK:PoolSize:Server1_Node01:server_member1:SIBFAPThreadPool:3/50:(6%)|4:7.60451275E8:1283788583899:1:4:2.9233778697579353:1283528456297:50:3:N/A
INFO|server_member1:Server1_Node01:SIBFAPThreadPool:BoundedRangeStatistic:4:7.60451275E8:1283788583899:1:4:2.9233778697579353:PoolSize:1283528456297:N/A:50:3
CHECK|OK:PoolSize:Server1_Node01:server_member1:SoapConnectorThreadPool:2/5:(40%)|3:5.20338761E8:1283788583899:1:3:1.999267779975942:1283528319233:5:2:N/A
INFO|server_member1:Server1_Node01:SoapConnectorThreadPool:BoundedRangeStatistic:3:5.20338761E8:1283788583899:1:3:1.999267779975942:PoolSize:1283528319233:N/A:5:2
CHECK|OK:PoolSize:Server1_Node01:server_member1:TCPChannel.DCS:4/20:(20%)|5:1.041063168E9:1283788583899:1:5:3.999951342641484:1283528314941:20:4:N/A
INFO|server_member1:Server1_Node01:TCPChannel.DCS:BoundedRangeStatistic:5:1.041063168E9:1283788583899:1:5:3.999951342641484:PoolSize:1283528314941:N/A:20:4
CHECK|OK:PoolSize:Server1_Node01:server_member1:WebContainer:8/100:(8%)|10:1.832855037E9:1283788583899:1:10:7.044842168064254:1283528414119:100:8:N/A
INFO|server_member1:Server1_Node01:WebContainer:BoundedRangeStatistic:10:1.832855037E9:1283788583899:1:10:7.044842168064254:PoolSize:1283528414119:N/A:100:8
No statistics found: "DB2 Universal JDBC Driver Provider"
No statistics found: "WebSphere embedded ConnectJDBC driver for MS SQL Server"
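Each CHECK line shows current/upperBound for a bounded statistic and the derived utilisation percentage (HeapSize 1048576/2097152 is 50%; HAManager.thread.pool at 2/2 is 100% and flagged CRITICAL). A minimal sketch of that calculation (the 80/90 thresholds are an assumption for illustration, not values taken from wasstats.py):

def classify(current, upper_bound, warn_pct=80, crit_pct=90):
    # utilisation of a bounded statistic as a percentage of its upper bound
    pct = 100.0 * current / upper_bound
    if pct >= crit_pct:
        return 'CRITICAL', int(pct)
    if pct >= warn_pct:
        return 'WARNING', int(pct)
    return 'OK', int(pct)

# classify(1048576, 2097152) -> ('OK', 50); classify(2, 2) -> ('CRITICAL', 100)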
This Nagios check_was configuration only worked when administrative security (DMGR -> Security -> Secure administration, applications, and infrastructure -> Enable administrative security) was switched off.
Where administrative security is enabled, you will be presented with an error message:
[was61@check_was-0.3]$ ./check_was -s connectionpool -w 80 -c 90 -p server_member1
ERROR -server_member1.truststore must be provided, check configuration
Screenshot: disabling administrative security in the admin console (http://oracledbasupport.co.uk/wp-content/uploads/2010/06/disableAdminSecurity.jpg)
Finally I managed to get it working with administrative security enabled. I had to recreate the SSL key files key.p12/trust.p12 and supply their passwords in check_was.servers.
Note: if you have N JVMs, copy the configuration lines below N times and change member2 to member3 ... memberN accordingly (see the sketch after the configuration for one way to generate these entries).
server_member2.hostname=localhost
server_member2.port=8882
server_member2.username=XXXXX
server_member2.password=XXXX
server_member2.securityenabled=true
server_member2.truststore=/opt/IBM/WebSphere/AppServer/profiles/Profile01/dmgr/config/cells/server1_Cell/trust.p12
server_member2.keystore=/opt/IBM/WebSphere/AppServer/profiles/Profile01/dmgr/config/cells/server1_Cell/key.p12
server_member2.truststorepassword=WebAS
server_member2.keystorepassword=WebAS
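As mentioned above, a small Python sketch that prints the extra check_was.servers entries for member2 through memberN. The port numbering is an assumption; adjust hostnames, ports and credentials to your topology:

# print check_was.servers entries for server_member2 .. server_memberN
N = 4  # total number of JVMs; adjust as needed

template = """server_member%(i)d.hostname=localhost
server_member%(i)d.port=%(port)d
server_member%(i)d.username=XXXXX
server_member%(i)d.password=XXXX
server_member%(i)d.securityenabled=true
server_member%(i)d.truststore=/opt/IBM/WebSphere/AppServer/profiles/Profile01/dmgr/config/cells/server1_Cell/trust.p12
server_member%(i)d.keystore=/opt/IBM/WebSphere/AppServer/profiles/Profile01/dmgr/config/cells/server1_Cell/key.p12
server_member%(i)d.truststorepassword=WebAS
server_member%(i)d.keystorepassword=WebAS"""

for i in range(2, N + 1):
    # assumes each member's SOAP connector port goes up by one from 8882; change to match your cell
    print(template % {'i': i, 'port': 8880 + i})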