Guest yvette.ye@gmail.com Posted August 23, 2007

Hi, I am monitoring a server in Performance Monitor, and here is what I found. There is a client/server application: client, app server, SQL server. When the client runs one function of the application, the CPU on the app server climbs to 25% and stays there for about 5 minutes before the client sees the result. During that period, the memory and disk counters on both the app server and the SQL server stay low; only the CPU on the app server holds at 25% for the full 5 minutes.

My first question is: why doesn't the CPU jump to 100% and finish in a shorter time?

As a further test, I had two clients run the same function. Surprisingly, the CPU still sat at 25%, but the waiting time doubled to 10 minutes. Why doesn't the CPU go up to 50% and finish in 5 minutes? Both the app server and the SQL server have dual CPUs.

My next question is: what is the bottleneck in this case, and how can I speed this application up by getting it to use more than 25% of the CPU for this function?

P.S. The application is Cognos, and the process name is PPDSWEB.exe. One further observation: for some other functions of the application, the CPU does go above 90%.

Thanks, fshguo.
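In case it helps to reproduce the arithmetic: here is a minimal Python sketch of the pattern described above. It assumes the box exposes four logical CPUs (e.g. two hyperthreaded CPUs, which would make one pegged thread read as ~25% of total CPU) and that the function in question runs on a single worker thread inside the server process; both are my assumptions, not anything confirmed about PPDSWEB.exe.

```python
# Sketch (assumptions: 4 logical CPUs, a single-threaded CPU-bound worker).
# One busy thread can never exceed 1/N of total CPU, and two requests
# queued behind one worker take twice the wall time at the same CPU level.
import os
import time
from concurrent.futures import ThreadPoolExecutor

def cpu_bound_job(seconds):
    """Spin on one core for roughly `seconds` of wall time."""
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass  # busy-wait: saturates exactly one logical CPU

n = os.cpu_count()
print(f"{n} logical CPUs -> one busy thread shows ~{100 / n:.0f}% total CPU")

# Serialize two "client requests" through a single worker, the way a
# single-threaded server function would:
with ThreadPoolExecutor(max_workers=1) as pool:
    start = time.monotonic()
    futures = [pool.submit(cpu_bound_job, 2) for _ in range(2)]
    for f in futures:
        f.result()
    elapsed = time.monotonic() - start
    print(f"2 requests, 1 worker: {elapsed:.1f}s "
          "(double the single-request time, CPU still ~1 core)")
```

If that assumption matches reality, Performance Monitor should show one pegged core rather than an even spread, and no amount of extra clients would push the process past one core's worth of CPU.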