by Darryl Gove, December 2011
As computer systems continue to become more powerful, application performance is emerging as a critical factor, and bad performance is increasingly considered to be application failure. Developers are now keenly aware that they must streamline critical sections of the source code, as well as locate programmatic errors and coding deficiencies, without affecting application accuracy. Oracle Solaris Studio includes the Performance Analyzer, which can help with these tasks.
To help you analyze applications, the Performance Analyzer provides several ways to view collected performance data, including data display at the function or load object level. You can control which metrics are shown, as well as the order in which they appear.
To use the Performance Analyzer, compile your application with any level of parallelization and optimization.
Note: See How to Optimize the Parallel Performance of Applications and How to Optimize the Serial Performance of Applications for information about parallelization and optimization.
To see source code, and to attribute time to lines of source code, you must also specify the -g (debug) compiler option.
Then, run the application using the collect command. The command can specify a PID, for example:
% collect -P <pid>
Or, you can use the collect command to launch the application with its parameters:
% collect <application> <parameters>
The collect command gathers experimental performance data during application execution, saving the data to an experiment file called test.1.er, which is used later during the analysis process. The collect command enables you to obtain several kinds of information about the run.
Once the experiment is complete, load the experiment data from the test.1.er file into the Performance Analyzer by using the command line or the File menu in the analyzer's GUI.
For example, to start the Performance Analyzer and load one or more experiment files from the command line, type the following command.
% analyzer <control-options> <experiment-file(s)>
The following sections describe some of the tabs in the Performance Analyzer GUI. For more information about these and other tabs, see the Oracle Solaris Studio Performance Analyzer documentation.
The Functions tab (Figure 1) shows a list of functions and their metrics. The metrics are derived from the data collected in the experiment. Metrics can be either exclusive or inclusive. Exclusive metrics represent usage within the function itself, while inclusive metrics represent usage within the function and all the functions it called.
Figure 1. The Functions tab helps you understand where time is being spent.
The Callers-Callees tab shows a selected function in a pane in the center, with callers of that function in a pane above and callees of that function in a pane below (Figure 2).
For the selected function, the attributed metric represents the time attributed to that function. For the callees, the attributed metric represents the portion of the callee's inclusive metric that is attributable to calls from the selected function.
Figure 2. The Callers-Callees tab shows attributed time related to a selected function.
If you compiled the code with the -g (debug) option, you can view the source code for a selected function with annotations about performance metrics for each source line along with compiler commentary (Figure 3).
Figure 3. The Source tab shows performance metrics for each source line.
The Disassembly tab (Figure 4) shows the assembly language view of the application. If you compiled the application with the -g (debug) option, the display interleaves the disassembly information with the source code.
Figure 4. The Disassembly tab shows disassembled code.
The Timeline tab (Figure 5) enables you to view the application timeline and call stack for selected events.
Figure 5. The Timeline tab graphically illustrates the application timeline and call stack.
For more information about Oracle Solaris Studio, please see the complete Oracle Solaris Studio product documentation at http://oracle.com/technetwork/server-storage/solarisstudio/documentation/oss123-docs-1357739.html.
Revision 1.0, 12/13/2011