The following alphabetical list provides a convenient look-up for all the command line arguments used in automated regression testing.
| Option | Description |
| --- | --- |
|  | Print command line help on the standard output. |
|  | Pass the remainder of the command line (after -allArgs) to the program being launched. |
|  | Pass command line arguments to the target program. Can be used multiple times. |
|  | Load a previously created session into Performance Validator to act as the baseline session against which other sessions are compared. |
|  | Points to a file specifying the class and function hook options. |
|  | Turn data collection on or off. |
|  | Turn the collection of function timing on or off. |
|  | Turn the collection of line timing on or off. |
|  | Turn the collection of stdout on or off. |
|  | During session comparison, choose whether to use the percentage of the parent's execution time or the total time as the comparator. |
|  | Control whether each function's details are shown in an exported session comparison. |
|  | Obsolete and ignored if present. |
|  | Set the working directory in which the program is executed. |
|  | Force the Performance Validator user interface to be displayed during the test. |
|  | Points to a file listing the DLLs to be hooked for the test. |
|  | Never display dialog boxes in the target application being profiled. |
|  | Specify a runtime configuration option to the .Net runtime. |
|  | Specify whether you are launching a self-contained or framework-dependent .Net Core application. |
|  | Environment variables for the target program, specified as a series of name/value pairs. |
|  | Export the session data as an HTML file containing a call graph or a call tree when Performance Validator has finished collecting data from the target program. |
|  | Export the session data as an HTML or XML file when Performance Validator has finished collecting data. |
|  | Specify the file encoding for the exported file. |
|  | Set the description to be included in the exported HTML/XML. |
|  | Set the minimum overall percentage contribution that a function must have in order to be included in a call tree or call graph export. |
|  | Specify a plain text file listing file locations to be used during testing. See the format of the file below. |
|  | Print command line help on the standard output. |
|  | Hide the Performance Validator user interface during the test. |
|  | Obsolete and ignored if present. |
|  | Set the numeric (decimal) id of a process for Performance Validator to attach to. |
|  | Set the name of the process for Performance Validator to attach to. |
|  | Hide the target application during the test. |
|  | Show the target application during the test. |
|  | Show the target application maximized and activated. |
|  | Show the target application minimized and activated. |
|  | Show the target application minimized and not active. |
|  | Show the target application at its current size and position, but not activated. |
|  | Show the target application at its most recent size and position, but not activated. |
|  | Show the target application at its original size and position, and activated. |
|  | Points to a previously saved settings file to be used for the test. |
|  | Specify the full file system path to a service to monitor, including any extension. The service is started by external means, not by Performance Validator. |
|  | Obsolete and ignored if present. |
|  | Obsolete and ignored if present. |
|  | Set the number of sessions that can be loaded at once. |
|  | Obsolete and ignored if present. |
|  | Specify the full file system path of the executable target program to be started by Performance Validator, including any extension. |
|  | Specify the .Net Core DLL that identifies the program being monitored. Use in conjunction with -programToMonitorEXE. |
|  | Change which program the data is collected from, but not which process Performance Validator initially launches. |
|  | Specify the nth invocation of the program to monitor that is to have its data collected. |
|  | Automatically refresh the Analysis tab once a test is complete. |
|  | Automatically refresh the Call Tree tab once a test is complete. |
|  | Automatically refresh the Call Graph tab once a test is complete. |
|  | Automatically refresh the Statistics tab once a test is complete. |
|  | Save the session data when all data has been collected from the target program. |
|  | Compare two sessions, producing an HTML or XML report detailing any performance regressions and improvements. |
|  | Load a previously created session to be compared with the data from the session being recorded. |
|  | Environment variables for Performance Validator, specified as a series of name/value pairs. |
|  | Points to a previously saved settings file to be used for the test. |
|  | Force errors to be displayed using a message box when running from the command line. |
|  | Points to a file specifying the source files to be hooked for the test. |
|  | Obsolete and ignored if present. |
|  | Set the percentage threshold to be used when performing session comparisons. |
|  | Name a .Net Core DLL that identifies the process to wait for. Use in conjunction with -waitNameEXE. |
|  | Name a process that Performance Validator will wait for. |
|  | Specify how to hook Win32 API functions called from the target application. |
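As a sketch of how -allArgs behaves: everything after it is forwarded verbatim to the program being launched rather than interpreted by Performance Validator. A minimal Python launcher, assuming the 64-bit install path quoted below and a hypothetical target program that accepts -iterations and -quiet flags:

```python
import subprocess  # used to launch the profiler in a real run

# Install path as given at the end of this page; adjust for your machine.
VALIDATOR = (r"C:\Program Files (x86)\Software Verify"
             r"\Performance Validator x64\performanceValidator_x64.exe")

# Everything after -allArgs is passed through to the launched program.
# The target flags below (-iterations, -quiet) are hypothetical examples.
target_args = ["-iterations", "100", "-quiet"]
cmd = [VALIDATOR, "-allArgs"] + target_args

# On a machine with Performance Validator installed:
# subprocess.run(cmd, check=True)
```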
To run the 32-bit Performance Validator, run `C:\Program Files (x86)\Software Verify\Performance Validator x86\performanceValidator.exe`.
To run the 64-bit Performance Validator, run `C:\Program Files (x86)\Software Verify\Performance Validator x64\performanceValidator_x64.exe`.
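An automated test harness that targets both architectures can wrap the choice of binary in a small helper. A sketch, using the default install locations quoted above (they may differ on your machine):

```python
# Default install locations taken from the documentation above.
PATHS = {
    "x86": (r"C:\Program Files (x86)\Software Verify"
            r"\Performance Validator x86\performanceValidator.exe"),
    "x64": (r"C:\Program Files (x86)\Software Verify"
            r"\Performance Validator x64\performanceValidator_x64.exe"),
}

def validator_path(arch: str) -> str:
    """Return the Performance Validator executable for 'x86' or 'x64'."""
    if arch not in PATHS:
        raise ValueError(f"arch must be 'x86' or 'x64', got {arch!r}")
    return PATHS[arch]
```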