- Download a stable Eclipse distribution (e.g. Indigo)
- Install the SoPeCo-Libs feature (update site)
- Install the SoPeCo-Core feature (update site)
- (Re)Start Eclipse
The following instructions are based on "Installing R statistics tool for usage with PCM" at http://sdqweb.ipd.kit.edu/wiki/PCM_Installation.
- Install R (http://www.r-project.org/)
- Open R and use the Packages menu and its Install package(s) entry to install the packages rJava and plotrix. If you are behind a proxy, you need to configure R accordingly; see the SoPeCo troubleshooting page.
- Set the R_HOME environment variable to the R installation directory
- Add the bin directory of your R installation to your system's PATH environment variable.
- On Windows, set the R_LIB environment variable to point to the win-library directory of R. It is either under the R installation or at [Your_Home_Dir]/Documents/R/win-library.
- Insert a line into the eclipse.ini file in your Eclipse installation directory that sets java.library.path to the directory containing the JRI native library (jri.dll on Windows). On Mac OS X, the eclipse.ini file is located under Eclipse.app/Contents/MacOS.
Example (Windows): depending on your installation, either of the following:
- -Djava.library.path=C:\Programme\R\R-2.12.0\library\rJava\jri
- -Djava.library.path=[Your_Home_Dir]\R\win-library\2.14\rJava\jri
Example (Windows 7, 64-bit): depending on your installation, either of the following (note the "x64" at the end of the path):
- -Djava.library.path=C:\Programme\R\R-2.12.0\library\rJava\jri\x64
- -Djava.library.path=[Your_Home_Dir]\R\win-library\2.14\rJava\jri\x64
Example (Mac OS X, multi-user R installation):
- -Djava.library.path=/Library/Frameworks/R.framework/Versions/Current/Resources/library/rJava/jri
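If you are unsure whether the environment variables and the library path are picked up correctly, a quick sanity check from a small Java program can help. This snippet is only illustrative and not part of SoPeCo; it simply prints the values the JVM sees:

public class REnvironmentCheck {
    public static void main(String[] args) {
        // R_HOME must point to the R installation directory
        System.out.println("R_HOME = " + System.getenv("R_HOME"));
        // java.library.path must contain the directory with the JRI native library
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
    }
}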
The Software Performance Cockpit is a tool for capturing the performance behaviour of software systems and components, inferring performance curves, and visualising measurement and analysis results in an Eclipse-based tool. Using SoPeCo for your own measurements comprises several implementation and configuration steps, which are described in the following. These include modelling the measurement experiment environment and the experiment configurations, implementing a software adapter which conducts the single measurements, and implementing a Cockpit-Starter which is basically a configuration of the required Cockpit-Adapters. In the following we illustrate all these steps by means of a concrete example.
In order to illustrate the idea of how to use the SoPeCo, we introduce a simple example. For the sake of simplicity, in this example we use a Fibonacci-Calculator as the 'Target System'. Assume that we have a system that calculates the Fibonacci function using two different implementations: recursive and iterative. Our goal is to analyze the response time of the system depending on the implementation and the input parameter. The Fibonacci function is defined as follows:
F(n) = F(n-1)+F(n-2), F(0) = 0, F(1) = 1
The target system supports the following implementations:
public static int fibonacciRecursive(int n) {
    if ((n == 1) || (n == 2)) {
        return 1;
    } else {
        return fibonacciRecursive(n - 1) + fibonacciRecursive(n - 2);
    }
}

public static int fibonacciIterative(int n) {
    int f1 = 0;
    int f2 = 1;
    for (int i = 1; i < n; i++) {
        int temp = f2;
        f2 = f1 + f2;
        f1 = temp;
    }
    return f2;
}
and offers the following interface:
/**
 * Calculates the Fibonacci function using either a recursive
 * or an iterative implementation.
 *
 * @param n input number
 * @param impType the String representation of an {@link ImplementationType} value.
 * @return value of the Fibonacci function for input n
 */
public static int fibonacci(int n, String impType) {
    final ImplementationType type =
            ImplementationType.valueOf(impType.toLowerCase());
    switch (type) {
    case iterative:
        return fibonacciIterative(n);
    case recursive:
        return fibonacciRecursive(n);
    default:
        return fibonacciRecursive(n);
    }
}
You can download the complete Java file from here: Fibonacci.java
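For instance, assuming the methods above are collected in a class named Fibonacci (as in the downloadable file), a small driver like the following exercises the interface; this snippet is illustrative only and not part of the tutorial projects:

public class FibonacciDemo {
    public static void main(String[] args) {
        // F(10) = 55, computed with the iterative implementation
        System.out.println(Fibonacci.fibonacci(10, "iterative"));
        // the same value, computed with the recursive implementation
        System.out.println(Fibonacci.fibonacci(10, "recursive"));
    }
}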
In general, this Fibonacci class stands for an arbitrary software system (e.g. a database system, a messaging middleware, etc.) with arbitrary input parameters. Furthermore, in order to perform the analysis, we do not have to know the implementation details of the system to be measured; thus, the system can be viewed as a black box. In this particular example, all we care about from the outside is that the implementation type is passed to the system as a parameter; we do not need to know how this parameter is handled internally.
For our Fibonacci example, we create a new Plugin-Project ('org.example.fibonacci.implementation') in the SoPeCo Eclipse, which represents the target system to be measured. The project contains the Fibonacci class, which offers a simple interface to calculate the Fibonacci function based on an input number and an implementation type. So that other projects can access the Fibonacci implementation project, the corresponding package has to be exported in the META-INF/MANIFEST.MF file.
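A minimal sketch of the corresponding export entry in META-INF/MANIFEST.MF, assuming the Fibonacci class resides in a package named org.example.fibonacci.implementation, could look like this (adjust the package name to your project layout):

Export-Package: org.example.fibonacci.implementation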
As the first step we have to specify our experiment environment. We will place this Configuration model into the same project which will later start the SoPeCo instance. Thus, we create a Plugin-Project called org.example.fibonacci.cockpitstarter. We will describe this project later; at this point we only need it as a place for our models, in a resource directory of the project. For this, we create a Source Folder 'rsc' inside the 'org.example.fibonacci.cockpitstarter' project and then an AdapterRepository model (as depicted in Figure 2) named 'Fibonacci.adapterrepository' inside 'rsc'.
In the newly created model we create a new Adapter Descriptor and call it 'Fibonacci Example Adapter' with the id 'FibonacciAdapter'. An adapter descriptor describes the configuration parameters as well as the dependent and independent parameters of the system to be measured. In our example we create one configuration parameter group with two parameters called 'InputNumber' and 'Implementation'. We specify the type of 'InputNumber' as INTEGER and set its default value to '1'. We then specify the type of 'Implementation' as STRING and set its default value to 'iterative'. Finally, we need to specify the dependent parameter: for this, we create the parameter 'ResponseTime' of type DOUBLE.
As the next step, we need a Configuration model, which specifies the measurement behavior. Thus, we create a Configuration model named 'FibonacciAnalysis.configuration', analogous to the creation of the AdapterRepository model. As the first Configuration child we create the Resource Environment describing the satellites used to measure the target systems. In our example we have only one satellite with Satellite ID 'FibonacciSatellite'. A satellite is deployed on the same platform/system as the target system to be measured and conducts the single measurements. Entire experiments are controlled by the SoPeCo instance, which is (normally) deployed on another system in order to reduce the influence of SoPeCo on the target system. The SoPeCo instance interacts with the satellite via Remote Method Invocation (RMI). Thus, we have to set the Hostname (IP address) and Port specifying the RMI access point of the satellite. For the sake of simplicity, in our example we deploy all components (the satellite, the SoPeCo instance and the target system) on the same machine. Thus we set the Hostname to 'localhost' and the Port to the standard RMI port '1099'. Finally, the satellite has to reference an adapter descriptor, in our case the 'Fibonacci Example Adapter'. However, we first have to load our Fibonacci.adapterrepository model into the newly created FibonacciAnalysis.configuration model.
The Measurement Specification is the second mandatory specification (all other specifications are optional and will be described later). A Measurement Specification consists of a set of Experiment Series. However, before we start defining experiment series, we need to expose the input/output parameters of our software adapter that we want to refer to (within the experiment definitions) using Probes. For our scenario, we create one Probe that includes all our input/output parameters.
As we want to compare the two Fibonacci implementations, we define two series ('FibonacciRecursive' and 'FibonacciIterative'), one for each implementation. We describe the Experiment Series specification by means of 'FibonacciRecursive'. For an Experiment Series, an exploration strategy has to be specified, describing how the parameter space is traversed. There are many different exploration strategies: Full Exploration, Adaptive Breakdown, Plackett-Burman, and so on. In our example we use the Full Exploration Strategy, which systematically explores the entire input parameter space. Then we create an Experiment Run Configuration, which describes one run of an experiment series. First, we have to point to the used adapter descriptor and the resource environment by creating an Experiment Run Start Info. Secondly, we have to specify a termination criterion for the experiment runs. There are two options: either the run is terminated after a certain number of measurements (Number Of Measurements Termination Criteria) or after a specified period of time (Measurement Time Termination Criteria). We use the first option with a number of measurements of 2. Finally, we must declare how the results of all measurements within a run are aggregated to single observation values for each observation parameter. For this, we define a Data Processing Step with the method Average.

Having specified the exploration strategy and the run configuration, the input parameter space has to be defined. For each parameter that should be varied, one can define a Variation (Linear Integer Variation, Linear Double Variation, ...) describing the value boundaries and the variation step for a referenced input parameter. Constants can be used to fix configuration parameters. In our example, we vary the 'InputNumber' parameter from 1 to 30 with a step width of 10 (see the sketch below). In the 'FibonacciRecursive' experiment series we set the 'Implementation' parameter to the constant value 'recursive'. The experiment series configuration for 'FibonacciIterative' is identical to that of 'FibonacciRecursive', except that the 'Implementation' constant is set to 'iterative'.
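As a rough illustration (plain Java, not SoPeCo API), the combination of a Linear Integer Variation (min 1, max 30, step 10) with the Full Exploration Strategy simply enumerates every value in that range, one experiment run configuration per value; whether the upper boundary itself is included depends on the variation semantics of your SoPeCo version:

// Illustration only: enumerate the input values a linear integer variation
// (min=1, max=30, step=10) would produce for the 'FibonacciRecursive' series.
public class FullExplorationSketch {
    public static void main(String[] args) {
        String implementation = "recursive"; // constant for this series
        for (int inputNumber = 1; inputNumber <= 30; inputNumber += 10) {
            // each combination corresponds to one experiment run configuration
            System.out.println("run: InputNumber=" + inputNumber
                    + ", Implementation=" + implementation);
        }
    }
}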
Now we are done with the basic configuration of the measurement. There are some additional options, which will be described later. For a first measurement run, this configuration is sufficient.
The main implementation part when using SoPeCo for performance analysis is the creation of the Software Adapter. The software adapter is the part which is deployed with the satellite to the target platform in order to conduct measurements on the target system. Thus, the software adapter must know how to access the target system, how to perform single measurements and how to capture the target metrics. We illustrate the idea of a software adapter by describing the implementation of the software adapter for our Fibonacci example.
First, we need a new Plugin-Project, which we call org.example.fibonacci.swadapter in our example. In this project we create an additional Source Folder 'rsc' into which we copy the 'Fibonacci.adapterrepository' file created in the previous section. We will use this file later. A software adapter has dependencies on the following plug-ins:
- org.sopeco.core.common
- org.sopeco.satellite.adapter.interfaces
Furthermore, our software adapter must access the Fibonacci implementation project, so it has an additional dependency on the project org.example.fibonacci.implementation. In order to resolve these dependencies we have to add them to the dependencies list of the META-INF/MANIFEST.MF in the 'org.example.fibonacci.swadapter' project, as depicted below.
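The corresponding figure is not reproduced here; assuming the dependencies are declared as OSGi Require-Bundle entries, the relevant part of the MANIFEST.MF could look roughly as follows (bundle names taken from the list above):

Require-Bundle: org.sopeco.core.common,
 org.sopeco.satellite.adapter.interfaces,
 org.example.fibonacci.implementation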
Now create a package org.example.fibonacci.swadapter with a class 'FibonacciAdapter' inside the 'src' folder. The class 'FibonacciAdapter' should extend the class 'AbstractExperimentRunStarter'. First, we define some String constants representing the parameter names and valid values.
public class FibonacciAdapter extends AbstractExperimentRunStarter {

    /**
     * Parameter specifying the response time which is the observed parameter
     * in this example.
     */
    public static final String RESPONSE_TIME = "ResponseTime";

    public static final String PROBE = "ProbeA";

    /**
     * This is the name of the input number parameter for the Fibonacci calculator.
     */
    public static final String INPUT_NUMBER_PNAME = "InputNumber";

    /**
     * This is the name of the implementation parameter for the Fibonacci calculator.
     */
    public static final String IMPLEMENTATION_PNAME = "Implementation";

    /**
     * Valid values of the implementation parameter (used by calculateFibonacci() below).
     */
    public static final String RECURSIVE = "recursive";
    public static final String ITERATIVE = "iterative";

    /**
     * Implementation type, captured at experiment series initialization.
     */
    private String implType;

    /**
     * Create a new adapter.
     */
    public FibonacciAdapter() {
        super("Fibonacci.adapterrepository", "FibonacciAdapter");
    }
Each sub-class of the AbstractExperimentRunStarter has to implement the following methods:
- prepareExperimentRun(List<ParameterValue<?>> parameterValueList): This method is called prior to each experiment run. In our example we use this method to initialise our target DataSet for each run. For this, we call the 'startRow()' method in order to initialise the first row. Then, for all parameters, the parameter values are set:
/**
 * Initialises the experiments with the given configuration.
 */
@Override
public void prepareExperimentRun(List<ParameterValue<?>> parameterValueList) {
    startRow();
    for (ParameterValue<?> pv : parameterValueList) {
        setConfigurationValue(pv.getParameter(), pv.getValue());
    }
}
- startExperimentRun(): This is the main method conducting the actual measurements for the current run. As input parameters do not change during an experiment run, we initially retrieve the value of the input parameter and save it as an Integer variable (inputValue). As we want to repeat the measurements several times for one run, we use a for-loop. The termination condition is determined by getNumberOfExperimentRuns(), which was specified in the Measurement Specification as the Number of Measurements Termination Criteria. We construct the dataset for each run by copying the previous row of each measurement. For the first measurement no previous row exists, thus we skip this step if i has the value 0. Then we call the Fibonacci calculation function, measuring the time this function requires for the calculation. Depending on the implementation type (implType) captured at the experiment series initialization, the method calculateFibonacci() calls either the recursive or the iterative algorithm. Having measured the response time, we set it in the target dataset for the run and finish the current dataset row.
/**
 * Starts the experiment run by calling the Fibonacci calculator with the
 * current value of the input parameter.
 */
@Override
public void startExperimentRun() {
    // get the current value of the configuration parameter "InputNumber"
    Parameter inputNumberParameter = ParameterRegistry
            .getParameterByName(INPUT_NUMBER_PNAME);
    ParameterValue<?> inputNumberParameterValue = builder()
            .getCurrentValue(inputNumberParameter);
    Integer inputValue = (int) inputNumberParameterValue.getValueAsDouble();

    // measure the current configuration n times
    // in order to get a confident response time value
    for (int i = 0; i < getNumberOfExperimentRuns(); i++) {
        // get last values of configuration and observation parameters
        if (i != 0) {
            builder().startRowCopy();
        }

        // measure the response time of the Fibonacci calculation for the
        // current input number
        long startTime = System.nanoTime();
        calculateFibonacci(inputValue.intValue());
        double responseTime = (System.nanoTime() - startTime) / 1000.0;

        // save the response time in the row of the current run
        builder().setValue(RESPONSE_TIME, responseTime);
        builder().finishRow();
    }
}

private void calculateFibonacci(Integer n) {
    if (implType.equalsIgnoreCase(RECURSIVE)) {
        FibonacciRecursive.fibonacciRecursive(n);
    } else if (implType.equalsIgnoreCase(ITERATIVE)) {
        FibonacciIterative.fibonacciIterative(n);
    } else {
        throw new IllegalArgumentException(
                "Unknown Fibonacci implementation type!");
    }
}
- abortExperimentRun(): This method is called by the experiment series controller if an experiment run is aborted. It can be used, for example, to free allocated resources. In our example we do not need this method, thus we leave its body empty:
@Override
public void abortExperimentRun() {
}
With this, the implementation of the software adapter and the project 'org.example.fibonacci.swadapter' is complete. Again, so that other projects can use our software adapter, we have to export the corresponding package in the MANIFEST file.
In the previous section we explained how to implement a software adapter, which performs the measurements. As already mentioned, a software adapter is generally deployed on the same platform as the target system, while the SoPeCo instance itself is deployed separately on another platform. SoPeCo then communicates with the software adapter using RMI. For this, we need a satellite starter which starts the RMI server and publishes the corresponding satellite controller so that the software adapter can be accessed via RMI.
Create a new Plugin-Project with the name org.example.satellite.start.fibonacci. In this project create a second source folder 'rsc' in which you create a file with the name fibonacciSatellite.properties. In this file we specify the id and location of our satellite. In our example we run both the SoPeCo instance and the satellite on the same platform, thus we set the host to 'localhost' and the port to '1099'. The id of our satellite is 'FibonacciSatellite'. We will use these properties when starting the satellite controller.
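The exact property keys expected by the SatelliteController depend on the SoPeCo version at hand; as a rough sketch (the key names below are assumptions, check the SoPeCo documentation for the actual keys), fibonacciSatellite.properties could look like this:

# Hypothetical key names; adjust to the keys expected by your SoPeCo version
satellite.id=FibonacciSatellite
satellite.host=localhost
satellite.port=1099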
A satellite starter project has the following dependencies:
- org.sopeco.satellite
- org.sopeco.satellite.swadapter.interfaces
- org.sopeco.satellite.interfaces
Additionally, the starter has to know the corresponding software adapter. Thus, in our example we also have a dependency on the project 'org.example.fibonacci.swadapter'. Since indirect dependencies cannot currently be resolved automatically, we need two additional dependencies (we are working on resolving this issue):
- org.eclipse.emf.ecore.xmi
- org.apache.log4j
Modify the dependencies in the META-INF/MANIFEST.MF file of the org.example.satellite.start.fibonacci project accordingly:
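The corresponding figure is omitted here; assuming OSGi Require-Bundle entries, the dependency section could look roughly like this (bundle names taken from the lists above):

Require-Bundle: org.sopeco.satellite,
 org.sopeco.satellite.swadapter.interfaces,
 org.sopeco.satellite.interfaces,
 org.example.fibonacci.swadapter,
 org.eclipse.emf.ecore.xmi,
 org.apache.log4j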
Now, in the src folder, create a new package org.example.satellite.start.fibonacci with a new class FibonacciSatelliteStarter:
public class FibonacciSatelliteStarter {

    SatelliteController satelliteController;

    public static void main(String argV[]) throws Exception {
        FibonacciSatelliteStarter satelliteStarter = new FibonacciSatelliteStarter();
        satelliteStarter.buildSatellite();
        satelliteStarter.startSatelliteServer();
    }

    private void startSatelliteServer() {
        satelliteController.startRmiServer();
    }

    public synchronized void buildSatellite() {
        // create SatelliteController instance
        satelliteController = new SatelliteController("fibonacciSatellite.properties");

        // create and bind SoftwareAdapter instances
        FibonacciAdapter swAdapter = new FibonacciAdapter();
        satelliteController.addSoftwareAdapter(swAdapter);
    }
}
The FibonacciSatelliteStarter class has a main method that creates a FibonacciSatelliteStarter instance, builds the satellite and starts it. The satellite controller is saved in an instance variable (satelliteController). For the creation of the satellite controller, the name of the properties file described before is passed to the constructor. Then an instance of our FibonacciAdapter is created and registered at the satelliteController. With the method startSatelliteServer the satellite is started and waits for instructions from the SoPeCo instance.
In section 'Creating Measurement Specification Models' we created the project org.example.fibonacci.cockpitstarter in order to put our configuration models into this project. In the same project we will create the CockpitStarter, which creates an instance of SoPeCo and initiates the measurements. First, we have to add some dependencies to the META-INF/MANIFEST.MF file of the org.example.fibonacci.cockpitstarter project. Every CockpitStarter project has at minimum the following dependencies:
- org.sopeco.core
- org.sopeco.experiment
- org.sopeco.experiment.interfaces
- org.sopeco.experimentseries
- org.sopeco.experimentseries.exploration.interfaces
- org.sopeco.maincontroller
- org.sopeco.satellite
- org.sopeco.satellite.interfaces
- org.sopeco.persistence
Furthermore, depending on the configuration in the measurement specification (*.configuration file), additional dependencies are needed. For our example we additionally need only the dependency org.sopeco.experimentseries.exploration.fullexploration, as we use the Full Exploration Strategy. Since indirect dependencies cannot currently be resolved automatically, we need the following additional dependencies (we are working on resolving this issue):
- org.apache.log4j
- org.apache.commons.csv
- org.eclipse.emf.ecore.xmi
- org.sopeco.export
- org.sopeco.export.interfaces
- org.sopeco.export.plugin.csv
- org.sopeco.emfutils
- org.sopeco.persistence.visualisation
Next, we create a package org.example.fibonacci.cockpitstarter in the source directory src, where we create the class FibonacciCockpitStarter with the following content:
public class FibonacciCockpitStarter {

    MainController mainController;

    public static void main(String[] args) throws Exception {
        if (args.length == 1) {
            FibonacciCockpitStarter cockpit = new FibonacciCockpitStarter();
            cockpit.buildCockpit();
            cockpit.runCockpit(args[0]);
        } else {
            throw new RuntimeException("This program requires"
                    + " exactly one argument (the path to the"
                    + " configuration file).");
        }
    }

    public void buildCockpit() {
        // create component instances
        this.mainController = new MainController();
        TreeRepository dataRepository = openRepository();
        ExperimentController experimentController;
        try {
            experimentController = new ExperimentController();
        } catch (RemoteException e) {
            throw new IllegalStateException(
                    "Cannot create experiment controller instance!", e);
        }
        SFExperimentSeriesInitializer experimentSeriesInitializer = new SFExperimentSeriesInitializer();
        FullExplorationStrategyImplementation fullExploration = new FullExplorationStrategyImplementation();

        experimentController.setDataRepository(dataRepository);
        fullExploration.setExperimentController(experimentController);
        experimentSeriesInitializer.setExperimentController(experimentController);
        experimentSeriesInitializer.addExplorationController(fullExploration);
        mainController.setDataRepository(dataRepository);
        mainController.setExperimentSeriesInitializer(experimentSeriesInitializer);
    }

    private TreeRepository openRepository() {
        TreeRepository dataRepository = new TreeRepository();
        File repositoryDir = new File("data");
        if (!repositoryDir.isDirectory()) {
            if (!repositoryDir.mkdir()) {
                throw new IllegalStateException("Could not create directory "
                        + repositoryDir.getAbsolutePath()
                        + " for dataRepository!");
            }
        }
        dataRepository.open(repositoryDir.getAbsolutePath());
        return dataRepository;
    }

    public void runCockpit(String pathToConfiguration)
            throws InvalidConfigurationException {
        mainController.start(pathToConfiguration);
    }
}
The main() method of the FibonacciCockpitStarter expects the absolute path of the created *.configuration file as an argument. This argument is passed when the FibonacciCockpitStarter is started (we describe this later).
If the number of arguments is correct, the main() method creates a FibonacciCockpitStarter instance, builds the cockpit and starts it. In the buildCockpit() method we have to create all components used by the cockpit and register them at the MainController. The MainController instance is the actual SoPeCo instance coordinating the measurements.
Thus, in the buildCockpit() method we first create a new instance of the MainController. Then we open a dataset repository by calling the method openRepository(). This requires the specification of a directory for the repository; in our case we use the relative path 'data', where all datasets are stored during the measurements. Next, we create instances of the ExperimentController, the ExperimentSeriesInitializer and the FullExplorationStrategyImplementation. For both the experimentController and the mainController, the dataRepository has to be set. The exploration strategy and the experimentSeriesInitializer must know the experimentController. Furthermore, the experimentSeriesInitializer must know the used exploration strategy instance. Finally, we have to pass the experimentSeriesInitializer instance to the mainController.
For advanced model configurations, the buildCockpit() method is more complex, as additional components like an analysisController, an exportController, etc. have to be created. These advanced features are described later.
Now we are ready to start the measurements. First, start the satellite: select the FibonacciSatelliteStarter class in the org.example.satellite.start.fibonacci project and then select Run As -> Java Application in the context menu. The console output should look like the one depicted in the next figure:
Next, select the FibonacciCockpitStarter class in the org.example.fibonacci.cockpitstarter project and go to Run As -> Run Configurations in the context menu. There, create a new Java Application configuration by double-clicking on the corresponding item. In the Main tab, enter org.example.fibonacci.cockpitstarter in the Project field and org.example.fibonacci.cockpitstarter.FibonacciCockpitStarter in the Main class field. Finally, in the Arguments tab, enter the absolute path to the FibonacciAnalysis.configuration file. Now the cockpit can be started by clicking the Run button. The measurements may take some time; a console notification is printed as soon as all experiment series are finished.
As we specified the DataRepository path as the relative path 'data', there should be a directory named 'data' in the org.example.fibonacci.cockpitstarter project after the measurements are finished (a refresh of the project might be required).
We can analyse the measured data using the SoPeCo visualization tool. To do so, go to Window -> Open Perspective -> Other and then select Persistence Visualization: