evaluateModelPerformance {LedPred}		R Documentation
evaluateModelPerformance

Description:

The evaluateModelPerformance function computes precision and recall measures to evaluate the model through cross-validation, using the ROCR package.
Usage:

evaluateModelPerformance(data, cl = 1, valid.times = 10,
    feature.ranking = NULL, feature.nb = NULL,
    numcores = ifelse(.Platform$OS.type == "windows", 1,
                      parallel::detectCores() - 1),
    file.prefix = NULL, kernel = "linear", cost = NULL, gamma = NULL)
Arguments:

data: data.frame containing the training set
cl: integer indicating the column number corresponding to the response vector that classifies positive and negative regions (default = 1)
valid.times: integer indicating how many times the training set will be split for the cross-validation step (default = 10). This number must be smaller than the sizes of both the positive and negative sets.
feature.ranking: list of ordered features
feature.nb: the optimal number of features to use from the list of ordered features
numcores: number of cores to use for parallel computing (default: the number of available cores on the machine minus one; 1 on Windows)
file.prefix: a character string used as a prefix, followed by "_ROCR_perf.png", for the result plot file. If NULL (default), no plot is produced.
kernel: SVM kernel, a character string: "linear" or "radial" (default = "linear")
cost: the SVM cost parameter for both linear and radial kernels. If NULL (default), the function tunes this parameter internally.
gamma: the SVM gamma parameter for the radial kernel. If the kernel is radial and gamma is NULL (default), the function tunes this parameter internally.
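The arguments above describe a fold-based cross-validation of an SVM. The sketch below is an illustration only, not the LedPred implementation: it assumes an e1071-style SVM, a feature.ranking given as a character vector of column names of data, and positives coded as 1 in the response column cl.

## Minimal cross-validation sketch (illustration only, not the LedPred source).
## Assumptions: e1071 SVM, feature.ranking is a character vector of column
## names of 'data', and positives are coded as 1 in column 'cl'.
library(e1071)
cv.sketch <- function(data, cl = 1, valid.times = 10, feature.ranking,
                      feature.nb, kernel = "linear", cost = 1) {
    folds <- sample(rep(seq_len(valid.times), length.out = nrow(data)))
    keep  <- feature.ranking[seq_len(feature.nb)]        # top-ranked features
    probs <- labels <- vector("list", valid.times)
    for (k in seq_len(valid.times)) {
        train <- data[folds != k, , drop = FALSE]
        test  <- data[folds == k, , drop = FALSE]
        fit   <- svm(x = train[, keep], y = factor(train[[cl]]),
                     kernel = kernel, cost = cost, probability = TRUE)
        pred  <- predict(fit, test[, keep], probability = TRUE)
        probs[[k]]  <- attr(pred, "probabilities")[, "1"]  # P(positive class)
        labels[[k]] <- test[[cl]]                          # held-out classes
    }
    list(probs = probs, labels = labels)
}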
Value:

A list with two objects.
probs: the predictions computed by the model for each subset during the cross-validation
labels: the actual class for each subset
Examples:

data(crm.features)
data(feature.ranking)
#probs.labels.list <- evaluateModelPerformance(data = crm.features,
#    feature.ranking = feature.ranking, feature.nb = 50,
#    file.prefix = "test")
#names(probs.labels.list[[1]])
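The probs and labels components returned above have the shape that ROCR accepts for cross-validation runs. The snippet below is a hedged illustration, not part of the package's examples; it uses synthetic folds as placeholders for real output and draws a precision-recall curve averaged over the folds.

## Hedged illustration: per-fold probabilities and labels passed to ROCR.
## The random vectors stand in for the probs/labels components described
## under Value.
library(ROCR)
set.seed(1)
probs  <- replicate(10, runif(40), simplify = FALSE)               # one vector per fold
labels <- replicate(10, sample(c(0, 1), 40, replace = TRUE), simplify = FALSE)
rocr.pred <- prediction(probs, labels)
rocr.perf <- performance(rocr.pred, measure = "prec", x.measure = "rec")
plot(rocr.perf, avg = "threshold")   # precision-recall averaged over folds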