discrete parameters with trafo break printing of log messages #167

Open

schiffner opened this issue Mar 14, 2017 · 4 comments

@schiffner

See below.
In discreteValueToName the transformed value is compared against the original (untransformed) parameter values, which triggers the error.

library(mlr)
options(error = recover)

# discrete parameter whose stored value (1) differs from its transformed value (2^1 = 2)
ps = makeParamSet(
  makeDiscreteParam("cost", values = 1, trafo = function(x) 2^x)
)
tuneParams("classif.svm", iris.task, resampling = cv3, control = makeTuneControlGrid(), par.set = ps)
# [Tune] Started tuning learner classif.svm for parameter set:
#          Type len Def Constr Req Tunable Trafo
# cost discrete   -   -      1   -    TRUE     Y
# With control class: TuneControlGrid
# Imputation value: 1
# Error in getIndex(par$values, x) : Value not found!

# Enter a frame number, or 0 to exit   

 # 1: tuneParams("classif.svm", iris.task, resampling = cv3, control = makeTuneControlGrid(), par.set = ps)
 # 2: sel.func(learner, task, resampling, measures, par.set, control, opt.path, show.info, resample.fun)
 # 3: evalOptimizationStatesTune(learner, task, resampling, measures, par.set, control, opt.path, show.info, xs, dobs = seq_along(xs), eols = NA, remov
 # 4: evalOptimizationStates(learner, task, resampling, measures, par.set, NULL, control, opt.path, show.info, states, dobs, eols, remove.nas, resample
 # 5: parallelMap(evalOptimizationState, dobs, states, level = level, more.args = list(learner = learner, task = task, resampling = resampling, measure
 # 6: mapply(fun2, ..., MoreArgs = more.args, SIMPLIFY = FALSE, USE.NAMES = FALSE)
 # 7: (function (learner, task, resampling, measures, par.set, bits.to.features, control, opt.path, show.info, dob, state, remove.nas, resample.fun) 
# {
 # 8: log.fun(learner, task, resampling, measures, par.set, control, opt.path, dob, state, NA, remove.nas, stage = 1)
 # 9: paramValueToString(par.set, x, show.missing.values = !remove.nas)
# 10: paramValueToString.ParamSet(par.set, x, show.missing.values = !remove.nas)
# 11: sprintf("%s=%s", pn, paramValueToString(p, val, show.missing.values, num.format))
# 12: paramValueToString(p, val, show.missing.values, num.format)
# 13: paramValueToString.Param(p, val, show.missing.values, num.format)
# 14: discreteValueToName(par, x)
# 15: getIndex(par$values, x)

# Selection: 14
# Called from: top level 
# Browse[1]> ls()
# [1] "getIndex" "ns"       "par"      "x"       
# Browse[1]> par$values
# $`1`
# [1] 1

# Browse[1]> x
# [1] 2
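
For illustration only (this is not ParamHelpers' actual getIndex, just a sketch of the mismatch shown above): the log printer receives the transformed value 2, while the parameter only stores the untransformed value 1, so the lookup fails.

stored.values = list(`1` = 1)                # what par$values holds above
x = 2                                        # transformed value handed to the log printer
which(sapply(stored.values, identical, x))   # named integer(0) -> "Value not found!"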
@berndbischl
Member

@schiffner why would you ever do that? It's a discrete param, just enumerate the values you want, log scale or normal.

Can we for sure simply remove the trafo arg for these params?
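
A sketch of the enumeration meant here (the concrete values are made up):

# enumerate the transformed values directly instead of attaching a trafo
ps = makeParamSet(
  makeDiscreteParam("cost", values = 2^(-2:2))
)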

@schiffner
Author

schiffner commented Mar 14, 2017

@schiffner why would you ever do that?

Because I can?! And the docs say I can?! And no one stopped me... :)
It "happened" to me because I switched between tuning methods from random to grid search and had used the trafo arg earlier when the param was still numeric ...

I'm also ok with removing the trafo arg.
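
(For context, the pattern that carried over looked roughly like this; the bounds are made up:)

# with random search the param was numeric, where a trafo is the natural choice
ps.numeric = makeParamSet(
  makeNumericParam("cost", lower = -2, upper = 2, trafo = function(x) 2^x)
)
# switching to grid search, the param became discrete but kept the trafo, hitting the error above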

@berndbischl
Member

Because I can?! And the docs say I can?! And no one stopped me... :)

Just trying to figure out whether it makes sense to remove the trafo ...
@mllg ?

@berndbischl
Member

@larskotthoff
