Commit

docs: correct usage of default / initial value
sebffischer committed Aug 21, 2023
1 parent 949cef2 commit 8dd9ea2
Showing 50 changed files with 129 additions and 264 deletions.
4 changes: 2 additions & 2 deletions R/learner_LiblineaR_regr_liblinear.R
@@ -15,10 +15,10 @@
#' * `type = 12` – L2-regularized L2-loss support vector regression (dual)
#' * `type = 13` – L2-regularized L1-loss support vector regression (dual)
#'
#' @section Custom mlr3 defaults:
#' @section Initial parameter values:
#' - `svr_eps`:
#' - Actual default: `NULL`
#' - Adjusted default: 0.001
#' - Initial value: 0.001
#' - Reason for change: `svr_eps` is type dependent and the "type" is handled
#' by the mlr3 learner. The initial value is set to the default of the
#' respective "type".
9 changes: 3 additions & 6 deletions R/learner_dbarts_regr_bart.R
@@ -9,19 +9,16 @@
#' @template learner
#' @templateVar id regr.bart
#'
#' @section Initial parameter values:
#' @section Custom mlr3 parameters:
#' * Parameter: offset
#' * The parameter is removed, because only `dbarts::bart2` allows an offset during training,
#' and therefore the offset parameter in `dbarts:::predict.bart` is irrelevant for
#' `dbarts::dbart`.
#' * Parameter: nchain, combineChains, combinechains
#' * The parameters are removed as parallelization of multiple models is handled by future.
#'
#' @section Custom mlr3 defaults:
#' * Parameter: keeptrees
#' * Original: FALSE
#' * New: TRUE
#' * Reason: Required for prediction
#' @section Initial parameter values:
#' * `keeptrees` is initialized to `TRUE` because it is required for prediction.
#'
#' @references
#' `r format_bib("sparapani2021nonparametric", "chipman2010bart")`
4 changes: 2 additions & 2 deletions R/learner_flexsurv_surv_flexible.R
@@ -21,10 +21,10 @@
#' and covariates \eqn{X^T = (X_0,...,X_P)^T}{X^T = (X0,...,XP)^T}, where \eqn{X_0}{X0} is a column
#' of \eqn{1}s: \eqn{lp = \beta X}{lp = beta X}.
#'
#' @section Custom mlr3 defaults:
#' @section Initial parameter values:
#' - `k`:
#' - Actual default: `0`
#' - Adjusted default: `1`
#' - Initial value: `1`
#' - Reason for change: The default value of `0` is equivalent to, and a much less efficient
#' implementation of, [LearnerSurvParametric].
#'
13 changes: 3 additions & 10 deletions R/learner_gbm_classif_gbm.R
@@ -9,16 +9,9 @@
#' @template learner
#' @templateVar id classif.gbm
#'
#' @section Custom mlr3 defaults:
#' - `keep.data`:
#' - Actual default: TRUE
#' - Adjusted default: FALSE
#' - Reason for change: `keep.data = FALSE` saves memory during model fitting.
#' - `n.cores`:
#' - Actual default: NULL
#' - Adjusted default: 1
#' - Reason for change: Suppressing the automatic internal parallelization if
#' `cv.folds` > 0.
#' @section Initial parameter values:
#' - `keep.data` is initialized to `FALSE` to save memory.
#' - `n.cores` is initialized to 1 to avoid conflicts with parallelization through future.
#'
#' @references
#' `r format_bib("friedman2002stochastic")`
4 changes: 2 additions & 2 deletions R/learner_glmnet_surv_cv_glmnet.R
@@ -6,8 +6,8 @@
#' Generalized linear models with elastic net regularization.
#' Calls [glmnet::cv.glmnet()] from package \CRANpkg{glmnet}.
#'
#' @section Custom mlr3 defaults:
#' - `family` The default is set to `"cox"`.
#' @section Custom mlr3 parameters:
#' - `family` is set to `"cox"` and cannot be changed.
#'
#' @templateVar id surv.cv_glmnet
#' @template learner
4 changes: 2 additions & 2 deletions R/learner_glmnet_surv_glmnet.R
@@ -6,8 +6,8 @@
#' Generalized linear models with elastic net regularization.
#' Calls [glmnet::glmnet()] from package \CRANpkg{glmnet}.
#'
#' @section Custom mlr3 defaults:
#' - `family` The default is set to `"cox"`.
#' @section Custom mlr3 parameters:
#' - `family` is set to `"cox"` and cannot be changed.
#'
#' @details
#' Caution: This learner is different to learners calling [glmnet::cv.glmnet()]
20 changes: 10 additions & 10 deletions R/learner_lightgbm_classif_lightgbm.R
@@ -14,27 +14,27 @@
#' @templateVar id classif.lightgbm
#'
#' @section Initial parameter values:
#' * `convert_categorical`:
#' Additional parameter. If this parameter is set to `TRUE` (default), all factor and logical
#' columns are converted to integers and the parameter categorical_feature of lightgbm is set to
#' those columns.
#' * `num_class`:
#' This parameter is automatically inferred for multiclass tasks and does not have to be set.
#' @section Custom mlr3 defaults:
#' * `num_threads`:
#' * Actual default: 0L
#' * Adjusted default: 1L
#' * Initial value: 1L
#' * Reason for change: Prevents accidental conflicts with `future`.
#' * `verbose`:
#' * Actual default: 1L
#' * Adjusted default: -1L
#' * Initial value: -1L
#' * Reason for change: Prevents accidental conflicts with mlr messaging system.
#' @section Custom mlr3 defaults:
#' * `objective`:
#' Depending if the task is binary / multiclass, the default is set to `"binary"` or
#' Depending on whether the task is binary or multiclass, the default is `"binary"` or
#' `"multiclass"`.
#' @section Custom mlr3 parameters:
#' * `early_stopping`
#' Whether to use the test set for early stopping. Default is `FALSE`.
#' * `convert_categorical`:
#' Additional parameter. If this parameter is set to `TRUE` (default), all factor and logical
#' columns are converted to integers and the parameter categorical_feature of lightgbm is set to
#' those columns.
#' * `num_class`:
#' This parameter is automatically inferred for multiclass tasks and does not have to be set.
#'
#' @references
#' `r format_bib("ke2017lightgbm")`
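The `convert_categorical` behavior documented for the lightgbm learners above can be sketched in a few lines of base R. This is a hypothetical standalone helper, not the learner's actual implementation: factor and logical columns are recoded to integers, and their names are collected so they could be passed on as lightgbm's `categorical_feature` parameter.

```r
# Sketch of convert_categorical: recode factor/logical columns to integers
# and record their names for lightgbm's categorical_feature parameter.
convert_categorical = function(df) {
  is_cat = vapply(df, function(x) is.factor(x) || is.logical(x), logical(1))
  cat_cols = names(df)[is_cat]
  df[cat_cols] = lapply(df[cat_cols], function(x) as.integer(as.factor(x)))
  list(data = df, categorical_feature = cat_cols)
}

d = data.frame(x = factor(c("a", "b", "a")), y = c(TRUE, FALSE, TRUE), z = c(1.5, 2.5, 3.5))
res = convert_categorical(d)
# res$categorical_feature is c("x", "y"); res$data$x is c(1L, 2L, 1L)
```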
13 changes: 6 additions & 7 deletions R/learner_lightgbm_regr_lightgbm.R
@@ -14,23 +14,22 @@
#' @templateVar id regr.lightgbm
#'
#' @section Initial parameter values:
#' * `convert_categorical`:
#' Additional parameter. If this parameter is set to `TRUE` (default), all factor and logical
#' columns are converted to integers and the parameter categorical_feature of lightgbm is set to
#' those columns.
#' @section Custom mlr3 defaults:
#' * `num_threads`:
#' * Actual default: 0L
#' * Adjusted default: 1L
#' * Initial value: 1L
#' * Reason for change: Prevents accidental conflicts with `future`.
#' * `verbose`:
#' * Actual default: 1L
#' * Adjusted default: -1L
#' * Initial value: -1L
#' * Reason for change: Prevents accidental conflicts with mlr messaging system.
#'
#' @section Custom mlr3 parameters:
#' * `early_stopping`
#' Whether to use the test set for early stopping. Default is `FALSE`.
#' * `convert_categorical`:
#' Additional parameter. If this parameter is set to `TRUE` (default), all factor and logical
#' columns are converted to integers and the parameter categorical_feature of lightgbm is set to
#' those columns.
#'
#' @references
#' `r format_bib("ke2017lightgbm")`
9 changes: 3 additions & 6 deletions R/learner_obliqueRSF_surv_obliqueRSF.R
@@ -10,17 +10,14 @@
#' @template learner
#' @templateVar id surv.obliqueRSF
#'
#' @section Initial parameter values:
#' @section Custom mlr3 parameters:
#' - `mtry`:
#' - This hyperparameter can alternatively be set via the added hyperparameter `mtry_ratio`
#' as `mtry = max(ceiling(mtry_ratio * n_features), 1)`.
#' Note that `mtry` and `mtry_ratio` are mutually exclusive.
#'
#' @section Custom mlr3 defaults:
#' - `verbose`:
#' - Actual default: `TRUE`
#' - Adjusted default: `FALSE`
#' - Reason for change: mlr3 already has its own verbose set to `TRUE` by default
#' @section Initial parameter values:
#' - `verbose` is initialized to `FALSE`.
#'
#' @references
#' `r format_bib("jaeger_2019")`
9 changes: 3 additions & 6 deletions R/learner_randomForestSRC_classif_imbalanced_rfsrc.R
@@ -16,11 +16,8 @@
#' as `sampsize = max(ceiling(sampsize.ratio * n_obs), 1)`.
#' Note that `sampsize` and `sampsize.ratio` are mutually exclusive.
#'
#' @section Custom mlr3 defaults:
#' - `cores`:
#' - Actual default: Auto-detecting the number of cores
#' - Adjusted default: 1
#' - Reason for change: Threading conflicts with explicit parallelization via \CRANpkg{future}.
#' @section Initial parameter values:
#' - `cores` is initialized to 1 to avoid threading conflicts with explicit parallelization via \CRANpkg{future}.
#'
#' @templateVar id classif.imbalanced_rfsrc
#' @template learner
@@ -109,6 +106,7 @@ LearnerClassifImbalancedRandomForestSRC = R6Class("LearnerClassifImbalancedRando
save.memory = p_lgl(default = FALSE, tags = "train"),
perf.type = p_fct(levels = c("gmean", "misclass", "brier", "none"), tags = "train") # nolint
)
ps$values = list(cores = 1L)

super$initialize(
id = "classif.imbalanced_rfsrc",
@@ -154,7 +152,6 @@ LearnerClassifImbalancedRandomForestSRC = R6Class("LearnerClassifImbalancedRando
pv = self$param_set$get_values(tags = "train")
pv = convert_ratio(pv, "mtry", "mtry.ratio", length(task$feature_names))
pv = convert_ratio(pv, "sampsize", "sampsize.ratio", task$nrow)
cores = pv$cores %??% 1L

if ("weights" %in% task$properties) {
pv$case.wt = as.numeric(task$weights$weight) # nolint
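The `mtry.ratio` and `sampsize.ratio` conversions documented above both follow the same formula. A minimal base-R sketch (the helper name is illustrative; the learners use `convert_ratio` internally):

```r
# Resolve a ratio hyperparameter to an absolute count, mirroring
# mtry = max(ceiling(mtry.ratio * n_features), 1) and
# sampsize = max(ceiling(sampsize.ratio * n_obs), 1).
ratio_to_count = function(ratio, n) {
  max(ceiling(ratio * n), 1)
}

ratio_to_count(0.3, 10)  # 3 features out of 10
ratio_to_count(0, 10)    # floored at 1, never 0
```

The `max(..., 1)` guard matters: a ratio of 0 would otherwise request zero features or observations, which the upstream packages reject.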
13 changes: 5 additions & 8 deletions R/learner_randomForestSRC_classif_rfsrc.R
@@ -9,7 +9,7 @@
#' @template learner
#' @templateVar id classif.rfsrc
#'
#' @section Initial parameter values:
#' @section Custom mlr3 parameters:
#' - `mtry`:
#' - This hyperparameter can alternatively be set via the added hyperparameter `mtry.ratio`
#' as `mtry = max(ceiling(mtry.ratio * n_features), 1)`.
@@ -18,12 +18,8 @@
#' - This hyperparameter can alternatively be set via the added hyperparameter `sampsize.ratio`
#' as `sampsize = max(ceiling(sampsize.ratio * n_obs), 1)`.
#' Note that `sampsize` and `sampsize.ratio` are mutually exclusive.
#'
#' @section Custom mlr3 defaults:
#' - `cores`:
#' - Actual default: Auto-detecting the number of cores
#' - Adjusted default: 1
#' - Reason for change: Threading conflicts with explicit parallelization via \CRANpkg{future}.
#' @section Initial parameter values:
#' - `cores` is initialized to 1 to avoid threading conflicts with explicit parallelization via \CRANpkg{future}.
#'
#' @references
#' `r format_bib("breiman_2001")`
@@ -103,6 +99,8 @@ LearnerClassifRandomForestSRC = R6Class("LearnerClassifRandomForestSRC",
perf.type = p_fct(levels = c("gmean", "misclass", "brier", "none"), tags = "train") # nolint
)

ps$values = list(cores = 1)

super$initialize(
id = "classif.rfsrc",
packages = c("mlr3extralearners", "randomForestSRC"),
@@ -156,7 +154,6 @@ LearnerClassifRandomForestSRC = R6Class("LearnerClassifRandomForestSRC",
pv = self$param_set$get_values(tags = "train")
pv = convert_ratio(pv, "mtry", "mtry.ratio", length(task$feature_names))
pv = convert_ratio(pv, "sampsize", "sampsize.ratio", task$nrow)
cores = pv$cores %??% 1L

if ("weights" %in% task$properties) {
pv$case.wt = as.numeric(task$weights$weight) # nolint
3 changes: 2 additions & 1 deletion R/learner_randomForestSRC_regr_rfsrc.R
@@ -89,6 +89,8 @@ LearnerRegrRandomForestSRC = R6Class("LearnerRegrRandomForestSRC",
perf.type = p_fct(levels = "none", tags = "train")
)

ps$values = list(cores = 1)

super$initialize(
id = "regr.rfsrc",
packages = c("mlr3extralearners", "randomForestSRC"),
@@ -139,7 +141,6 @@ LearnerRegrRandomForestSRC = R6Class("LearnerRegrRandomForestSRC",
pv = self$param_set$get_values(tags = "train")
pv = convert_ratio(pv, "mtry", "mtry.ratio", length(task$feature_names))
pv = convert_ratio(pv, "sampsize", "sampsize.ratio", task$nrow)
cores = pv$cores %??% 1L

if ("weights" %in% task$properties) {
pv$case.wt = as.numeric(task$weights$weight) # nolint
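The recurring code change in this commit — dropping `cores = pv$cores %??% 1L` from the train methods and adding `ps$values = list(cores = 1)` at construction — can be sketched in base R. `%??%` is mlr3misc's null-coalescing operator, redefined here so the sketch is self-contained:

```r
# Null-coalescing operator as used in the removed lines:
# return lhs unless it is NULL, else rhs.
`%??%` = function(lhs, rhs) if (is.null(lhs)) rhs else lhs

# Old approach: the fallback lives in the train method and is
# invisible to the user inspecting the parameter set.
pv_old = list()                   # user never set cores
cores_old = pv_old$cores %??% 1L  # resolved only at train time

# New approach: the value is set once at construction, so it shows up
# in the parameter set and can be overwritten like any other value.
pv_new = list(cores = 1L)
cores_new = pv_new$cores
```

Both resolve to `1`, but with an initial value the user can see and override the setting before training, which is the distinction the renamed documentation sections draw between a "default" and an "initial value".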
3 changes: 2 additions & 1 deletion R/learner_randomForestSRC_surv_rfsrc.R
@@ -100,6 +100,8 @@ delayedAssign(
perf.type = p_fct(levels = "none", tags = "train")
)

ps$values = list(cores = 1)

super$initialize(
id = "surv.rfsrc",
packages = c("mlr3extralearners", "randomForestSRC", "pracma"),
@@ -149,7 +151,6 @@ delayedAssign(
pv = self$param_set$get_values(tags = "train")
pv = convert_ratio(pv, "mtry", "mtry.ratio", length(task$feature_names))
pv = convert_ratio(pv, "sampsize", "sampsize.ratio", task$nrow)
cores = pv$cores %??% 1L

if ("weights" %in% task$properties) {
pv$case.wt = as.numeric(task$weights$weight) # nolint
7 changes: 2 additions & 5 deletions R/learner_ranger_surv_ranger.R
@@ -12,11 +12,8 @@
#' as `mtry = max(ceiling(mtry.ratio * n_features), 1)`.
#' Note that `mtry` and `mtry.ratio` are mutually exclusive.
#'
#' @section Custom mlr3 defaults:
#' - `num.threads`:
#' - Actual default: `NULL`, triggering auto-detection of the number of CPUs.
#' - Adjusted value: 1.
#' - Reason for change: Conflicting with parallelization via \CRANpkg{future}.
#' @section Initial parameter values:
#' - `num.threads` is initialized to 1 to avoid conflicts with parallelization via \CRANpkg{future}.
#'
#' @templateVar id surv.ranger
#' @template learner
7 changes: 2 additions & 5 deletions R/learner_survivalmodels_surv_dnnsurv.R
@@ -17,11 +17,8 @@
#' The number of output channels should be of length `1` and number of input channels is
#' the number of features plus number of cuts.
#'
#' @section Custom mlr3 defaults:
#' - `verbose`:
#' - Actual default: `1L`
#' - Adjusted default: `0L`
#' - Reason for change: Prevents plotting.
#' @section Initial parameter values:
#' - `verbose` is initialized to 0.
#'
#' @references
#' `r format_bib("zhao2019dnnsurv")`
24 changes: 5 additions & 19 deletions R/learner_xgboost_surv_xgboost.R
@@ -8,25 +8,11 @@
#'
#' @template note_xgboost
#'
#' @section Custom mlr3 defaults:
#' - `nrounds`:
#' - Actual default: no default.
#' - Adjusted default: 1.
#' - Reason for change: Without a default construction of the learner
#' would error. Just setting a nonsense default to workaround this.
#' `nrounds` needs to be tuned by the user.
#' - `nthread`:
#' - Actual value: Undefined, triggering auto-detection of the number of CPUs.
#' - Adjusted value: 1.
#' - Reason for change: Conflicting with parallelization via \CRANpkg{future}.
#' - `verbose`:
#' - Actual default: 1.
#' - Adjusted default: 0.
#' - Reason for change: Reduce verbosity.
#' - `objective`:
#' - Actual default: `reg:squarederror`.
#' - Adjusted default: `survival:cox`.
#' - Reason for change: Changed to a survival objective.
#' @section Initial parameter values:
#' - `nrounds` is initialized to 1 so the learner can be constructed without an
#' error; the value should be tuned by the user.
#' - `nthread` is initialized to 1 to avoid conflicts with parallelization via \CRANpkg{future}.
#' - `verbose` is initialized to 0.
#' - `objective` is initialized to `survival:cox` for survival analysis.
#' @section Early stopping:
#' Early stopping can be used to find the optimal number of boosting rounds.
#' The `early_stopping_set` parameter controls which set is used to monitor the performance.
6 changes: 3 additions & 3 deletions inst/templates/learner_template.R
@@ -9,11 +9,11 @@
#' @section Initial parameter values:
#' FIXME: DEVIATIONS FROM UPSTREAM PARAMETERS. DELETE IF NOT APPLICABLE.
#'
#' @section Custom mlr3 defaults:
#' @section Custom mlr3 parameters:
#' FIXME: DEVIATIONS FROM UPSTREAM DEFAULTS. DELETE IF NOT APPLICABLE.
#'
#' @section Installation:
#' FIXME: CUSTOM INSTALLATION INSTRUCTIONS. DELETE IF NOT APPLICABLE.
#' @section Custom mlr3 parameters:
#' FIXME: INITIAL VALUES FOR PARAMETERS. DELETE IF NOT APPLICABLE.
#'
#' @templateVar id <type>.<key>
#' @template learner
2 changes: 1 addition & 1 deletion man/mlr_learners_classif.AdaBoostM1.Rd


2 changes: 1 addition & 1 deletion man/mlr_learners_classif.C50.Rd


2 changes: 1 addition & 1 deletion man/mlr_learners_classif.IBk.Rd


2 changes: 1 addition & 1 deletion man/mlr_learners_classif.J48.Rd


