Normalizes the data row-wise. This is a natural generalization of the "sign" function to higher dimensions.

`R6Class` object inheriting from `PipeOpTaskPreprocSimple`/`PipeOpTaskPreproc`/`PipeOp`.

PipeOpSpatialSign$new(id = "spatialsign", param_vals = list())

- `id` :: `character(1)`
  Identifier of resulting object, default `"spatialsign"`.
- `param_vals` :: named `list`
  List of hyperparameter settings, overwriting the hyperparameter settings that would otherwise be set during construction. Default `list()`.

Input and output channels are inherited from `PipeOpTaskPreproc`. The output is the input `Task` with all affected numeric features replaced by their normalized versions.

The `$state` is a named `list` with the `$state` elements inherited from `PipeOpTaskPreproc`.

The parameters are the parameters inherited from `PipeOpTaskPreproc`, as well as:

- `length` :: `numeric(1)`
  Length to scale rows to. Default is 1.
- `norm` :: `numeric(1)`
  Norm to use. Rows are scaled so that `sum(abs(x)^norm)^(1/norm) == length` for finite `norm`, or so that `max(abs(x)) == length` if `norm` is `Inf`. Default is 2.
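The scaling rule is plain arithmetic, so it can be sketched outside of R. The snippet below uses a hypothetical `spatial_sign` helper written in Python that mirrors the documented formula (it is an illustration, not the mlr3pipelines implementation); it covers the finite-`norm` case, the `Inf` case, and shows how the operation reduces to the sign function in one dimension:

```python
import math

def spatial_sign(x, length=1.0, norm=2.0):
    """Scale row x so that its `norm`-norm equals `length`.

    Hypothetical helper mirroring the documented scaling rule:
    sum(abs(x)^norm)^(1/norm) == length for finite norm,
    max(abs(x)) == length for norm == Inf.
    """
    if math.isinf(norm):
        scale = max(abs(v) for v in x)
    else:
        scale = sum(abs(v) ** norm for v in x) ** (1.0 / norm)
    return [v * length / scale for v in x]

row = [5.1, 3.5, 1.4, 0.2]
scaled = spatial_sign(row)
# Euclidean (norm = 2) length of the result is 1, up to float rounding
print(sum(v * v for v in scaled))

# with norm = Inf, the largest absolute entry becomes `length`
print(spatial_sign([1.0, -4.0, 2.0], norm=math.inf))

# in one dimension the operation reduces to the sign function
print(spatial_sign([-3.0]), spatial_sign([2.0]))  # -> [-1.0] [1.0]
```

In one dimension every nonzero value is mapped to ±`length`, which is why the operation can be read as a higher-dimensional generalization of `sign()`.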

Only methods inherited from `PipeOpTaskPreprocSimple`/`PipeOpTaskPreproc`/`PipeOp`.

Other PipeOps: `PipeOpEnsemble`, `PipeOpImpute`, `PipeOpTargetTrafo`, `PipeOpTaskPreprocSimple`, `PipeOpTaskPreproc`, `PipeOp`, `mlr_pipeops_boxcox`, `mlr_pipeops_branch`, `mlr_pipeops_chunk`, `mlr_pipeops_classbalancing`, `mlr_pipeops_classifavg`, `mlr_pipeops_classweights`, `mlr_pipeops_colapply`, `mlr_pipeops_collapsefactors`, `mlr_pipeops_colroles`, `mlr_pipeops_copy`, `mlr_pipeops_datefeatures`, `mlr_pipeops_encodeimpact`, `mlr_pipeops_encodelmer`, `mlr_pipeops_encode`, `mlr_pipeops_featureunion`, `mlr_pipeops_filter`, `mlr_pipeops_fixfactors`, `mlr_pipeops_histbin`, `mlr_pipeops_ica`, `mlr_pipeops_imputeconstant`, `mlr_pipeops_imputehist`, `mlr_pipeops_imputelearner`, `mlr_pipeops_imputemean`, `mlr_pipeops_imputemedian`, `mlr_pipeops_imputemode`, `mlr_pipeops_imputeoor`, `mlr_pipeops_imputesample`, `mlr_pipeops_kernelpca`, `mlr_pipeops_learner`, `mlr_pipeops_missind`, `mlr_pipeops_modelmatrix`, `mlr_pipeops_multiplicityexply`, `mlr_pipeops_multiplicityimply`, `mlr_pipeops_mutate`, `mlr_pipeops_nmf`, `mlr_pipeops_nop`, `mlr_pipeops_ovrsplit`, `mlr_pipeops_ovrunite`, `mlr_pipeops_pca`, `mlr_pipeops_proxy`, `mlr_pipeops_quantilebin`, `mlr_pipeops_randomprojection`, `mlr_pipeops_randomresponse`, `mlr_pipeops_regravg`, `mlr_pipeops_removeconstants`, `mlr_pipeops_renamecolumns`, `mlr_pipeops_replicate`, `mlr_pipeops_scalemaxabs`, `mlr_pipeops_scalerange`, `mlr_pipeops_scale`, `mlr_pipeops_select`, `mlr_pipeops_smote`, `mlr_pipeops_subsample`, `mlr_pipeops_targetinvert`, `mlr_pipeops_targetmutate`, `mlr_pipeops_targettrafoscalerange`, `mlr_pipeops_textvectorizer`, `mlr_pipeops_threshold`, `mlr_pipeops_tunethreshold`, `mlr_pipeops_unbranch`, `mlr_pipeops_updatetarget`, `mlr_pipeops_vtreat`, `mlr_pipeops_yeojohnson`, `mlr_pipeops`

Example output: the iris task data before and after spatial sign normalization (with the default `norm = 2` and `length = 1`, each row of numeric features is scaled to unit Euclidean length):

```
#>        Species Petal.Length Petal.Width Sepal.Length Sepal.Width
#>   1:    setosa          1.4         0.2          5.1         3.5
#>   2:    setosa          1.4         0.2          4.9         3.0
#>   3:    setosa          1.3         0.2          4.7         3.2
#>   4:    setosa          1.5         0.2          4.6         3.1
#>   5:    setosa          1.4         0.2          5.0         3.6
#>  ---
#> 146: virginica          5.2         2.3          6.7         3.0
#> 147: virginica          5.0         1.9          6.3         2.5
#> 148: virginica          5.2         2.0          6.5         3.0
#> 149: virginica          5.4         2.3          6.2         3.4
#> 150: virginica          5.1         1.8          5.9         3.0
```

```
#>        Species Petal.Length Petal.Width Sepal.Length Sepal.Width
#>   1:    setosa    0.2206435  0.03152050    0.8037728   0.5516088
#>   2:    setosa    0.2366094  0.03380134    0.8281329   0.5070201
#>   3:    setosa    0.2227517  0.03426949    0.8053331   0.5483119
#>   4:    setosa    0.2608794  0.03478392    0.8000302   0.5391508
#>   5:    setosa    0.2214702  0.03163860    0.7909650   0.5694948
#>  ---
#> 146: virginica    0.5600146  0.24769876    0.7215572   0.3230853
#> 147: virginica    0.5790902  0.22005426    0.7296536   0.2895451
#> 148: virginica    0.5732312  0.22047353    0.7165390   0.3307103
#> 149: virginica    0.5876164  0.25028107    0.6746707   0.3699807
#> 150: virginica    0.5966647  0.21058754    0.6902592   0.3509792
```
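The printed values can be spot-checked by hand: dividing the first iris row by its Euclidean norm reproduces row 1 of the normalized output. A quick check, written in Python purely for illustration:

```python
import math

# first row of the iris task, in the column order printed above
# (Petal.Length, Petal.Width, Sepal.Length, Sepal.Width)
row = [1.4, 0.2, 5.1, 3.5]

scale = math.sqrt(sum(v * v for v in row))  # Euclidean (norm = 2) length of the row
normalized = [v / scale for v in row]

# matches row 1 of the normalized output:
# 0.2206435 0.03152050 0.8037728 0.5516088
print(normalized)
```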