[Question Title]: Tuning size parameter for neural network
[Posted]: 2018-05-22 05:24:36
[Question]:

I want to fit a neural network model using the caret package. There are 208 predictors, all of which are important and cannot be dropped. The maximum value I can give the size parameter is 4; beyond that I get an error saying there are too many weights.

> ctrl <- trainControl(method = 'cv', number = 5)
> my.grid <- expand.grid(.decay = 0.1, .size = 5)
> nn.fit <- train(train_predictors, train_responses[["r2c1"]], method = "nnet",
+                 algorithm = 'backprop', tuneGrid = my.grid, trace = F,
+                 linout = TRUE, trControl = ctrl)
Something is wrong; all the RMSE metric values are missing:
      RMSE        Rsquared        MAE     
 Min.   : NA   Min.   : NA   Min.   : NA  
 1st Qu.: NA   1st Qu.: NA   1st Qu.: NA  
 Median : NA   Median : NA   Median : NA  
 Mean   :NaN   Mean   :NaN   Mean   :NaN  
 3rd Qu.: NA   3rd Qu.: NA   3rd Qu.: NA  
 Max.   : NA   Max.   : NA   Max.   : NA  
 NA's   :1     NA's   :1     NA's   :1    
Error: Stopping
In addition: Warning messages:
1: model fit failed for Fold1: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

2: model fit failed for Fold2: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

3: model fit failed for Fold3: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

4: model fit failed for Fold4: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

5: model fit failed for Fold5: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

6: In nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo,  :
  There were missing values in resampled performance measures.

The model performs very poorly with 4 neurons (size = 4). What can I do to make the model work if I want more than 5 neurons?

[Question Discussion]:

Tags: r neural-network r-caret nnet


[Solution 1]:

You can always pass additional arguments through to the underlying fitting function (in this case nnet) via the ... optional arguments of caret's train method. The CRAN documentation for the nnet package describes the MaxNWts parameter, which controls the maximum allowable number of weights (it defaults to 1000).
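As a sketch of how this applies here: a single-hidden-layer nnet with p predictors, h hidden units and one output has (p + 1) * h + (h + 1) weights, which for the question's 208 predictors and size = 5 comes to 1051, just over the default cap of 1000 (matching the "too many (1051) weights" error). The train call below uses a synthetic stand-in for the question's train_predictors / train_responses so it is self-contained; MaxNWts = 2000 is an illustrative value, not a tuned one.

```r
# Weight count for a single-hidden-layer nnet with one output unit:
# (p + 1) * h input-to-hidden weights plus (h + 1) hidden-to-output weights.
p <- 208; h <- 5
n_weights <- (p + 1) * h + (h + 1)
n_weights  # 1051, which exceeds nnet's default MaxNWts of 1000

# Raising the cap via caret's `...` pass-through (synthetic data as a
# stand-in for the question's train_predictors / train_responses):
if (requireNamespace("caret", quietly = TRUE)) {
  library(caret)
  set.seed(1)
  X <- as.data.frame(matrix(rnorm(100 * p), ncol = p))
  y <- rnorm(100)
  fit <- train(X, y, method = "nnet",
               tuneGrid = expand.grid(.decay = 0.1, .size = h),
               trace = FALSE, linout = TRUE,
               MaxNWts = 2000,  # forwarded to nnet(), lifts the weight limit
               trControl = trainControl(method = "cv", number = 5))
}
```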

[Discussion]:

[Solution 2]:

You can specify additional parameters in the tuning grid for each method. The available parameters for every method are documented online, though they can be hard to find. Here is an example where I use mxnet for an Adam-optimized neural network:

mxnet_grid_A2 = expand.grid(layer1 = c(10, 12),
                            layer2 = c(4, 6),
                            layer3 = 2,
                            learningrate = c(0.001, 0.0001),
                            dropout = c(0, 0.2),
                            beta1 = 0.9,
                            beta2 = 0.999,
                            activation = 'relu')
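For reference, expand.grid enumerates every combination of the values supplied, so the grid above yields 2 × 2 × 1 × 2 × 2 = 16 candidate models, each fitted across the resamples. A quick sanity check (assuming the corrected grid with the comma after the dropout line):

```r
# Rebuild the grid and confirm how many parameter combinations caret will try.
mxnet_grid_A2 <- expand.grid(layer1 = c(10, 12),
                             layer2 = c(4, 6),
                             layer3 = 2,
                             learningrate = c(0.001, 0.0001),
                             dropout = c(0, 0.2),
                             beta1 = 0.9,
                             beta2 = 0.999,
                             activation = 'relu')
nrow(mxnet_grid_A2)  # 16 candidate models
```

The grid is then supplied to train via tuneGrid; note that a grid this size multiplies the cost of cross-validation by 16.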
      

[Discussion]:
