1. Feedforward and cost function

The (unregularized) cost function for the neural network is:

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \left[ -y_k^{(i)} \log\big( (h_\theta(x^{(i)}))_k \big) - (1 - y_k^{(i)}) \log\big( 1 - (h_\theta(x^{(i)}))_k \big) \right]
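As a sketch of this computation (in NumPy rather than the exercise's Octave; the names `Theta1`, `Theta2`, and the one-hot label matrix `Y` are assumptions, not part of the original code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost(Theta1, Theta2, X, Y):
    """Unregularized cost for a 3-layer network.
    X: (m, n) inputs; Y: (m, K) one-hot labels.
    Theta1: (hidden, n+1); Theta2: (K, hidden+1)."""
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])   # add bias unit to the input layer
    a2 = sigmoid(a1 @ Theta1.T)            # hidden-layer activations
    a2 = np.hstack([np.ones((m, 1)), a2])  # add bias unit to the hidden layer
    h = sigmoid(a2 @ Theta2.T)             # h_theta(x), shape (m, K)
    # cross-entropy summed over classes, averaged over examples
    return (-Y * np.log(h) - (1 - Y) * np.log(1 - h)).sum() / m
```

With all-zero weights every output is sigmoid(0) = 0.5, so the cost reduces to K·log 2 regardless of the labels, which is a quick sanity check for an implementation.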

2. Regularized cost function

The cost function with regularization (the bias-term columns of \Theta^{(1)} and \Theta^{(2)} are not regularized):

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \left[ -y_k^{(i)} \log\big( (h_\theta(x^{(i)}))_k \big) - (1 - y_k^{(i)}) \log\big( 1 - (h_\theta(x^{(i)}))_k \big) \right] + \frac{\lambda}{2m} \left[ \sum_{j} \sum_{k} \big( \Theta_{j,k}^{(1)} \big)^2 + \sum_{j} \sum_{k} \big( \Theta_{j,k}^{(2)} \big)^2 \right]
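The penalty term can be sketched as follows (NumPy rather than Octave; `J_unreg` is the unregularized cost from the previous step, and the function name is hypothetical). Note that column 0 of each weight matrix — the bias parameters — is skipped:

```python
import numpy as np

def regularized_cost(J_unreg, Theta1, Theta2, lam, m):
    """Add the L2 penalty; bias columns (index 0) are excluded."""
    reg = np.sum(Theta1[:, 1:] ** 2) + np.sum(Theta2[:, 1:] ** 2)
    return J_unreg + lam / (2 * m) * reg
```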

3. Sigmoid gradient

The gradient for the sigmoid function can be computed as:

g'(z) = \frac{d}{dz} g(z) = g(z)\,(1 - g(z))

where:

g(z) = \frac{1}{1 + e^{-z}}
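A minimal sketch of the gradient (NumPy rather than Octave), which can be checked against a central finite difference of g itself:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    # g'(z) = g(z) * (1 - g(z)); works elementwise on arrays too
    g = sigmoid(z)
    return g * (1 - g)
```

At z = 0 the gradient takes its maximum value of exactly 0.25, a handy spot check.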

4. Random initialization

randInitializeWeights.m

function W = randInitializeWeights(L_in, L_out)
%RANDINITIALIZEWEIGHTS Randomly initialize the weights of a layer with L_in
%incoming connections and L_out outgoing connections
%   W = RANDINITIALIZEWEIGHTS(L_in, L_out) randomly initializes the weights
%   of a layer with L_in incoming connections and L_out outgoing
%   connections.
%
%   Note that W should be set to a matrix of size(L_out, 1 + L_in) as
%   the first column of W handles the "bias" terms
%

% You need to return the following variables correctly
W = zeros(L_out, 1 + L_in);

% ====================== YOUR CODE HERE ======================
% Instructions: Initialize W randomly so that we break the symmetry while
%               training the neural network.
%
% Note: The first column of W corresponds to the parameters for the bias units
%
epsilon_init = 0.12;
W = rand(L_out, 1 + L_in) * 2 * epsilon_init - epsilon_init;

% =========================================================================

end
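The same idea in NumPy (a sketch; the function name is hypothetical, and epsilon_init = 0.12 follows the Octave version above). Each weight is drawn uniformly from [-epsilon_init, epsilon_init], so no two hidden units start out identical and symmetry is broken:

```python
import numpy as np

def rand_initialize_weights(L_in, L_out, epsilon_init=0.12):
    """Uniform weights in [-epsilon_init, epsilon_init], covering
    the bias column as well: shape (L_out, 1 + L_in)."""
    return np.random.rand(L_out, 1 + L_in) * 2 * epsilon_init - epsilon_init
```

One common heuristic for choosing the range is to scale it with the layer sizes, e.g. epsilon_init proportional to 1/sqrt(L_in + L_out), rather than using a fixed constant.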
