【Question title】: Implementing and plotting a perceptron in MATLAB
【Posted】: 2011-02-03 03:24:04
【Question description】:

I am reviewing the code from the Toronto perceptron MATLAB code page.

The code is:

function [w] = perceptron(X,Y,w_init)

w = w_init;
for iteration = 1 : 100  %<- in practice, use some stopping criterion!
  for ii = 1 : size(X,2)         %cycle through training set
    if sign(w'*X(:,ii)) ~= Y(ii) %wrong decision?
      w = w + X(:,ii) * Y(ii);   %then add (or subtract) this point to w
    end
  end
  sum(sign(w'*X)~=Y)/size(X,2)   %show misclassification rate
end

So I have been reading about how to apply this function to a data matrix X and targets Y; however, I don't know how to use the function. I understand that it returns a weight vector, which can then be used to classify.

Could you give an example and explain it?

I tried:

X=[0 0; 0 1; 1 1]
Y=[1 0; 2 1]
w=[1 1 1]
Result = perceptron( X, Y, w )

??? Error using ==> mtimes
Inner matrix dimensions must agree.

Error in ==> perceptron at 15
            if sign(w'*X(:,ii)) ~= Y(ii) 

    Result = perceptron( X, Y, w' )

??? Error using ==> ne
Matrix dimensions must agree.

Error in ==> perceptron at 19
        sum(sign(w'*X)~=Y) / size(X,2);     

Thanks.

Thank you for your answer. I have one more: what happens to the algorithm if I change Y to [0, 1]?

So no input data will work with Y = [0, 1] with this perceptron code, right?

-------------- EDIT --------------

One more question: if I want to plot the line that divides the 2 classes, I know we can get that line from the weights by solving a linear system, but how? What can I do? I am trying something like:

% the initial weights
w_init = [ 1 1 1]';  
% the weights returned from perceptron    
wtag   = perceptron(X,Y,w_init,15);

% concatenate both
Line = [wtag,w_init] 

% solve the linear system, am I correct doing this?
rref(Line')

% plot???

【Question discussion】:

    Tags: matlab artificial-intelligence


    【Solution 1】:

    You should first understand the meaning of each input:

    • X is the input matrix of samples, of size M x N, where M is the dimension of the feature vectors and N is the number of samples. Since the perceptron's prediction model is Y = w*X + b, you must supply one extra dimension in X that is constant, usually set to 1, so that the b term is "built in" to X. In the example X below, the last entry of X is set to 1 for all samples.
    • Y is the correct classification of each sample in X (the classification you want the perceptron to learn), so it should be an N-dimensional row vector - one output per input example. Since the perceptron is a binary classifier, it should have only 2 distinct possible values. Looking at the code, you can see that it checks the sign of the prediction, which tells you that the allowed values of Y are -1 and +1 (not 0 and 1, for example).
    • w is the weight vector you are trying to learn.
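The bias trick described above can be sketched as follows (a minimal sketch; `Xraw` is a hypothetical name for the raw 2-D samples, stored one per column):

```matlab
% Hypothetical raw data: three 2-D samples stored as columns of Xraw.
Xraw = [0 0 1; 0 1 1];                 % 2 x 3 matrix of raw features
X    = [Xraw; ones(1, size(Xraw,2))];  % 3 x 3: constant-1 row absorbs b
```

After this, the last entry of the learned w plays the role of the bias b.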

    So, try calling the function with:

    X=[0 0; 0 1; 1 1];
    Y=[1 -1];
    w=[.5; .5; .5];
    

    EDIT

    Call the perceptron algorithm with the following code and view the result graphically:

    % input samples
    X1=[rand(1,100);rand(1,100);ones(1,100)];   % class '-1'
    X2=[rand(1,100);1+rand(1,100);ones(1,100)]; % class '+1'
    X=[X1,X2];
    
    % output class [-1,+1];
    Y=[-ones(1,100),ones(1,100)];
    
    % init weight vector
    w=[.5 .5 .5]';
    
    % call perceptron
    wtag=perceptron(X,Y,w);
    % predict
    ytag=wtag'*X;
    
    
    % plot prediction over original data
    figure;hold on
    plot(X1(1,:),X1(2,:),'b.')
    plot(X2(1,:),X2(2,:),'r.')
    
    plot(X(1,ytag<0),X(2,ytag<0),'bo')
    plot(X(1,ytag>0),X(2,ytag>0),'ro')
    legend('class -1','class +1','pred -1','pred +1')
    
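To draw the separating line itself (the follow-up question in the post), rref is not needed: the boundary is the set of points where wtag(1)*x + wtag(2)*y + wtag(3) = 0, so you can solve for y directly. A minimal self-contained sketch (the weight values here are hypothetical stand-ins for the wtag returned by perceptron above):

```matlab
wtag = [1; 1; -1.5];                   % hypothetical weights; use the
                                       % vector returned by perceptron
xb = [0 2];                            % x-range covering the data
% boundary: wtag(1)*x + wtag(2)*y + wtag(3) = 0  (assumes wtag(2) ~= 0)
yb = -(wtag(1)*xb + wtag(3)) / wtag(2);
figure; hold on
plot(xb, yb, 'k-', 'linewidth', 2)     % the separating line
```

Adding these lines after the prediction plot overlays the learned boundary on the scatter of the two classes.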

    【Discussion】:

    • Thank you very much, I really understand your example, but one more question: what would you do if there were more examples of class 1 than of class 0? In the example you gave, both classes X1 and X2 have the same number of examples.
    • Is this correct? I can't test it right now: X1=[rand(1,100);rand(1,100);ones(1,100)]; % class '+1' X2=[rand(1,300);1+rand(1,300);ones(1,300)]; % class '-1' X=[X1,X2]; % output class [-1,+1]; Y=[-ones(1,100),ones(1,300)]; % initial weight vector w=[.5 .5 .5]'; wtag=perceptron(X,Y,w);
    • The number of examples does not matter; I just chose the same number for each class for convenience. Nothing needs to change. The size of w should be the dimension of the samples (including the constant term), because every prediction is based on the value of the dot product w*x - so w and x should have the same size.
    • Yes, that seems correct. Also, this is just a synthetic example. In a real example your data are mixed and you don't care about the size of each class - just make sure there are enough examples in each class to cover the sample space.
    • You're right: class balance should not matter. The perceptron has issues with linear separability and stability. Class prior probabilities do matter in the computation of other linear tools, e.g. linear discriminant analysis.
    【Solution 2】:

    If you're interested, here is a fairly tutorial-style demonstration of the perceptron:

    function perceptronDemo
    %PERCEPTRONDEMO
    %
    %   A simple demonstration of the perceptron algorithm for training
    %   a linear classifier, made as readable as possible for tutorial
    %   purposes. It is derived from the treatment of linear learning
    %   machines presented in Chapter 2 of "An Introduction to Support
    %   Vector Machines" by Nello Cristianini and John Shawe-Taylor.
    %
    %
    
        Data  = createTrainingData;
        Model = trainPerceptron( Data );
    
    end
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    function Model = trainPerceptron( Data )
    %TRAINPERCEPTRON
    
        DOWN   = 1;
        ACROSS = 2;
    
        assert( isequal( unique( Data.labels ), [-1; +1] ), ...
            'Labels must be -1 or +1' );
    
        % ---------------------------------------------------------------------
        % Normalise the data by calculating z-scores
        %
        %   This makes plotting easier, but is not needed by the algorithm.
        %
    
        sampleMean   = mean( Data.samples );
        sampleStdDev = std(  Data.samples );
        Data.samples = bsxfun( @minus,   Data.samples, sampleMean   );
        Data.samples = bsxfun( @rdivide, Data.samples, sampleStdDev );
    
        % ---------------------------------------------------------------------
        % Calculate the squared radius of the smallest ball that encloses the
        % data and is centred on the origin. This is used to provide an
        % appropriate range and step size when updating the threshold (bias)
        % parameter.
        %
    
        sampleSize = size( Data.samples, DOWN );
        maxNorm    = realmin;
        for iObservation = 1:sampleSize
            observationNorm = norm( Data.samples(iObservation,:) );
            if observationNorm > maxNorm
                maxNorm = observationNorm;
            end
        end
        enclosingBallRadius        = maxNorm;
        enclosingBallRadiusSquared = enclosingBallRadius .^ 2;
    
        % ---------------------------------------------------------------------
        % Define the starting weight vector and bias. These should be zeros,
        % as the algorithm omits a learning rate, and it is suggested in
        % Cristianini & Shawe-Taylor that learning rate may only be omitted
        % safely when the starting weight vector and bias are zero.
        %
    
        Model.weights = [0.0 0.0];
        Model.bias    = 0.0;
    
        % ---------------------------------------------------------------------
        % Run the perceptron training algorithm
        %
        %   To prevent program running forever when nonseparable data are
        %   provided, limit the number of steps in the outer loop.
        %
    
        maxNumSteps = 1000;
    
        for iStep = 1:maxNumSteps
    
            isAnyObsMisclassified = false;
    
            for iObservation = 1:sampleSize
    
                inputObservation = Data.samples( iObservation, : );
                desiredLabel     = Data.labels(  iObservation    ); % +1 or -1
    
                perceptronOutput = sum( Model.weights .* inputObservation, ACROSS ) + Model.bias;
                margin           = desiredLabel * perceptronOutput;
    
                isCorrectLabel   = margin > 0;
    
                % -------------------------------------------------------------
                % If the model misclassifies the observation, update the
                % weights and the bias.
                %
    
                if ~isCorrectLabel
    
                    isAnyObsMisclassified = true;
    
                    weightCorrection = desiredLabel  * inputObservation;
                    Model.weights    = Model.weights + weightCorrection;
    
                    biasCorrection   = desiredLabel .* enclosingBallRadiusSquared;
                    Model.bias       = Model.bias   + biasCorrection;
    
                    displayPerceptronState( Data, Model );
    
                end % if this observation misclassified.
    
            end % loop over observations
    
            if ~isAnyObsMisclassified
                disp( 'Done!' );
                break;
            end
    
        end % outer loop
    
    end % TRAINPERCEPTRON
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    function Data = createTrainingData
    %CREATETRAININGDATA
    %
    %   Return a structure containing training data suitable for linear
    %   classification.
    %
    
        sampleAsize   = 1024;
        sampleBsize   = 1024;
    
        sampleAmean   = [ 5.5 5.0 ];
        sampleAstdDev = [ 0.5 1.0 ];
    
        sampleBmean   = [ 2.5 3.0 ];
        sampleBstdDev = [ 0.3 0.7 ];
    
        Data.samples  = [ normallyDistributedSample( sampleAsize, sampleAmean, sampleAstdDev ); ...
                          normallyDistributedSample( sampleBsize, sampleBmean, sampleBstdDev ) ];
    
        Data.labels   = [  ones(sampleAsize,1); ...
                          -ones(sampleBsize,1) ];
    
        % ---------------------------------------------------------------------
        % Randomly permute samples & class labels.
        %
        %   This is not really necessary, but done to illustrate that the order
        %   in which observations are evaluated does not matter.
        %
    
        randomOrder   = randperm( sampleAsize + sampleBsize );
        Data.samples  = Data.samples( randomOrder, : );
        Data.labels   = Data.labels(  randomOrder, : );
    
    end % CREATETRAININGDATA
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    function samples = normallyDistributedSample( sampleSize, sampleMean, sampleStdDev )
    %NORMALLYDISTRIBUTEDSAMPLE
    %
    %   Draw a sample from a normal distribution with specified mean and
    %   standard deviation.
    %
    
        assert(    isequal( size( sampleMean ), size( sampleStdDev ) ) ...
                && 1 == size( sampleMean, 1 ),                         ...
            'Sample mean and standard deviation must be row vectors of equal length.' );
    
        numFeatures = numel( sampleMean );
        samples     = randn( sampleSize, numFeatures );
        samples     = bsxfun( @times, samples, sampleStdDev );
        samples     = bsxfun( @plus,  samples, sampleMean   );
    
    end % NORMALLYDISTRIBUTEDSAMPLE
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    function displayPerceptronState( Data, Model )
    %DISPLAYPERCEPTRONSTATE
    
        hFig = figure( 1 );
        clf;
        set( hFig,                        ...
            'NumberTitle', 'off',         ...
            'Name',         mfilename,    ...
            'MenuBar',      'none',       ...
            'Color',        [1.0 1.0 1.0] );
    
        displayXmin = -4;
        displayXmax =  4;
        displayYmin = -4;
        displayYmax =  4;
    
        hAx = subplot( 1, 1, 1 );
        axis('equal');
        set( hAx,                                  ...
            'Box',      'on',                      ...
            'NextPlot', 'add',                     ...
            'xgrid',    'on',                      ...
            'ygrid',    'on',                      ...
            'xlim',     [displayXmin displayXmax], ... % Bounds suitable for Z-scored data
            'ylim',     [displayYmin displayYmax]  );
        xlabel( 'x_1' );
        ylabel( 'x_2' );
    
        % ---------------------------------------------------------------------
        % Plot data points from the two classes
        %
    
        isPositiveClass = Data.labels >  0;
        isNegativeClass = Data.labels <= 0;
    
        plot( hAx, Data.samples(isPositiveClass,1), Data.samples(isPositiveClass,2), 'b+' );
        plot( hAx, Data.samples(isNegativeClass,1), Data.samples(isNegativeClass,2), 'rx' );
    
        % ---------------------------------------------------------------------
        % Display parameters for separating hyperplane in title
        %
    
        xWeight   = Model.weights(1);
        yWeight   = Model.weights(2);
        bias      = Model.bias;
    
        szTitle   = sprintf( 'Linear classifier parameters: %0.2f x_1 + %0.2f x_2 + %0.2f = 0', xWeight, yWeight, bias );
        title( szTitle );
    
        % ---------------------------------------------------------------------
        % Plot separating hyperplane
        %
    
        y1 = ( (xWeight*displayXmin) + bias ) ./ -yWeight;
        y2 = ( (xWeight*displayXmax) + bias ) ./ -yWeight;
    
        plot( hAx, [displayXmin; displayXmax], [y1, y2], 'k-', 'linewidth', 2 );
    
        pause(0.1);
    
    end % DISPLAYPERCEPTRONSTATE
    

    【Discussion】:

      【Solution 3】:

      Try this:

      perceptron([1 2 1 2], [1 0 1 0], 0.5);
      

      【Discussion】:

      • Your example does not work, because the algorithm assumes output values in [-1,+1], not [0,1]. The w vector simply never gets updated.
      • Also, the input should be at least 2-dimensional; otherwise you are implicitly assuming b = 0 in y = a*x + b.
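If your labels happen to be given as 0/1, a simple remapping brings them into the -1/+1 convention this implementation expects (a minimal sketch; `Y01` is a hypothetical name for the 0/1 label vector):

```matlab
Y01 = [1 0 1 0];   % labels in {0,1}
Y   = 2*Y01 - 1;   % remap to {-1,+1}: 0 -> -1, 1 -> +1
```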