[Question Title]: Tuning a Neural Network made in Python with NumPy
[Posted]: 2020-04-05 20:19:45
[Question]:

I wrote the code for a neural network that uses the sigmoid function, in Python with NumPy. The code runs fine, but now I want to tune it to improve accuracy. How do I tune my neural network? Do I need to add some parameters, or add a hidden layer? Is that possible?

Here is my code:

import itertools  # needed for itertools.chain below

import numpy as np
import pandas as pd

df = pd.DataFrame({'input 1':[0.5, 0.3, 0, 0.1, 0.4, -0.4, 0.4, 0.1, -0.6, 0.2, 0.6, 0, 0.2, 0.2, -0.1, -0.1, 0, 0.4, -0.2, -0.4],
                   'input 2':[0.3, 0.6, -0.4, -0.2, 0.9, 0, 0.35, -0.4, -0.9, 0.4, 0.3, -0.1, 0.1, 0.3, 0.1, 0.1, 0.3, 0.1, 0.3, 0.3],
                   'input 3':[0, 0.4, 0, -0.1, 0.4, -0.2, 0.7, -0.3, -0.1, 0.1, 0.3, 0, 0.5, 0.4, -0.31, 0.1, 0.3, 0.1, 0.1, 0.2],
                   'result':[1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0]})

print(df)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivate(x):
    return x * (1 - x)


features = df.iloc[:,:-1].to_numpy()
results =  df.iloc[:,-1:].to_numpy()

np.random.seed(1)

weights = 2 * np.random.random((3,1)) - 1

print('These are my random weights:\n')
print(weights)

for iteration in range(100000):

    input_layer = features

    outputs = sigmoid(np.dot(input_layer, weights))

    error = results - outputs

    adjustments = error * sigmoid_derivate(outputs)
    weights += np.dot(input_layer.T, adjustments)

outputs = outputs.round(0).tolist()
outputs = list(itertools.chain(*outputs))  # flatten the (20, 1) predictions into a flat list

df['output prediction'] = outputs
print(df)

df1 = df  # compare every row's label against its prediction

acc = 0
for i, j in zip(df1['result'] ,df1['output prediction']):

    if i == j:

        acc += 1

accuracy = round(acc * 100 /len(df1), 2)
print(accuracy)

I think I should add it below the part where I define the weights, but I'm not sure.

Thanks for your help!

[Question Comments]:

    Tags: python numpy neural-network


    [Solution 1]:
    import numpy as np
    import pandas as pd
    
    df = pd.DataFrame({'input 1':[0.5, 0.3, 0, 0.1, 0.4, -0.4, 0.4, 0.1, -0.6, 0.2, 0.6, 0, 0.2, 0.2, -0.1, -0.1, 0, 0.4, -0.2, -0.4],
                       'input 2':[0.3, 0.6, -0.4, -0.2, 0.9, 0, 0.35, -0.4, -0.9, 0.4, 0.3, -0.1, 0.1, 0.3, 0.1, 0.1, 0.3, 0.1, 0.3, 0.3],
                       'input 3':[0, 0.4, 0, -0.1, 0.4, -0.2, 0.7, -0.3, -0.1, 0.1, 0.3, 0, 0.5, 0.4, -0.31, 0.1, 0.3, 0.1, 0.1, 0.2],
                       'result':[1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0]})
    
    print(df)
    
    def sigmoid(x):
        return 1 / (1 + np.exp(-x))
    
    def sigmoid_derivate(x):
        return x * (1 - x)
    
    alpha = 0.1  # learning rate
    features = df.iloc[:, :-1].to_numpy()
    results = df.iloc[:, -1:].to_numpy()
    
    np.random.seed(1)
    
    
    weight0 = 2 * np.random.random((3, 10)) - 1  # 3 features -> 10 nodes in the first hidden layer
    weight1 = 2 * np.random.random((10, 4)) - 1  # 10 nodes -> 4 nodes in the second hidden layer
    weight2 = 2 * np.random.random((4, 1)) - 1   # 4 nodes -> 1 output node (one label)
    # You can change the layer sizes, but adjacent shapes must stay compatible
    # for the dot products, e.g. (320, 160) followed by (160, 40).
    for iteration in range(1000):
    
        l0 = features
        l1 = sigmoid(np.dot(l0,weight0)) 
        l2 = sigmoid(np.dot(l1,weight1))
        l3 = sigmoid(np.dot(l2,weight2))
    
        l3_error = results - l3
        if iteration % 100 == 0:  # print progress occasionally instead of every step
            print("Error after " + str(iteration) + " iterations: " + str(np.mean(np.abs(l3_error))))
        l3_delta = l3_error*sigmoid_derivate(l3)
        l2_error = l3_delta.dot(weight2.T)
        l2_delta = l2_error * sigmoid_derivate(l2)
        l1_error = l2_delta.dot(weight1.T)
        l1_delta = l1_error * sigmoid_derivate(l1)
        weight2 += alpha*l2.T.dot(l3_delta)
        weight1 += alpha*l1.T.dot(l2_delta)
        weight0 += alpha*l0.T.dot(l1_delta)
    

    This is your code with 1 input layer, 2 hidden layers (10 and 4 nodes), and 1 output layer.
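    To check whether the extra layers actually help, the trained deeper network can be scored the same way the question scores the single-layer model. Below is a minimal self-contained sketch of that idea; the synthetic data stands in for the question's DataFrame, and the threshold-at-0.5 step is an assumption, not something stated in the answer:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Synthetic stand-in for the question's 20-row, 3-feature DataFrame.
rng = np.random.default_rng(1)
features = rng.uniform(-1, 1, size=(20, 3))
results = (features.sum(axis=1) > 0).astype(float).reshape(-1, 1)

weight0 = 2 * rng.random((3, 10)) - 1
weight1 = 2 * rng.random((10, 4)) - 1
weight2 = 2 * rng.random((4, 1)) - 1

alpha = 0.1
for _ in range(1000):
    l1 = sigmoid(features @ weight0)
    l2 = sigmoid(l1 @ weight1)
    l3 = sigmoid(l2 @ weight2)
    # Backprop exactly as in the answer, using sigmoid'(y) = y * (1 - y).
    l3_delta = (results - l3) * l3 * (1 - l3)
    l2_delta = l3_delta @ weight2.T * l2 * (1 - l2)
    l1_delta = l2_delta @ weight1.T * l1 * (1 - l1)
    weight2 += alpha * l2.T @ l3_delta
    weight1 += alpha * l1.T @ l2_delta
    weight0 += alpha * features.T @ l1_delta

# Threshold the final activations at 0.5 and score against the labels,
# mirroring the round-and-compare accuracy check in the question.
predictions = (l3 > 0.5).astype(int).ravel()
accuracy = (predictions == results.ravel()).mean() * 100
print(f"training accuracy: {accuracy:.1f}%")
```

    Note that this is training accuracy on the same 20 rows; with so little data, a held-out split would give a more honest picture.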

    [Comments]:

    • Thank you, my friend! One more question: is there a way to determine the number of nodes ((3,10), (10,4), (4,1)) and the number of layers?
    • Sure. To change the number of layers you have to add weights (weight3, weight4, etc.), layers (l4, l5, etc.), deltas (l4_delta, l5_delta, etc.), and errors (l4_error, l5_error, etc.), and finally update the new weights (weight3 += alpha*l3.T.dot(l4_delta), and so on). To change the number of nodes, look at the code; I have already edited it.
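    The layer-adding recipe in the comment above can also be written once, generically, so that adding a layer means adding one number to a list instead of hand-writing new weightN / lN_delta variables. A hedged sketch of that generalization (the names `sizes` and `train` are illustrative, not from the original answer):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def train(features, results, sizes, alpha=0.1, iterations=1000, seed=1):
    rng = np.random.default_rng(seed)
    # One weight matrix per consecutive pair of layer sizes; adjacent shapes
    # line up automatically, e.g. sizes=[3, 10, 4, 1] gives (3,10), (10,4), (4,1).
    weights = [2 * rng.random((a, b)) - 1 for a, b in zip(sizes, sizes[1:])]
    for _ in range(iterations):
        # Forward pass: keep every layer's activations for backprop.
        layers = [features]
        for w in weights:
            layers.append(sigmoid(layers[-1] @ w))
        # Backward pass: output-layer delta, then propagate toward the input.
        delta = (results - layers[-1]) * layers[-1] * (1 - layers[-1])
        for i in range(len(weights) - 1, -1, -1):
            grad = layers[i].T @ delta
            if i > 0:
                # Propagate the delta through the (pre-update) weights,
                # matching the update order in the answer's code.
                delta = delta @ weights[i].T * layers[i] * (1 - layers[i])
            weights[i] += alpha * grad
    return weights, layers[-1]

# Example: same shape as the answer's network, on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 3))
y = (X.sum(axis=1) > 0).astype(float).reshape(-1, 1)
weights, out = train(X, y, sizes=[3, 10, 4, 1])
print(out.shape)  # (20, 1): one prediction per row
```

    With this shape, trying a deeper network is just `sizes=[3, 10, 8, 4, 1]`, which makes it much easier to experiment with the architecture questions asked above.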