【Question Title】: line 880, in load_eof raise EOFError — EOFError when trying to load using dill
【Posted】: 2016-08-31 23:46:01
【Question Description】:

I'm fairly new to Python and machine learning. I have been successfully creating neural networks with the neupy library. Now that I have a decent network, I would like to save it. The documentation shows how to do this with the dill library. The network appears to write to the file correctly, but the file will not load back for use. The code is duplicated because I intend to split it into separate scripts once it works correctly. I copied the code exactly as given (http://neupy.com/docs/storage.html).

My code is:

import dill
import csv
import numpy as np
from sklearn import datasets, preprocessing
from sklearn.cross_validation import train_test_split
from neupy import algorithms, layers
from neupy.functions import rmsle

np.random.seed(0)

#variables
EPOCHS = 200
HIDDENLAYER = 17
miss = 0.1
hit = 0.2
TRAIN = 0.7
ROUND = 2
STEP = 0.003
TOL = 0.02
with open('binary_conversion_dataset_input_2.csv','r') as dest1_f:
    data_iter = csv.reader(dest1_f,
                           delimiter = ',',
                           quotechar = '"')
    data = [data for data in data_iter]
data_array1 = np.asarray(data, dtype = float)
hitmiss_in = data_array1    #loads entire dataset from excel csv file

with open('binary_conversion_dataset_target_2.csv','r') as dest2_f:
    data_iter = csv.reader(dest2_f,
                           delimiter = ',',
                           quotechar = '"')
    data = [data for data in data_iter]
data_array2 = np.asarray(data, dtype = float)
hitmiss_target = data_array2    #loads entire dataset from excel csv file



hitmiss_input = hitmiss_in[:,:]   

hitmiss_target = hitmiss_target[:,:]   


hitmiss_predict = [0.53, 0.80, 0.40, 0.20, 0.07]

#####break target set into single numbers
hitmiss_target1a = hitmiss_target[:,0]
hitmiss_target1b = hitmiss_target[:,1]
hitmiss_target1c = hitmiss_target[:,2]
hitmiss_target1d = hitmiss_target[:,3]
hitmiss_target1e = hitmiss_target[:,4]
##hitmiss_target1f = hitmiss_target[:,5]
##hitmiss_target1g = hitmiss_target[:,6]
##hitmiss_target1h = hitmiss_target[:,7]
##hitmiss_target1i = hitmiss_target[:,8]
##hitmiss_target1j = hitmiss_target[:,9]
##hitmiss_target1k = hitmiss_target[:,10]
##hitmiss_target1l = hitmiss_target[:,11]
##hitmiss_target1m = hitmiss_target[:,12]
##hitmiss_target1n = hitmiss_target[:,13]
##hitmiss_target1o = hitmiss_target[:,14]
##hitmiss_target1p = hitmiss_target[:,15]
##hitmiss_target1q = hitmiss_target[:,16]
##hitmiss_target1r = hitmiss_target[:,17]
##hitmiss_target1s = hitmiss_target[:,18]
##hitmiss_target1t = hitmiss_target[:,19]

################################################Neural Network for hit miss

x_train, x_test, y_train, y_test = train_test_split(
   hitmiss_input, hitmiss_target1a, train_size=TRAIN
   )

cgnet = algorithms.ConjugateGradient(
   connection=[
       layers.TanhLayer(5),
       layers.TanhLayer(HIDDENLAYER),
       layers.OutputLayer(1),
   ],
   search_method='golden',
   tol = TOL, step = STEP,
   show_epoch=25,
   optimizations=[algorithms.LinearSearch],
)

cgnet.train(x_train, y_train, x_test, y_test, epochs=EPOCHS)

hitmiss_final_A = cgnet.predict(hitmiss_predict).round(ROUND)

with open('network-storage.dill', 'w') as net:
    dill.dumps(net, dill.HIGHEST_PROTOCOL)

#p = pickle.dumps(g, pickle.HIGHEST_PROTOCOL)
print hitmiss_final_A


import dill
import csv
import numpy as np
from sklearn import datasets, preprocessing
from sklearn.cross_validation import train_test_split
from neupy import algorithms, layers
from neupy.functions import rmsle

np.random.seed(0)

#variables
EPOCHS = 2000
HIDDENLAYER = 17
miss = 0.1
hit = 0.2
TRAIN = 0.7
ROUND = 2
STEP = 0.003
TOL = 0.02
with open('binary_conversion_dataset_input_2.csv','r') as dest1_f:
    data_iter = csv.reader(dest1_f,
                           delimiter = ',',
                           quotechar = '"')
    data = [data for data in data_iter]
data_array1 = np.asarray(data, dtype = float)
hitmiss_in = data_array1    #loads entire dataset from excel csv file

with open('binary_conversion_dataset_target_2.csv','r') as dest2_f:
    data_iter = csv.reader(dest2_f,
                           delimiter = ',',
                           quotechar = '"')
    data = [data for data in data_iter]
data_array2 = np.asarray(data, dtype = float)
hitmiss_target = data_array2    #loads entire dataset from excel csv file




hitmiss_input = hitmiss_in[:,:]    

hitmiss_target = hitmiss_target[:,:]    


hitmiss_predict = [0.53, 0.80, 0.40, 0.20, 0.07]

#####break target set into single numbers
hitmiss_target1a = hitmiss_target[:,0]
hitmiss_target1b = hitmiss_target[:,1]
hitmiss_target1c = hitmiss_target[:,2]
hitmiss_target1d = hitmiss_target[:,3]
hitmiss_target1e = hitmiss_target[:,4]


###Neural Network

x_train, x_test, y_train, y_test = train_test_split(
   hitmiss_input, hitmiss_target1a, train_size=TRAIN
   )

with open('network-storage.dill', 'r') as f:
    cgnet = dill.load(f)



hitmiss_final_A = cgnet.predict(hitmiss_predict).round(ROUND)

print hitmiss_final_A

The error produced is:

Traceback (most recent call last):
  File "C:\Python27\save network script.py", line 171, in <module>
    cgnet = dill.load(f)
  File "C:\Python27\lib\site-packages\dill\dill.py", line 128, in load
    obj = pik.load()
  File "C:\Python27\lib\pickle.py", line 858, in load
    dispatch[key](self)
  File "C:\Python27\lib\pickle.py", line 880, in load_eof
    raise EOFError
EOFError

Could it be that the variable naming I chose causes it to loop multiple times and cause the problem? Or perhaps there is too much to store?

【Comments】:

  • I think this means there is a problem with the network-storage.dill file. dill.load() is reaching the end of the file prematurely.

Tags: python numpy pickle dill neupy


【Solution 1】:

Your dump line should look like this:

dill.dump(obj, file)

or

file.write(dill.dumps(...))

dumps returns a string and does not write to the file by itself. The file stays empty, so you immediately get an EOF (end-of-file) error when reading it back. (Note also that your code passes the file handle net to dill.dumps as the object to serialize; the object you want to save is the trained network, cgnet.)
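A minimal sketch of the corrected save/load round trip. A plain dict stands in for the trained cgnet network so the sketch runs standalone, and the temp-file path is hypothetical; if dill is unavailable, the stdlib pickle is used instead, since both expose the same dump/load API:

```python
import os
import tempfile

# dill is what the question uses; fall back to the stdlib pickle,
# which shares the same dump/load interface.
try:
    import dill as serializer
except ImportError:
    import pickle as serializer

# Stand-in for the trained network (cgnet in the question); any
# picklable object is saved and restored the same way.
cgnet = {'weights': [0.53, 0.80, 0.40, 0.20, 0.07]}

path = os.path.join(tempfile.gettempdir(), 'network-storage.dill')

# Save: dump (no trailing 's') writes straight to the file object.
# Open in binary mode ('wb') -- pickled data is bytes, and text mode
# can corrupt it, particularly on Windows.
with open(path, 'wb') as f:
    serializer.dump(cgnet, f, serializer.HIGHEST_PROTOCOL)

# Load: binary mode again ('rb').
with open(path, 'rb') as f:
    restored = serializer.load(f)

assert restored == cgnet
```

Opening the file in binary mode also matters on the asker's Python 2.7 / Windows setup, where text mode translates line endings and can mangle the pickled bytes.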

【Discussion】:

  • The first is usually the better approach, but I agree.
  • I made the suggested change and it now produces (several repeated) errors: File "C:\Python27\lib\pickle.py", line 286, in save f(self, obj) # Call unbound method with explicit self; File "C:\Python27\lib\site-packages\dill\dill.py", line 418, in save_function obj.func_closure), obj=obj); File "C:\Python27\lib\pickle.py", line 405, in save_reduce self.memoize(obj); File "C:\Python27\lib\pickle.py", line 244, in memoize assert id(obj) not in self.memo — AssertionError
  • It's hard to tell what is going on from the way the error is presented in your comment above, and now that you have edited your code we can't see the edit. Could you update your question with a new section, or ask a new question that references this one?