【Posted at】:2017-05-16 16:38:56
【Problem description】:
I am using scikit-learn's SpectralClustering. It clusters an 8100 x 8100 affinity matrix fine, but the function throws an error on a 10000 x 10000 matrix.
Has anyone used this function on large matrices?
Edit: I get the following error message:
Not enough memory to perform factorization.
Traceback (most recent call last):
  File "combined_code_img.py", line 287, in <module>
    labels=spectral.fit_predict(Affinity)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/base.py", line 410, in fit_predict
    self.fit(X)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/cluster/spectral.py", line 463, in fit
    assign_labels=self.assign_labels)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/cluster/spectral.py", line 258, in spectral_clustering
    eigen_tol=eigen_tol, drop_first=False)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/manifold/spectral_embedding_.py", line 265, in spectral_embedding
    tol=eigen_tol, v0=v0)
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 1560, in eigsh
    symmetric=True, tol=tol)
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 1046, in get_OPinv_matvec
    return SpLuInv(A.tocsc()).matvec
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 907, in __init__
    self.M_lu = splu(M)
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/dsolve/linsolve.py", line 261, in splu
    ilu=False, options=_options)
MemoryError
My machine has 16 GB of RAM.
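The traceback shows the failure happens inside `splu`, i.e. the sparse LU factorization that scipy's `eigsh` builds for its shift-invert step: a fully dense affinity matrix stored as CSC keeps all n² nonzeros, and the LU factors need even more. One common way around this is to let scikit-learn build a sparse k-nearest-neighbors affinity instead of a dense one, so memory scales roughly with n·k rather than n². A minimal sketch, assuming scikit-learn is installed; the toy data and parameter values here are illustrative, not taken from the question:

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.RandomState(0)
# Two well-separated 2-D blobs of 50 points each (toy stand-in data)
X = np.vstack([rng.randn(50, 2), rng.randn(50, 2) + 10])

# affinity='nearest_neighbors' builds a *sparse* affinity graph with
# ~n_neighbors nonzeros per row, instead of a dense n x n matrix, so
# the eigensolver's memory footprint stays roughly O(n * k).
model = SpectralClustering(
    n_clusters=2,
    affinity='nearest_neighbors',
    n_neighbors=10,
    random_state=0,
)
labels = model.fit_predict(X)
```

If you must keep a precomputed affinity, you can threshold near-zero entries, convert it to `scipy.sparse`, and pass it with `affinity='precomputed'`; another option is `eigen_solver='amg'`, which avoids the shift-invert LU entirely but requires the optional pyamg package.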
【Comments】:
-
Obviously this depends on your memory. 100x100 is small, so size shouldn't be the problem. What is the actual error?
-
Sorry, I gave the wrong dimensions. The function works for 90*90 x 90*90, i.e. 8100 x 8100.
-
Try computing the memory requirement of an 8100x8100 matrix in double precision (8 bytes per element), with two copies of the matrix.
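The arithmetic suggested in the last comment is quick to check. A sketch with a hypothetical helper (`dense_matrix_gib` is not a library function): a dense n x n float64 matrix costs n·n·8 bytes per copy, so even the 10000 x 10000 matrix itself fits easily in 16 GB; it is the LU factorization of that dense matrix, with its fill-in, that blows past the limit.

```python
# Back-of-the-envelope memory cost of a dense n x n float64 matrix.
def dense_matrix_gib(n, itemsize=8, copies=1):
    """GiB needed for `copies` dense n x n matrices of `itemsize`-byte elements."""
    return n * n * itemsize * copies / 2**30

print(dense_matrix_gib(8100))   # ~0.49 GiB per copy
print(dense_matrix_gib(10000))  # ~0.75 GiB per copy
```

Two copies of the 10000 x 10000 matrix are only ~1.5 GiB, so the raw matrix is not the bottleneck on a 16 GB machine; the `splu` factors of a fully dense matrix stored in sparse format are far larger.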
Tags: machine-learning scikit-learn cluster-analysis spectral