【Question Title】: Added layer must be an instance of class Layer
【Posted】: 2020-07-17 08:02:49
【Description】:

I am building a Bi-LSTM network that includes an attention layer, but adding the layer fails with the error "The added layer must be an instance of class Layer".

Some of the libraries I have imported are:

from keras.models import Model, Sequential
from keras.layers import LSTM, Activation, Dense, Dropout, Input, Embedding, Bidirectional, Conv1D, Flatten, GlobalMaxPooling1D, SpatialDropout1D
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras import backend as K
from tensorflow.keras.layers import *

The attention layer class is:

class attention(Layer):
    
    def __init__(self, return_sequences=True):
        self.return_sequences = return_sequences
        super(attention,self).__init__()
        
    def build(self, input_shape):
        
        self.W=self.add_weight(name="att_weight", shape=(input_shape[-1],1),
                               initializer="normal")
        self.b=self.add_weight(name="att_bias", shape=(input_shape[1],1),
                               initializer="zeros")
        
        super(attention,self).build(input_shape)
        
    def call(self, x):
        
        e = K.tanh(K.dot(x,self.W)+self.b)
        a = K.softmax(e, axis=1)
        output = x*a
        
        if self.return_sequences:
            return output
        
        return K.sum(output, axis=1)

The model looks like this:

model = Sequential()
model.add(Embedding(max_words, 1152, input_length=max_len, weights=[embeddings]))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Bidirectional(LSTM(32, return_sequences=True)))
model.add(attention(return_sequences=True))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))
model.summary()

But it raises this error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-41-ba5b52fe2c87> in <module>()
      1 model = Sequential()
----> 2 model.add(Embedding(max_words, 1152, input_length=max_len, weights=[embeddings]))
      3 model.add(BatchNormalization())
      4 model.add(Activation('tanh'))
      5 #model.add(SpatialDropout1D(0.5))

/usr/local/lib/python3.6/dist-packages/keras/engine/sequential.py in add(self, layer)
    131             raise TypeError('The added layer must be '
    132                             'an instance of class Layer. '
--> 133                             'Found: ' + str(layer))
    134         self.built = False
    135         if not self._layers:

TypeError: The added layer must be an instance of class Layer. Found: <tensorflow.python.keras.layers.embeddings.Embedding object at 0x7f0da41aec50>

【Comments】:

  • Where do you import your Layer from? It looks like you are mixing the tf.keras and keras libraries.
  • I have included 'from tensorflow.keras.layers import *', so that covers all the layers.
  • I have updated the question with the libraries imported for my model.
  • You are importing the libraries the wrong way... please also follow the imports in the example here: colab.research.google.com/drive/…
  • @MarcoCerliani I am using the same notebook you posted, but now it gives the error module 'keras.layers.embeddings' has no attribute 'shape'

Tags: python-3.x tensorflow keras nlp


【Solution 1】:

This documentation page states that the following syntax should be used when defining a custom Layer:

class Linear(tf.keras.layers.Layer):
    def __init__(self, units=32, input_dim=32):
        super(Linear, self).__init__()
        w_init = tf.random_normal_initializer()
        self.w = tf.Variable(
            initial_value=w_init(shape=(input_dim, units), dtype="float32"),
            trainable=True,
        )
        b_init = tf.zeros_initializer()
        self.b = tf.Variable(
            initial_value=b_init(shape=(units,), dtype="float32"), trainable=True
        )

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b
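As a quick check that a layer subclassed this way is accepted by Sequential (a minimal sketch, assuming TensorFlow 2.x; the shapes here are arbitrary):

```python
import tensorflow as tf

# Docs-style custom layer subclassing tf.keras.layers.Layer.
class Linear(tf.keras.layers.Layer):
    def __init__(self, units=32, input_dim=32):
        super(Linear, self).__init__()
        w_init = tf.random_normal_initializer()
        self.w = tf.Variable(
            initial_value=w_init(shape=(input_dim, units), dtype="float32"),
            trainable=True)
        b_init = tf.zeros_initializer()
        self.b = tf.Variable(
            initial_value=b_init(shape=(units,), dtype="float32"),
            trainable=True)

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b

# Sequential accepts the layer without the TypeError, because it
# subclasses the same tf.keras Layer base class Sequential checks for.
model = tf.keras.Sequential([Linear(units=4, input_dim=2)])
out = model(tf.ones((3, 2)))
print(tuple(out.shape))  # (3, 4)
```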

So your Layer import is technically correct. However, you initialize the model with plain keras layers, which causes the error. Use the tf.keras API everywhere and the error goes away, for example:

https://www.tensorflow.org/guide/keras/sequential_model

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Your own layer here
class AttentionCustom(layers.Layer):
     pass

model = keras.Sequential(
    [
        layers.Dense(2, activation="relu", name="layer1"),
        layers.Dense(3, activation="relu", name="layer2"),
        AttentionCustom(),
        layers.Dense(4, name="layer3"),
    ]
)

【Comments】:

  • Where should I add tf.keras? In the custom attention layer?
  • No, just remove your plain keras imports, e.g. from keras.models import Model, Sequential and from keras.layers import LSTM, Activation, Dense, Dropout, Input, Embedding, Bidirectional, Conv1D, Flatten, GlobalMaxPooling1D, SpatialDropout1D, and try importing only import tensorflow as tf, from tensorflow import keras and from tensorflow.keras import layers
  • I have marked the answer as correct. Please also upvote my question :)