【Question Title】: Keras: stacking multiple LSTM layers
【Posted】: 2018-07-08 16:23:50
【Question】:

I have the following network, which works fine:

output = LSTM(8)(output)
output = Dense(2)(output)

Now, for the same model, I am trying to stack several LSTM layers like this:

output = LSTM(8)(output, return_sequences=True)
output = LSTM(8)(output)
output = Dense(2)(output)

But I get the following error:

TypeError                                 Traceback (most recent call last)
<ipython-input-2-0d0ced2c7417> in <module>()
     39 
     40 output = Concatenate(axis=2)([leftOutput,rightOutput])
---> 41 output = LSTM(8)(output, return_sequences=True)
     42 output = LSTM(8)(output)
     43 output = Dense(2)(output)

/usr/local/lib/python3.4/dist-packages/keras/layers/recurrent.py in __call__(self, inputs, initial_state, constants, **kwargs)
    480 
    481         if initial_state is None and constants is None:
--> 482             return super(RNN, self).__call__(inputs, **kwargs)
    483 
    484         # If any of `initial_state` or `constants` are specified and are Keras

/usr/local/lib/python3.4/dist-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
    601 
    602             # Actually call the layer, collecting output(s), mask(s), and shape(s).
--> 603             output = self.call(inputs, **kwargs)
    604             output_mask = self.compute_mask(inputs, previous_mask)
    605 

TypeError: call() got an unexpected keyword argument 'return_sequences'

This is confusing, because return_sequences is a valid argument according to the Keras documentation: https://keras.io/layers/recurrent/#lstm

What am I doing wrong here? Thanks!

【Comments】:

  • Try: output = LSTM(8, return_sequences=True)(output)
  • That works. Thanks!
  • Shall I write that up as an answer, then?

Tags: machine-learning neural-network deep-learning keras lstm


【Solution 1】:

The problem is that return_sequences should be passed as an argument to the layer constructor, not to the layer call. Changing the code to:

output = LSTM(8, return_sequences=True)(output)

solves the problem.
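Putting it all together, here is a minimal sketch of the corrected stacked-LSTM model. The input shape (10 timesteps, 4 features) is a hypothetical placeholder, since the question does not show the input layer; adapt it to your own data.

```python
from keras.layers import Input, LSTM, Dense
from keras.models import Model

# Hypothetical input: sequences of 10 timesteps with 4 features each.
inputs = Input(shape=(10, 4))

# return_sequences=True goes to the constructor, so this LSTM emits one
# 8-dimensional vector per timestep, giving the next LSTM the 3D
# sequence input it expects.
output = LSTM(8, return_sequences=True)(inputs)
output = LSTM(8)(output)   # final LSTM returns only the last output
output = Dense(2)(output)

model = Model(inputs, output)
model.compile(optimizer='adam', loss='mse')
```

Note the contrast with the failing code: `LSTM(8)(output, return_sequences=True)` passes the keyword to `call()`, which does not accept it, whereas `LSTM(8, return_sequences=True)(output)` passes it to `__init__()`, where it belongs.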

【Discussion】:
