[Posted]: 2021-03-20 04:35:31
[Problem description]:
I have a network with 2 inputs and 4 outputs, and I have built an LSTM model with time steps = 5.
import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten, LSTM
from keras.layers import Input, LSTM, concatenate, Dense, Lambda
from keras.models import Model
from sklearn.metrics import mean_squared_error
from keras.models import load_model
from keras.utils.vis_utils import plot_model
from tensorflow.keras import layers
import keras
from keras_self_attention import SeqSelfAttention
from tensorflow.keras.layers import Attention
Here are the input and output:
X = np.random.normal(size=(100,5,2)) # input
Y = np.random.normal(size=(100,4)) # output
Here is my model:
model = keras.models.Sequential()
model.add(keras.layers.LSTM(units = 50, return_sequences=True, input_shape=(X.shape[1],X.shape[2])))
model.add(SeqSelfAttention(attention_width=50,attention_activation='linear',name='Attention'))
model.add(keras.layers.Dense(4, activation='linear'))
model.compile(optimizer = 'adam', loss = 'mean_squared_error',metrics = ['MAE'])
model.summary()
model.fit( X, Y, epochs = 1, batch_size = 500)
model.save('model.h5')
When I run the model, I get this error:
Error when checking target: expected dense_56 to have 3 dimensions, but got array with shape (100,4)
Can anyone help me? Thanks.
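For context, the mismatch comes from `return_sequences=True`: the LSTM and the attention layer keep the time axis, so the final `Dense(4)` is applied per time step and yields shape `(batch, 5, 4)`, while `Y` has shape `(100, 4)`. A minimal sketch of one possible fix, collapsing the time axis with `Flatten` before the output layer (shown here with built-in Keras layers only, omitting `SeqSelfAttention` so the snippet is self-contained):

```python
import numpy as np
from tensorflow import keras

X = np.random.normal(size=(100, 5, 2))  # (samples, time steps, features)
Y = np.random.normal(size=(100, 4))     # (samples, outputs)

model = keras.models.Sequential()
model.add(keras.layers.LSTM(50, return_sequences=True,
                            input_shape=(X.shape[1], X.shape[2])))
# Flatten collapses (batch, 5, 50) into (batch, 250),
# so Dense produces a 2-D output matching Y's shape (batch, 4).
model.add(keras.layers.Flatten())
model.add(keras.layers.Dense(4, activation='linear'))
model.compile(optimizer='adam', loss='mean_squared_error', metrics=['MAE'])

print(model.output_shape)  # (None, 4)
model.fit(X, Y, epochs=1, batch_size=32, verbose=0)
```

If the `SeqSelfAttention` layer is kept, the same `Flatten` (or a pooling layer such as `GlobalAveragePooling1D`) inserted after it should resolve the target-dimension error; alternatively, setting the last recurrent layer to `return_sequences=False` also removes the time axis.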
[Discussion]:
标签: python-3.x tensorflow lstm attention-model