【Posted】: 2019-12-24 04:22:39
【Question】:
I want to train a model with 3 input channels. Each channel starts with an embedding (as a Lambda layer), followed by convolutions. However, I can't get the shapes to line up.
# Build network
def swish(x):
    return K.sigmoid(x) * x

def make_model():
    embed_size = 512  # must be 512 for embedding layer

    input_text1 = Input(shape=(1,), dtype=tf.string)
    embedding1 = Lambda(QTEmbedding, output_shape=(embed_size,))(input_text1)
    con11 = Conv1D(32, 3, activation='relu')(embedding1)
    pool11 = MaxPooling1D(2)(con11)
    con12 = Conv1D(64, 3, activation='relu')(pool11)
    pool12 = MaxPooling1D(2)(con12)
    flat1 = Flatten()(pool12)

    input_text2 = Input(shape=(1,), dtype=tf.string)
    embedding2 = Lambda(QBEmbedding, output_shape=(embed_size,))(input_text2)
    con21 = Conv1D(32, 3, activation='relu')(embedding2)
    pool21 = MaxPooling1D(2)(con21)
    con22 = Conv1D(64, 3, activation='relu')(pool21)
    pool22 = MaxPooling1D(2)(con22)
    flat2 = Flatten()(pool22)

    input_text3 = Input(shape=(1,), dtype=tf.string)
    embedding3 = Lambda(AEmbedding, output_shape=(embed_size,))(input_text3)
    con31 = Conv1D(32, 3, activation='relu')(embedding3)
    pool31 = MaxPooling1D(2)(con31)
    con32 = Conv1D(64, 3, activation='relu')(pool31)
    pool32 = MaxPooling1D(2)(con32)
    flat3 = Flatten()(pool32)

    x = Concatenate()([flat1, flat2, flat3])
    x = Dense(512, activation=swish)(x)
    x = Dropout(0.4)(x)
    x = BatchNormalization()(x)
    x = Dense(256, activation=swish)(x)
    x = Dropout(0.4)(x)
    x = BatchNormalization()(x)
    x = Dense(64, activation=swish, kernel_regularizer=keras.regularizers.l2(0.001))(x)
    x = Dropout(0.4)(x)
    x = BatchNormalization()(x)
    output = Dense(len(targets), activation='sigmoid', name='output')(x)
    model = Model(inputs=[input_text1, input_text2, input_text3], outputs=[output])
    model.summary()
    return model
I get this error message:
ValueError: Input 0 is incompatible with layer conv1d_24: expected ndim=3, found ndim=2.
I have searched around, but haven't found a solution specific to this problem. Please don't just link me to another LSTM question with the same error.
I suspect the embedding functions may be the problem, since they output 2D tensors.
def QTEmbedding(x):
    results = qa.signatures['question_encoder'](tf.squeeze(tf.cast(x, tf.string)))['outputs']
    return keras.backend.concatenate([results])

def QBEmbedding(x):
    results = general(tf.squeeze(tf.cast(x, tf.string)))
    return keras.backend.concatenate([results])

def AEmbedding(x):
    results = qa.signatures['response_encoder'](input=tf.squeeze(tf.cast(x, tf.string)), context=tf.squeeze(tf.cast(x, tf.string)))['outputs']
    return keras.backend.concatenate([results])
These are the models:
import tensorflow_hub as hub
general = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/5")
qa = hub.load('https://tfhub.dev/google/universal-sentence-encoder-qa/3')
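The suspicion about the 2D tensors seems right: the Universal Sentence Encoder returns one vector per sentence, shape (batch, 512), while Conv1D expects a 3D input of shape (batch, steps, channels). A minimal sketch of the shape reasoning, assuming TF 2.x, using tf.expand_dims to add a channel axis so the 512 embedding dimensions become the convolution's time steps (a dummy zero tensor stands in for the real embedding output):

```python
import tensorflow as tf

# Stand-in for what the embedding Lambda produces: (batch, 512), ndim=2
embedded = tf.zeros((4, 512))

# Conv1D needs ndim=3, so add a channel axis: (batch, 512, 1)
embedded_3d = tf.expand_dims(embedded, axis=-1)

# Now the convolution accepts the input; kernel size 3 with 'valid'
# padding leaves 512 - 3 + 1 = 510 steps, with 32 filters
conv_out = tf.keras.layers.Conv1D(32, 3, activation='relu')(embedded_3d)
print(conv_out.shape)  # (4, 510, 32)
```

Inside the model, the same reshape could be done either at the end of each embedding function (tf.expand_dims(results, -1)) or as an extra Lambda/Reshape layer between the embedding and the first Conv1D, with output_shape=(embed_size, 1).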
【Discussion】:
Tags: python tensorflow keras