【Question Title】: Ensemble forecast with keras on GPU
【Posted】: 2021-04-18 14:59:19
【Question Description】:

I am trying to do an ensemble forecast with identically built Keras models. The data for the individual NNs have the same shape. I want to use the GPU, because the models should be trained in parallel, so I am trying to merge the models. Since the number of models should be configurable, I want to do this in a loop. I found some solutions, but with each of them I have problems with the loop. Here is my approach:

from keras import Sequential, Model
from keras.layers import Embedding, GlobalAveragePooling1D, Dense, concatenate
import numpy as np

nummodels=3

model = Sequential()
model.add(Embedding(20, 10, trainable=True))
model.add(GlobalAveragePooling1D())
model.add(Dense(1, activation='sigmoid'))

for i in range(nummodels-1):
    model2 = Sequential()
    model2.add(Embedding(20, 10, trainable=True))
    model2.add(GlobalAveragePooling1D())
    model2.add(Dense(1, activation='sigmoid'))
    
    model_concat = concatenate([model.output, model2.output], axis=-1)
    model_concat = Dense(1, activation='softmax')(model_concat)
    model = Model(inputs=[model.input, model2.input], outputs=model_concat)

model.compile(loss='binary_crossentropy', optimizer='adam')

# somehow generating testdata x1,x2,x3 and y1,y2,y3...
# not implemented yet...

# Training
model.fit([x1,x2,x3],[y1,y2,y3], epochs = 50)
# prediction
ypred1,ypred2,ypred3 = model.predict([x1,x2,x3])

The loop does not work. I hope you can tell me what is wrong. Later I want to do the training and prediction in a loop as well. Are the fit and predict calls in my code correct? If this cannot be done in a loop, please show me another solution.

Edit:

Based on M.Innat's answer, I changed my code. Now, in the loop, I add the inputs and outputs of the NNs to lists, which I later use for the concatenation. But the concatenation after the first loop iteration still does not work.

from keras import Model
from keras.layers import Dense, Input
import numpy as np
import tensorflow as tf

nummodels=3

inputs=[]
outputs=[]
final_outputs=[]
init = 'uniform'
activation = 'tanh'
for i in range(nummodels):
    
    input_layer = Input(shape=(11,))
    A2 = Dense(8, kernel_initializer=init, activation=activation)(input_layer)
    A2 = Dense(5, kernel_initializer=init, activation=activation)(A2)
    A2 = Dense(3, kernel_initializer=init, activation=activation)(A2)
    A2 = Dense(1, kernel_initializer=init, activation=activation)(A2)
    model = Model(inputs=input_layer, outputs=A2, name="Model"+str(i+1))
       
    inputs.append(model.input)
    outputs.append(model.output)
    
model_concat = tf.keras.layers.concatenate(outputs, name='target_concatenate')    

for i in range(nummodels):
    final_outputs.append(Dense(1, activation='sigmoid')(model_concat))
   
# whole models 
composed_model = tf.keras.Model(inputs=inputs, outputs=final_outputs)

# model viz
tf.keras.utils.plot_model(composed_model)

# compile the model
composed_model.compile(loss='binary_crossentropy', optimizer='adam')

# data generation 
x1 = np.random.randint(1000, size=(32, 11))
x2 = np.random.randint(1000, size=(32, 11))
x3 = np.random.randint(1000, size=(32, 11))
y1 = np.random.randint(1000, size=(32, 1))
y2 = np.random.randint(1000, size=(32, 1))
y3 = np.random.randint(1000, size=(32, 1))

# Training
composed_model.fit([x1,x2,x3],[y1,y2,y3], epochs = 50)
# prediction
y1p, y2p, y3p = composed_model.predict([x1,x2,x3])

【Comments】:

  • From your modelling approach it looks like you are trying to build the model n times and merge them for training. Which activation is it — sigmoid or softmax?
  • I just want to concatenate the model with model 2. Both have sigmoid activation. The line with softmax is probably wrong.
  • See the given answer — I think that is what you are looking for.

Tags: python tensorflow keras deep-learning ensemble-learning


【Solution 1】:

First, you are trying to build the model nummodels times and concatenate their outputs. Second, you are trying to .fit a multi-input model and get multi-output — meaning each of your models takes its own separate input, and the merged model gives nummodels outputs. (Your original loop likely fails because the Sequential models are never given an input shape, so model.output is undefined until the model is built, and after the first merge model.input is already a list of tensors, so [model.input, model2.input] nests lists.) So, based on my understanding of your question, here is one possible solution.


Models

import tensorflow as tf
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense, Embedding, GlobalAveragePooling1D

# Define Model A 
input_layer = Input(shape=(10,))
A2 = Embedding(1000, 64)(input_layer)
A2 = GlobalAveragePooling1D()(A2)
model_a = Model(inputs=input_layer, outputs=A2, name="ModelA")

# Define Model B 
input_layer = Input(shape=(10,))
A2 = Embedding(1000, 64)(input_layer)
A2 = GlobalAveragePooling1D()(A2)
model_b = Model(inputs=input_layer, outputs=A2, name="ModelB")

# Define Model C
input_layer = Input(shape=(10,))
A2 = Embedding(1000, 64)(input_layer)
A2 = GlobalAveragePooling1D()(A2)
model_c = Model(inputs=input_layer, outputs=A2, name="ModelC")

Merge the models

import numpy as np 
import tensorflow as tf 

# concate their output 
model_concat = tf.keras.layers.concatenate(
    [
     model_a.output, model_b.output, model_c.output
    ], name='target_concatenate')

# final activation layer 
final_out1 = Dense(1, activation='sigmoid')(model_concat)
final_out2 = Dense(1, activation='sigmoid')(model_concat)
final_out3 = Dense(1, activation='sigmoid')(model_concat)

# whole models 
composed_model = tf.keras.Model(
    inputs=[model_a.input, model_b.input, model_c.input],
    outputs=[final_out1, final_out2, final_out3]
)

# model viz
tf.keras.utils.plot_model(composed_model)

I/O

# compile the model
composed_model.compile(loss='binary_crossentropy', optimizer='adam')

# dummies 
input_array1 = np.random.randint(1000, size=(32, 10))
input_array2 = np.random.randint(1000, size=(32, 10))
input_array3 = np.random.randint(1000, size=(32, 10))

# get some prediction - multi-input / multi-output
output_array1, output_array2, output_array3 = composed_model.predict([input_array1,
                                                                      input_array2,
                                                                      input_array3])

Check out these answers for a better understanding.


Based on your comment and the edited part of your question, try something like this:

models = []
for i in range(nummodels):
    ....
    ....
    model = Model(inputs=input_layer, outputs=A2, name="Model"+str(i+1))

    models.append(model)
    # inputs.append(model.input)
    # outputs.append(model.output)
# model_concat = tf.keras.layers.concatenate(outputs, name='target_concatenate')    
model_concat = tf.keras.layers.concatenate(
    [
     models[0].output, models[1].output, models[2].output
    ], name='target_concatenate')
final_outputs=[]
for i in range(nummodels):
    final_outputs.append(Dense(1, activation='sigmoid')(model_concat))
   
# whole models 
composed_model = tf.keras.Model(inputs=[models[0].input, 
                                        models[1].input, 
                                        models[2].input], 
                                outputs=final_outputs)
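Putting the loop version together end to end, here is a minimal, self-contained sketch (the layer sizes are illustrative, not prescriptive). Note that the dummy labels are drawn from {0, 1} so that they actually match binary_crossentropy, unlike the random integers up to 1000 in the edited question:

```python
import numpy as np
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense, concatenate

nummodels = 3  # configurable ensemble size

# build one sub-model per ensemble member, collecting inputs and outputs
inputs, outputs = [], []
for i in range(nummodels):
    inp = Input(shape=(11,))
    x = Dense(8, activation='tanh')(inp)
    x = Dense(1, activation='tanh')(x)
    sub = Model(inputs=inp, outputs=x, name=f"Model{i+1}")
    inputs.append(sub.input)
    outputs.append(sub.output)

# concatenate all sub-model outputs, then add one sigmoid head per member
concat = concatenate(outputs, name='target_concatenate')
final_outputs = [Dense(1, activation='sigmoid')(concat) for _ in range(nummodels)]

composed_model = Model(inputs=inputs, outputs=final_outputs)
composed_model.compile(loss='binary_crossentropy', optimizer='adam')

# dummy data: one input array and one binary label array per sub-model
xs = [np.random.random((32, 11)) for _ in range(nummodels)]
ys = [np.random.randint(2, size=(32, 1)) for _ in range(nummodels)]

composed_model.fit(xs, ys, epochs=1, verbose=0)
preds = composed_model.predict(xs, verbose=0)
# preds is a list with one (32, 1) prediction array per ensemble member
```

Because the heads all read from the shared concatenation, every output sees every sub-model's features; if the members should stay fully independent, each head would instead connect only to its own branch.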

【Discussion】:
