[Posted at]: 2020-08-07 13:10:00
[Problem description]:
My situation is similar to "Save/load a keras model with constants", but slightly different.
I'm building an object detection model (YOLO-based) in tf.keras (TF v1.12, yes, I know) whose raw output needs post-processing into bounding boxes.
This involves some parameters that are constant as far as the model is concerned, but are arguments of the script that builds it: for example the number of classes, and the matrix of "anchor" positions used to generate the boxes.
My model will be loaded into a TFServing container, so I'm trying to ensure that:
- the transformation is encapsulated in the model, rather than left to the user or split out into separate post-processing logic
- the saved model artifacts (e.g. Keras h5, or TF pb+params) are sufficient to load and serve the model
What is the correct way to do this?
As far as I can tell, the following approaches don't work:
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Lambda, Layer

# Approach 1: a Lambda layer using closure scope triggers an error when
# trying to load the model back: `param` seems to be defined, but as some
# weird object.
def make_output_lambda(param):
    def mylambda(raw_output):
        return raw_output + param
    return Lambda(mylambda)
# Approach 2: even if the custom layer type is added to `custom_objects` on
# `tf.keras.models.load_model()`, it seems to get called without the
# positional arguments:
class MyCustomLayer(Layer):
    def __init__(self, param, **kwargs):
        super(MyCustomLayer, self).__init__(**kwargs)
        self.param = param  # or K.constant(param) - same overall problem

    def call(self, inputs):
        return inputs + self.param
# Approach 3: Keras throws an error when creating a `Model` that depends on a
# constant tensor which isn't an `Input` (and who wants a constant "Input"?)
def lambdatwo(inputs):
    return inputs[0] + inputs[1]

param_tensor = K.constant(param)
y = Lambda(lambdatwo)((raw_output, param_tensor))
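For reference, a minimal sketch of one commonly suggested pattern for Approach 2 (the layer name, file name, and example values here are my own, not from the question): have the custom layer serialize its constant through `get_config()`, so that `load_model()` can reconstruct it with the right constructor argument when the class is passed via `custom_objects`:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Layer

class AddConstLayer(Layer):
    """Hypothetical layer: adds a constant vector `param` to its input."""

    def __init__(self, param, **kwargs):
        super(AddConstLayer, self).__init__(**kwargs)
        self.param = param

    def call(self, inputs):
        # Materialize the constant at call time, matching the input dtype
        return inputs + tf.constant(self.param, dtype=inputs.dtype)

    def get_config(self):
        config = super(AddConstLayer, self).get_config()
        # Keep a JSON-serializable copy of the constant in the layer config,
        # so deserialization can pass it back to __init__
        config["param"] = np.asarray(self.param).tolist()
        return config

inp = Input(shape=(3,))
model = tf.keras.Model(inp, AddConstLayer([1.0, 2.0, 3.0])(inp))
model.save("model_with_const.h5")

# The class must still be passed via custom_objects, but now the saved
# config carries `param`, so the constructor receives its argument.
reloaded = tf.keras.models.load_model(
    "model_with_const.h5",
    custom_objects={"AddConstLayer": AddConstLayer})
```

This keeps the constant inside the saved artifact itself, so the h5 file plus the class definition should be enough to rebuild and serve the model.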
[Question discussion]:
- Do you know how to save the constants together with the model?
Tags: python tensorflow keras deep-learning