[Posted]: 2021-12-18 00:42:42
[Question]:
I am trying to deploy a model on SageMaker using my own custom inference container. I am following the documentation here: https://docs.aws.amazon.com/sagemaker/latest/dg/adapt-inference-container.html
I have an entrypoint file:
from sagemaker_inference import model_server
#HANDLER_SERVICE = "/home/model-server/model_handler.py:handle"
HANDLER_SERVICE = "model_handler.py"
model_server.start_model_server(handler_service=HANDLER_SERVICE)
I have a model_handler.py file:
from sagemaker_inference.default_handler_service import DefaultHandlerService
from sagemaker_inference.transformer import Transformer
from CustomHandler import CustomHandler
class ModelHandler(DefaultHandlerService):
    def __init__(self):
        transformer = Transformer(default_inference_handler=CustomHandler())
        super(ModelHandler, self).__init__(transformer=transformer)
I have my CustomHandler.py file:
import os
import json
import pandas as pd
from joblib import dump, load
from sagemaker_inference import default_inference_handler, decoder, encoder, errors, utils, content_types
class CustomHandler(default_inference_handler.DefaultInferenceHandler):
    def model_fn(self, model_dir: str):
        clf = load(os.path.join(model_dir, "model.joblib"))
        return clf

    def input_fn(self, request_body: str, content_type: str) -> pd.DataFrame:
        if content_type == "application/json":
            items = json.loads(request_body)
            all_item1, all_item2 = [], []
            for item in items:
                all_item1.append(process_item1(item["item1"]))
                all_item2.append(process_item2(item["item2"]))
            return pd.DataFrame({"item1": all_item1, "comments": all_item2})

    def predict_fn(self, input_data, model):
        return model.predict(input_data)
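For illustration, this is the request shape that `input_fn` appears to expect, exercised standalone. `process_item1` and `process_item2` are not defined in the question, so trivial stand-ins are used here:

```python
import json
import pandas as pd

# Stand-in preprocessing helpers; the real process_item1/process_item2
# are not shown in the question.
def process_item1(value):
    return str(value).strip()

def process_item2(value):
    return str(value).strip()

def input_fn(request_body, content_type):
    """Parse a JSON list of records into the DataFrame predict_fn consumes."""
    if content_type == "application/json":
        items = json.loads(request_body)
        all_item1, all_item2 = [], []
        for item in items:
            all_item1.append(process_item1(item["item1"]))
            all_item2.append(process_item2(item["item2"]))
        return pd.DataFrame({"item1": all_item1, "comments": all_item2})

# A payload with one record, matching the keys the handler reads.
body = json.dumps([{"item1": "some text", "item2": "a comment"}])
df = input_fn(body, "application/json")
```

So the body sent to the endpoint would be a JSON array of objects, each with `item1` and `item2` keys.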
After deploying the model to an endpoint with these files in the image, I get the following error: ml.mms.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'model_handler'.
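Incidentally, that log line suggests the worker is trying to import the handler as a Python module; the failure itself can be reproduced outside of SageMaker (a minimal sketch, not tied to any AWS API):

```python
import importlib

# Importing a module name that is not on sys.path fails with the same
# error text that appears in the MMS worker log.
try:
    importlib.import_module("model_handler")
    msg = None
except ModuleNotFoundError as exc:
    msg = str(exc)

print(msg)  # → No module named 'model_handler'
```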
I am really stuck on what to do here. I was hoping to find an end-to-end example of doing this the way described above, but I don't think one exists. Thanks!
[Discussion]:
Tags: amazon-sagemaker