How to make TensorFlow Hub embeddings servable using TensorFlow Serving?

Updated: 2023-12-02 18:46:28

Right now you probably need to do this by hand. Here is my solution, similar to the previous answer but more general: it shows how to use any other module without guessing the input parameters, and it is extended with verification and usage:

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.saved_model import simple_save

export_dir = "/tmp/tfserving/universal_encoder/00000001"
with tf.Session(graph=tf.Graph()) as sess:
    module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/2")
    input_params = module.get_input_info_dict()
    # inspect which tensors the module accepts - 'text' is the input tensor name
    text_input = tf.placeholder(name='text', dtype=input_params['text'].dtype,
                                shape=input_params['text'].get_shape())

    embeddings = module(text_input)

    # initialize the module's variables and lookup tables
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])

    simple_save(sess,
                export_dir,
                inputs={'text': text_input},
                outputs={'embeddings': embeddings},
                legacy_init_op=tf.tables_initializer())

Thanks to module.get_input_info_dict() you know which tensor names you need to pass to the model - you use this name as the key for inputs={} in the simple_save method.
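
If you are unsure what that dictionary looks like, you can print it (and the matching output dict) before building the placeholder. A minimal sketch; the values in the comments are approximate and depend on the module and tensorflow_hub version:

print(module.get_input_info_dict())
# e.g. {'text': <hub.ParsedTensorInfo shape=(?,) dtype=string is_sparse=False>}
print(module.get_output_info_dict())
# e.g. {'default': <hub.ParsedTensorInfo shape=(?, 512) dtype=float32 is_sparse=False>}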

Remember that in order to serve the model, it needs to be in a directory path that ends with a version number - that's why '00000001' is the last path component, the one in which saved_model.pb resides.
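
After the export, the version directory should contain roughly the following (simple_save writes saved_model.pb plus a variables/ subdirectory; the exact shard file names may vary):

/tmp/tfserving/universal_encoder/
└── 00000001/
    ├── saved_model.pb
    └── variables/
        ├── variables.data-00000-of-00001
        └── variables.index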

After exporting your module, the quickest way to check whether the model was exported properly for serving is to use the saved_model_cli API:

saved_model_cli run --dir /tmp/tfserving/universal_encoder/00000001 --tag_set serve --signature_def serving_default --input_exprs 'text=["what this is"]'
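
saved_model_cli can also dump the exported signatures (input and output tensor names, dtypes and shapes) without running the model:

saved_model_cli show --dir /tmp/tfserving/universal_encoder/00000001 --all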

To serve the model from Docker:

docker pull tensorflow/serving  
docker run -p 8501:8501 -v /tmp/tfserving/universal_encoder:/models/universal_encoder -e MODEL_NAME=universal_encoder -t tensorflow/serving
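
Once the container is up, you can query the REST endpoint that TensorFlow Serving exposes on port 8501; the response is a JSON object with the embeddings under "predictions":

curl -d '{"instances": ["what this is"]}' \
    -X POST http://localhost:8501/v1/models/universal_encoder:predict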