Exporting a Keras model as a TF Estimator: trained model cannot be found

Updated: 2023-12-02 13:12:34


I encountered the following issue when trying to export a Keras model as a TensorFlow Estimator in order to serve the model. Since the same problem also popped up in an answer to this question, I will illustrate what happens on a toy example and document my workaround. This behaviour occurs with TensorFlow 1.12.0 and Keras 2.2.4, with standalone Keras as well as with tf.keras.

The problem occurs when trying to export an Estimator that was created from a Keras model with tf.keras.estimator.model_to_estimator. Upon calling estimator.export_savedmodel, either a NotFoundError or a ValueError is thrown.

The below code reproduces this for a toy example.

Create a Keras model and save it:

import keras
model = keras.Sequential()
model.add(keras.layers.Dense(units=1,
                                activation='sigmoid',
                                input_shape=(10, )))
model.compile(loss='binary_crossentropy', optimizer='sgd')
model.save('./model.h5')

Next, convert the model to an estimator with tf.keras.estimator.model_to_estimator, add an input receiver function and export it in the SavedModel format with estimator.export_savedmodel:

# Convert the Keras model to a TF Estimator
import tensorflow as tf

tf_files_path = './tf'
estimator = tf.keras.estimator.model_to_estimator(keras_model=model,
                                                  model_dir=tf_files_path)

def serving_input_receiver_fn():
    return tf.estimator.export.build_raw_serving_input_receiver_fn(
        {model.input_names[0]: tf.placeholder(tf.float32, shape=[None, 10])})

# Export the estimator
export_path = './export'
estimator.export_savedmodel(
    export_path,
    serving_input_receiver_fn=serving_input_receiver_fn())

This will throw:

ValueError: Couldn't find trained model at ./tf.

My workaround solution is as follows. Inspecting the ./tf folder makes clear that the call to model_to_estimator stored the necessary files in a keras subfolder, while export_savedmodel expects those files to be in the ./tf folder directly, as this is the path we specified for the model_dir argument:

$ tree ./tf
./tf
└── keras
    ├── checkpoint
    ├── keras_model.ckpt.data-00000-of-00001
    ├── keras_model.ckpt.index
    └── keras_model.ckpt.meta

1 directory, 4 files
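What the export step effectively requires is a checkpoint state file directly under model_dir; the layout above fails that requirement. The mismatch can be mimicked without TensorFlow (a self-contained sketch on a temporary directory; `has_checkpoint` is a hypothetical stand-in for the Estimator's internal checkpoint lookup):

```python
import os
import tempfile

def has_checkpoint(model_dir):
    # Hypothetical stand-in for the Estimator's lookup: a 'checkpoint'
    # state file must sit directly in model_dir.
    return os.path.exists(os.path.join(model_dir, 'checkpoint'))

# Mock the layout produced by model_to_estimator: files end up in a subfolder
model_dir = tempfile.mkdtemp()
os.makedirs(os.path.join(model_dir, 'keras'))
open(os.path.join(model_dir, 'keras', 'checkpoint'), 'w').close()

found_at_root = has_checkpoint(model_dir)                        # export-time check fails here
found_in_sub = has_checkpoint(os.path.join(model_dir, 'keras'))  # files are one level too deep
print(found_at_root, found_in_sub)
```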

The simple workaround is to move these files up one folder. This can be done with Python:

import os
import shutil
from pathlib import Path

def up_one_dir(path):
    """Move all files in path up one folder, and delete the empty folder
    """
    parent_dir = str(Path(path).parents[0])
    for f in os.listdir(path):
        shutil.move(os.path.join(path, f), parent_dir)
    shutil.rmtree(path)

up_one_dir('./tf/keras')

Which will make the model_dir directory look like this:

$ tree ./tf
./tf
├── checkpoint
├── keras_model.ckpt.data-00000-of-00001
├── keras_model.ckpt.index
└── keras_model.ckpt.meta

0 directories, 4 files
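The helper can be exercised in isolation on a mock of this layout (a self-contained sketch using a temporary directory and dummy checkpoint files; no TensorFlow required):

```python
import os
import shutil
import tempfile
from pathlib import Path

def up_one_dir(path):
    """Move all files in path up one folder, and delete the empty folder."""
    parent_dir = str(Path(path).parents[0])
    for f in os.listdir(path):
        shutil.move(os.path.join(path, f), parent_dir)
    shutil.rmtree(path)

# Recreate the problematic layout: model_dir/keras/<checkpoint files>
model_dir = tempfile.mkdtemp()
sub = os.path.join(model_dir, 'keras')
os.makedirs(sub)
for name in ('checkpoint', 'keras_model.ckpt.index'):
    open(os.path.join(sub, name), 'w').close()

up_one_dir(sub)

moved = sorted(os.listdir(model_dir))
print(moved)  # the dummy files now sit directly in model_dir
```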

Doing this manipulation in between the model_to_estimator and the export_savedmodel calls allows the model to be exported as desired:

export_path = './export'
estimator.export_savedmodel(
    export_path,
    serving_input_receiver_fn=serving_input_receiver_fn())

INFO:tensorflow:SavedModel written to: ./export/temp-b'1549796240'/saved_model.pb