Weights and biases from a trained meta graph

Updated: 2023-12-02 11:58:22


The MetaGraphDef proto doesn't actually contain the values of the weights and biases. Instead it provides a way to associate a GraphDef with the weights stored in one or more checkpoint files, written by a tf.train.Saver. The MetaGraphDef tutorial has more details, but the approximate structure is as follows:


  1. In your training program, write out a checkpoint using a tf.train.Saver. This will also write a MetaGraphDef to a .meta file in the same directory.

saver = tf.train.Saver(...)
# ...
saver.save(sess, "model")


You should find files called model.meta and model-NNNN (for some integer NNNN) in your checkpoint directory.
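
If you don't want to hard-code the -NNNN suffix, you can ask TensorFlow for the most recent checkpoint in a directory; this short sketch assumes the checkpoints were written to the current directory:

# Find the path of the most recently written checkpoint, e.g. "model-NNNN".
latest = tf.train.latest_checkpoint(".")
print(latest)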


  2. In another program, you can import the MetaGraphDef you just created and restore from a checkpoint.

saver = tf.train.import_meta_graph("model.meta")
sess = tf.Session()
saver.restore(sess, "model-NNNN")  # Or whatever checkpoint filename was written.
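
Once restored, the imported graph is the default graph, so you can also look up individual tensors or operations by name; the name "logits:0" below is purely a hypothetical example:

graph = tf.get_default_graph()
# Look up a tensor from the imported graph by name ("logits:0" is a made-up name).
logits = graph.get_tensor_by_name("logits:0")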


If you want to get the value of each variable, you can (for example) find the variable in the tf.all_variables() collection and pass it to sess.run() to get its value. For example, to print the values of all variables, you can do the following:

for var in tf.all_variables():
  print(var.name, sess.run(var))


You could also filter tf.all_variables() to find the particular weights and biases that you're trying to extract from the model.
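
As a hedged sketch, assuming the variables of interest happen to have "weights" or "biases" in their names (which depends on how your model named them):

# Collect only the variables whose names suggest weights or biases.
params = [v for v in tf.all_variables()
          if "weights" in v.name or "biases" in v.name]
for var in params:
  print(var.name, sess.run(var))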