TensorFlow op in a Keras model

I'm trying to use a TensorFlow op inside a Keras model. I previously tried to wrap it with a Lambda layer, but I believe this disables backpropagation through that layer.

More specifically, I'm trying to use the layers from here in a Keras model, without porting them to Keras layers (I hope to deploy to TensorFlow later on). I can compile these layers into a shared library and load it into Python. This gives me TensorFlow ops, and I don't know how to combine them into a Keras model.
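For reference, this is roughly how I load the compiled library; the file and op names below are placeholders for whatever the C++ code actually registers:

```python
import tensorflow as tf

# Placeholder path: the .so built from the C++ op sources.
module = tf.load_op_library('./my_custom_ops.so')

# Each op registered in the library becomes a Python function on the
# returned module, named by snake_casing its C++ registration name, e.g.
#   REGISTER_OP("MyCustomOp")  ->  module.my_custom_op
# output = module.my_custom_op(input_tensor)
```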

A simple example of a Keras MNIST model, where for example one Conv2D layer is replaced by a tf.nn.conv2d op, would be exactly what I'm looking for.
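To make that concrete, here is a rough sketch of the kind of thing I would like to end up with. It uses a custom layer subclass rather than a Lambda layer, since a Lambda layer would not track the kernel as a trainable weight; the shapes and hyperparameters are just illustrative:

```python
import tensorflow as tf

class RawConv2D(tf.keras.layers.Layer):
    """Calls the raw tf.nn.conv2d op instead of keras.layers.Conv2D.

    Backpropagation works because tf.nn.conv2d has a registered gradient,
    and the kernel is tracked as a trainable weight via add_weight.
    """

    def __init__(self, filters, kernel_size, **kwargs):
        super().__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size

    def build(self, input_shape):
        self.kernel = self.add_weight(
            name='kernel',
            shape=(self.kernel_size, self.kernel_size,
                   int(input_shape[-1]), self.filters),
            initializer='glorot_uniform',
            trainable=True)

    def call(self, inputs):
        conv = tf.nn.conv2d(inputs, self.kernel,
                            strides=[1, 1, 1, 1], padding='SAME')
        return tf.nn.relu(conv)

inputs = tf.keras.Input(shape=(28, 28, 1))
x = RawConv2D(32, 3)(inputs)          # raw op in place of Conv2D
x = tf.keras.layers.MaxPooling2D()(x)
x = tf.keras.layers.Flatten()(x)
outputs = tf.keras.layers.Dense(10, activation='softmax')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```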

I've seen this tutorial, but it appears to do the opposite of what I am looking for: it inserts Keras layers into a TensorFlow graph, whereas I want to insert TensorFlow ops into a Keras model.

Best regards, Hans

Roughly two weeks have passed, and it seems I can now answer my own question.

It seems TensorFlow can look up gradients if you register them using this decorator. As of writing, this functionality is not (yet) available in C++, which is what I was looking for. A workaround is to define a normal op in C++ and wrap it in a Python method using the mentioned decorator. If these functions, with their corresponding gradients, are registered with TensorFlow, backpropagation happens 'automagically'.
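A sketch of what that workaround looks like, with placeholder names for the shared library and the ops (the gradient itself is computed by a second C++ op):

```python
import tensorflow as tf

# Placeholder: a library registering a forward op 'MyOp' and a
# companion op 'MyOpGrad' that computes its gradient in C++.
module = tf.load_op_library('./my_custom_ops.so')

@tf.RegisterGradient("MyOp")
def _my_op_grad(op, grad):
    # Must return one gradient tensor per input of the forward op.
    return module.my_op_grad(grad, op.inputs[0])

# Once registered, gradient computation finds this function
# automatically whenever module.my_op appears in the graph,
# so Keras training works without further changes.
```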