Updated: 2022-12-14 15:19:13
All backpropagation in TensorFlow is implemented by automatically differentiating the operations in the forward pass of the network, and adding explicit operations for computing the gradient at each point in the network. The general implementation can be found in tf.gradients(), but the particular version used depends on how your LSTM is implemented:
- If your LSTM is implemented as a statically unrolled loop, tf.gradients() uses its standard algorithm to build an unrolled backpropagation loop in the opposite direction.
- If your LSTM is implemented using tf.while_loop(), TensorFlow uses additional support for differentiating loops, found in control_flow_grad.py.
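The "unrolled backpropagation loop in the opposite direction" can be sketched without TensorFlow. This is a minimal toy of reverse-mode differentiation through a loop: the forward pass records each step's input on a tape, and the backward pass replays the recorded steps in reverse to accumulate the gradient. The recurrence here (repeated multiplication by x) is a hypothetical stand-in for an LSTM's per-timestep computation, not the LSTM cell itself.

```python
def forward(x, steps):
    """Run the loop forward, saving each step's input on a tape."""
    tape = []
    acc = 1.0
    for _ in range(steps):
        tape.append(acc)   # value entering this step's multiply
        acc = acc * x
    return acc, tape       # acc == x ** steps

def backward(x, tape, dy=1.0):
    """Walk the recorded steps in reverse, applying the product rule.

    Each step computed acc_new = acc_old * x, so the gradient w.r.t. x
    picks up dy * acc_old, and the gradient flowing back into acc_old
    is dy * x.
    """
    dx = 0.0
    for acc_old in reversed(tape):
        dx += dy * acc_old
        dy = dy * x
    return dx

y, tape = forward(3.0, 4)    # y = 3**4 = 81
grad = backward(3.0, tape)   # d(x**4)/dx at x=3 is 4 * 3**3 = 108
print(y, grad)               # 81.0 108.0
```

tf.gradients() does the analogous thing at graph level: for each forward op it emits a gradient op, wired in reverse topological order, and for a tf.while_loop() it additionally builds a backward loop (plus stacks that play the role of the tape above) via the machinery in control_flow_grad.py.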