Updated: 2023-12-02 18:38:10
You could also do tf.reduce_mean(x, axis=[1,2]), especially if your height and width are not defined.
Typically, in a CNN, tensors have a shape of (b, h, w, c), where b is the batch size, h and w correspond to the height and width dimensions, and c is the number of channels/filters.
When you reduce along axis=[1,2], you reduce over the height and width dimensions of the tensor, keeping the batch size and the number of channels/filters.
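A minimal sketch of this reduction, using NumPy as a stand-in for TensorFlow (np.mean over axes (1, 2) behaves the same way as tf.reduce_mean(x, axis=[1,2]) on a defined-shape tensor); the shape (2, 3, 3, 4) is an arbitrary example:

```python
import numpy as np

# Hypothetical feature map: batch of 2, 3x3 spatial grid, 4 channels — shape (b, h, w, c)
x = np.arange(2 * 3 * 3 * 4, dtype=np.float32).reshape(2, 3, 3, 4)

# Global average pooling: mean over the height and width axes (1 and 2).
# The batch and channel dimensions survive the reduction.
pooled = x.mean(axis=(1, 2))

print(pooled.shape)  # (2, 4)
```

Each entry of `pooled` is the average of the 9 spatial positions for one (batch, channel) pair, which is exactly what a global average pooling layer computes.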