且构网


What is the role of the TimeDistributed layer in Keras?

Updated: 2023-12-03 10:51:28


In Keras, when building a sequential model, the second dimension (the one after the sample dimension) is usually the time dimension. This means that if, for example, your data is 5-dimensional with shape (sample, time, width, length, channel), you can apply a convolutional layer along the time dimension using TimeDistributed: the wrapped layer only accepts 4-dimensional input (sample, width, length, channel), and TimeDistributed applies that same layer to each time slice, producing a 5-dimensional output.
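A minimal sketch of this, assuming TensorFlow 2.x with tf.keras (all dimension sizes below are hypothetical, chosen just for illustration):

```python
# Wrapping a Conv2D in TimeDistributed applies the same convolution to every
# time slice of a 5-dim input (sample, time, width, length, channel).
import numpy as np
import tensorflow as tf

# Hypothetical dimensions for illustration.
batch, time_steps, width, length, channels = 2, 5, 16, 16, 3

inputs = tf.keras.Input(shape=(time_steps, width, length, channels))
# Conv2D alone expects 4-dim input (sample, width, length, channel);
# TimeDistributed applies it independently to each of the time slices.
x = tf.keras.layers.TimeDistributed(
    tf.keras.layers.Conv2D(8, kernel_size=3, padding="same")
)(inputs)
model = tf.keras.Model(inputs, x)

out = model(np.zeros((batch, time_steps, width, length, channels), dtype="float32"))
print(out.shape)  # (2, 5, 16, 16, 8): the time dimension is preserved
```

Note that the convolution kernels are shared across time steps, so the number of trainable parameters is the same as for a single Conv2D.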


The case with Dense is that, from Keras version 2.0 onward, Dense is by default applied only to the last dimension (e.g. if you apply Dense(10) to input with shape (n, m, o, p), you get output with shape (n, m, o, 10)), so in your case Dense and TimeDistributed(Dense) are equivalent.
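This equivalence can be checked directly on the output shapes; a short sketch, again assuming TensorFlow 2.x with tf.keras and illustrative dimension sizes:

```python
# Since Keras 2.0, Dense acts only on the last axis, so Dense(10) and
# TimeDistributed(Dense(10)) produce the same output shape on N-dim input.
import numpy as np
import tensorflow as tf

x = np.zeros((4, 7, 6, 5), dtype="float32")  # shape (n, m, o, p)

dense = tf.keras.layers.Dense(10)
td_dense = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(10))

print(dense(x).shape)     # (4, 7, 6, 10)
print(td_dense(x).shape)  # (4, 7, 6, 10): same shape, equivalent here
```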