
Validation accuracy not increasing when training ResNet50

Updated: 2023-12-02 20:43:10

It was happening because I was directly adding the fully-connected layers without training them first, as explained in the Keras blog: https://blog.keras.io/building-powerful-image-classification-models-using-very-little-data.html

As that post puts it: "in order to perform fine-tuning, all layers should start with properly trained weights: for instance you should not slap a randomly initialized fully-connected network on top of a pre-trained convolutional base. This is because the large gradient updates triggered by the randomly initialized weights would wreck the learned weights in the convolutional base. In our case this is why we first train the top-level classifier, and only then start fine-tuning convolutional weights alongside it."
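The "train the top-level classifier first" step can be sketched with the bottleneck-feature approach from that blog post. Everything concrete here is an assumption for illustration: the 64x64 input size, the 10-class head, and the random stand-in data; in practice you would pass weights="imagenet" (weights=None merely keeps the sketch self-contained and offline) and feed your real images.

```python
import numpy as np
from tensorflow.keras.applications import ResNet50
from tensorflow.keras import layers, models

# Frozen convolutional base. Assumption: weights=None for an offline sketch;
# use weights="imagenet" in the real workflow.
base = ResNet50(include_top=False, weights=None,
                input_shape=(64, 64, 3), pooling="avg")
base.trainable = False

# Run images through the frozen base once to get "bottleneck" features.
# Random arrays stand in for a real dataset here.
x_train = np.random.rand(8, 64, 64, 3).astype("float32")
y_train = np.random.randint(0, 10, size=(8,))
features = base.predict(x_train, verbose=0)

# The small fully-connected top model is trained on those features alone,
# so its weights are already sensible before fine-tuning ever touches the base.
top = models.Sequential([
    layers.Input(shape=(features.shape[1],)),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])
top.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
top.fit(features, y_train, epochs=1, verbose=0)
```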

So the answer is: first train the top model separately; then create a new model consisting of the ResNet50 base model (with its pretrained weights) and the trained top model stacked on top of it; then fine-tune that combined model, first with the base model (ResNet50) fully frozen, and afterwards with the last layers of the base model unfrozen.
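The two-stage schedule above might look like the following sketch. The class count (10), the input size, and weights=None are assumptions to keep it self-contained (use weights="imagenet" in practice), and "last layers of the base" is interpreted here as ResNet50's final convolutional block, whose layer names start with "conv5".

```python
import tensorflow as tf
from tensorflow.keras.applications import ResNet50
from tensorflow.keras import layers, models

# Base model; assumption: weights=None for an offline sketch,
# weights="imagenet" in the real workflow.
base = ResNet50(include_top=False, weights=None,
                input_shape=(64, 64, 3), pooling="avg")

# Top classifier; in the real workflow it would already carry the
# weights obtained by training it separately on bottleneck features.
top = models.Sequential([
    layers.Input(shape=(base.output_shape[-1],)),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])

model = models.Sequential([base, top])

# Stage 1: the whole base is frozen; only the top classifier trains.
base.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy")
# model.fit(...)  # train until the top classifier converges

# Stage 2: unfreeze only the last convolutional block ("conv5" layers)
# and continue with a much lower learning rate, so large gradient updates
# do not wreck the learned weights in the rest of the base.
base.trainable = True
for layer in base.layers:
    layer.trainable = layer.name.startswith("conv5")
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy")
# model.fit(...)  # fine-tune
```

Recompiling after changing `trainable` matters: Keras bakes the trainable state into the training function at compile time.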
