Updated: 2023-11-25 19:15:40
It seems to be a regression problem. One thing I noticed is that your last layer still has a ReLU activation. I would recommend removing the ReLU from the last layer: ReLU clips every negative pre-activation to zero, so the network can never predict negative target values, and gradients vanish for any output driven below zero. For regression, the final layer is usually left linear.
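A minimal sketch of the effect, using NumPy with made-up pre-activation values (the array below is purely illustrative, not from the original model):

```python
import numpy as np

# Hypothetical pre-activations of the final layer for a regression network.
pre_activations = np.array([-3.2, 0.0, 1.7, -0.5])

# With a ReLU on the last layer, every negative value is clipped to zero,
# so the model can never output a negative prediction.
with_relu = np.maximum(pre_activations, 0.0)

# With a linear (no-activation) last layer, the raw values pass through,
# covering the full real-valued range a regression target may need.
without_relu = pre_activations

print(with_relu)     # negatives clipped to 0
print(without_relu)  # full range preserved
```

If the targets are known to be non-negative (e.g. prices or counts), keeping a ReLU or softplus on the output can be a deliberate choice, but for general regression a linear output is the default.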