loss nan #2
Comments
@tak-s As for the performance, the possible reason is:
Any suggestion from your side is also welcome.
Did you obtain the VOC dataset with get_voc.sh when you got mAP = 0.23? I would like to start by matching your training results.
@tak-s
But I checked my result, and I believe I only ran 20 epochs for that result due to my limited computation power.
Hello author, thank you for the information you provided, but when I ran train.py on the COCO dataset the loss still became NaN. Do you have any suggestions?
Hi, if the loss still becomes NaN, I suggest going through the common debugging steps below.
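The original checklist from that comment is not preserved here. As a hedged sketch only, typical NaN-loss checks for a TF2 training loop like this one are lowering the learning rate, clipping gradients, and asserting that the loss stays finite; the names `model`, `loss_fn`, `images`, and `labels` below are placeholders, not this repo's actual API:

```python
import tensorflow as tf

# Hedged sketch of common NaN-loss checks; `model`, `loss_fn`, `images`,
# and `labels` are illustrative placeholders, not this repo's API.

# 1. Use a smaller learning rate and clip gradients by global norm.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-4, clipnorm=10.0)

@tf.function
def train_step(model, loss_fn, images, labels):
    with tf.GradientTape() as tape:
        preds = model(images, training=True)
        loss = loss_fn(labels, preds)

    # 2. Fail fast if the loss has already become NaN or Inf.
    tf.debugging.assert_all_finite(loss, "loss became NaN or Inf")

    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```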
It feels like the code has a bug; does the author plan to fix it? @LongxingTan
Thanks to the author for providing a good TensorFlow YOLOv5 implementation. There is a mistake in the IoU calculation that leads to an incorrect IoU loss and a NaN loss value. I filed PR #4 that fixes this issue.
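For context, a common way such an IoU bug produces NaN is a zero or negative union area in the denominator. The sketch below is a generic, numerically guarded IoU for `[x1, y1, x2, y2]` boxes; it is illustrative only and may differ from the actual fix in PR #4:

```python
import tensorflow as tf

def safe_iou(boxes1, boxes2, eps=1e-9):
    """Generic IoU for [x1, y1, x2, y2] boxes with a small epsilon so a
    zero union cannot divide to NaN. Illustrative only; the real fix in
    PR #4 may differ."""
    x1 = tf.maximum(boxes1[..., 0], boxes2[..., 0])
    y1 = tf.maximum(boxes1[..., 1], boxes2[..., 1])
    x2 = tf.minimum(boxes1[..., 2], boxes2[..., 2])
    y2 = tf.minimum(boxes1[..., 3], boxes2[..., 3])

    # Clamp to zero so non-overlapping boxes do not yield negative areas.
    intersection = tf.maximum(x2 - x1, 0.0) * tf.maximum(y2 - y1, 0.0)
    area1 = (boxes1[..., 2] - boxes1[..., 0]) * (boxes1[..., 3] - boxes1[..., 1])
    area2 = (boxes2[..., 2] - boxes2[..., 0]) * (boxes2[..., 3] - boxes2[..., 1])
    union = area1 + area2 - intersection

    return intersection / (union + eps)
```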
Thank you for providing a useful repository.
I ran this training code on TF 2.4.
After 5k iterations, the loss is NaN...
Could you please share the training parameters and results you used for this repo?
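Not part of the original report, but as a hedged aside: if training runs through Keras `model.fit`, a `TerminateOnNaN` callback makes it easier to pinpoint the iteration where the loss blows up. The `model` and `train_dataset` names below are placeholders, not this repo's actual objects:

```python
import tensorflow as tf

# Placeholders: `model` and `train_dataset` stand in for this repo's
# actual training objects.
callbacks = [
    tf.keras.callbacks.TerminateOnNaN(),             # stop as soon as loss is NaN
    tf.keras.callbacks.TensorBoard(log_dir="logs"),  # inspect loss curves later
]

model.fit(train_dataset, epochs=30, callbacks=callbacks)
```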