Evaluation results and important update #49
fpetric announced in Announcements
Following the second round of submissions, these are the current results for FBM1, FBM2, and the simulation.
More detailed feedback will be sent to team leaders via email (along with the team code). Team codes are not related to the list on the ICUAS webpage.
Most of the solutions do not build or work out of the box. We extend our special thanks to the teams whose solutions do. To the others, we kindly request that your Docker images, Dockerfiles, and code build and run cleanly.
For FBM2, make sure you provide the results in the appropriate format and store them according to the instructions. Otherwise, your score will be N/A.
For the simulation, publish only the annotated images to the /red/crack_image_annotated topic. As its name suggests, this topic is meant to receive only crack annotations, not a constant stream from your detector regardless of whether anything was detected. We do not have the time to sort through thousands of images in search of a couple of detections. We plan to use the time and the number of correct/false detections for the simulation points, but since there aren't many successful attempts, I am inclined to postpone that decision. The guidelines are the same as before: speed and correct detections without false positives will earn you most of the points. A minimal publishing sketch is shown below.
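A minimal rospy sketch of the intended behavior: run your detector on each incoming frame and publish to /red/crack_image_annotated only when a crack is actually found. The input camera topic and the `detect_and_annotate` stub are placeholders, not part of the challenge interface; adapt them to your own pipeline.

```python
#!/usr/bin/env python
# Sketch: publish annotated images only on actual detections,
# never the raw camera stream.
import rospy
from sensor_msgs.msg import Image

class AnnotatedCrackPublisher(object):
    def __init__(self):
        self.pub = rospy.Publisher("/red/crack_image_annotated", Image, queue_size=1)
        # Input topic is a placeholder -- use your camera topic.
        self.sub = rospy.Subscriber("/red/camera/color/image_raw", Image,
                                    self.image_cb, queue_size=1)

    def detect_and_annotate(self, img_msg):
        # Placeholder: run your detector here and return an annotated
        # Image message if a crack was found, otherwise None.
        return None

    def image_cb(self, img_msg):
        annotated = self.detect_and_annotate(img_msg)
        if annotated is not None:        # publish only confirmed detections
            self.pub.publish(annotated)

if __name__ == "__main__":
    rospy.init_node("crack_annotation_publisher")
    AnnotatedCrackPublisher()
    rospy.spin()
```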
The time limit is currently set to 10 minutes. Make sure to send the UAV back to the start position (which is different from the one provided with the repo): record your starting position at the start of the challenge and send the UAV back when you are finished.
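A minimal sketch of the record-and-return idea, assuming hypothetical topic names (/red/odometry for odometry, /red/tracker/input_pose for pose commands); these are not specified by the challenge, so substitute whatever your stack actually uses.

```python
#!/usr/bin/env python
# Sketch: latch the first odometry message as the start position and
# command the UAV back to it when the mission is done.
import rospy
from nav_msgs.msg import Odometry
from geometry_msgs.msg import PoseStamped

class ReturnToStart(object):
    def __init__(self):
        self.start_pose = None
        # Assumed pose-command topic -- replace with your tracker input.
        self.pose_pub = rospy.Publisher("/red/tracker/input_pose",
                                        PoseStamped, queue_size=1, latch=True)
        # Assumed odometry topic.
        rospy.Subscriber("/red/odometry", Odometry, self.odom_cb)

    def odom_cb(self, msg):
        # Remember the very first pose seen after the challenge starts.
        if self.start_pose is None:
            self.start_pose = PoseStamped()
            self.start_pose.header.frame_id = msg.header.frame_id
            self.start_pose.pose = msg.pose.pose

    def go_home(self):
        # Call this once inspection is finished, before the time limit.
        if self.start_pose is not None:
            self.start_pose.header.stamp = rospy.Time.now()
            self.pose_pub.publish(self.start_pose)
```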