Solutions and team information - Upload instructions #39
fpetric announced in Announcements
-
Hello, for the Simulation Benchmark, can we also submit a Docker image via DockerHub that you can run via ./startup/challenge/start.sh? I am not sure how you will build the packages: 1. on your host machine, or 2. in the simulation Docker container?
-
When will the results be announced?
Uploading team info and your solutions
Dear Teams,
We have sent emails to team leaders containing links to the shared folders that we will use to process your team info and your solutions.
Important: If your team leader has not received a link, or you received a wrong link, let us know as soon as possible by sending an email to the competition email address. The folder name should correspond to your team's name, without spaces and special characters (such as commas, ampersands, and similar).

Uploading team info
Each folder contains a subfolder named team_members. This folder is reserved for a text file containing the list of team members, their emails, and their institution. Short instructions are inside the folder.

Uploading solutions
For each of the benchmarks, we have prepared a separate folder in the shared folder assigned to your team. The submission procedure differs depending on the benchmark, so please read carefully.
FBM1 and FBM2 (detection and pose estimation)
To submit a solution for FBM1 and FBM2, use a dedicated folder. You can submit your solution in two ways:
Option 1: A Dockerfile and accompanying build and run scripts that build on top of the Docker image provided for the benchmarks, alongside any packages that you developed or need. Upload the solution as a .zip file named <team_name>_FBM_i_dockerfile.zip. For the team name, please use the name of the folder that was shared with you (team name without special characters). Replace i with the appropriate FBM number, or with '12' if the same Dockerfile applies to both FBMs.

Option 2: A Docker image via DockerHub. To submit a solution, upload a .txt file named <team_name>_FBM_i_dockerhub.txt with a link to your image on DockerHub to the corresponding folder. As in Option 1, please use the name of the folder that was shared with you (team name without special characters), and replace i with the appropriate FBM number, or with '12' if the uploaded image includes solutions to both FBMs.

Simulation benchmark
You need to upload a .zip file named <team_name>_docker.zip to the appropriate folder. Again, for the team name, please use the name of the folder that was shared with you (team name without special characters). The zip file should contain:
- your version of the icuas23_competition package (with your session.yml and any custom config files)
- an updated Dockerfile and build/run scripts that enable the download and build of all your code
- all private ROS packages that you developed that cannot be downloaded using the submitted Dockerfile

You must not include the uav_ros_simulation package and its dependencies in your .zip file. To increase the chance of your code working, please keep your submission as clean as possible. Never include build output from your machine in your submission.
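To keep build output out of the archive, the zip can be assembled programmatically rather than by hand. The sketch below is illustrative only: the `EXCLUDE_DIRS` set and the directory layout are assumptions, not part of the official instructions.

```python
import os
import zipfile

# Directory names commonly produced by local builds; adjust for your tree.
# This exclusion list is an assumption, not an official requirement.
EXCLUDE_DIRS = {"build", "devel", "log", ".git", "__pycache__"}

def make_submission_zip(src_dir, zip_path):
    """Zip src_dir recursively, skipping build-output directories."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(src_dir):
            # Prune excluded directories in place so os.walk never enters them.
            dirs[:] = [d for d in dirs if d not in EXCLUDE_DIRS]
            for name in files:
                full = os.path.join(root, name)
                arcname = os.path.relpath(full, os.path.dirname(src_dir))
                zf.write(full, arcname)
```

Listing the archive afterwards (e.g. `zipfile.ZipFile(path).namelist()`) is a quick way to confirm nothing from `build/` or `devel/` slipped in before uploading.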
Important: You must publish an annotated image of the crack detection as proof of detection (an annotation in the form of a bounding box around the crack is sufficient). The image should be published as a sensor_msgs/Image to the topic /red/crack_image_annotated. Your score for the simulation benchmark will depend on the time needed to visit all points of interest and the number of correctly identified cracks. The final scoring scheme for the simulation benchmark, including the time limit, will be set during the first evaluations.

1st evaluation run - deadline
The deadline for the first submission is Monday, April 3rd, 23:59 CEST. Submission is not mandatory. We expect to return the evaluation results by the end of the week.
Evaluation schedule
We will collect the solutions each Monday night (April 3rd, April 10th, April 17th, 23:59 CEST) and evaluate them as soon as possible. The schedule may change based on the results, the number of solutions, and the time required to evaluate; we will have more information after the first evaluation runs.
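The bounding-box annotation required for the simulation benchmark can start from a plain NumPy sketch like the one below. The function name, box coordinates, and color here are made up for illustration; converting the result to sensor_msgs/Image (e.g. via cv_bridge) and publishing it on /red/crack_image_annotated requires a ROS environment and is only hinted at in the comment.

```python
import numpy as np

def draw_bbox(img, x0, y0, x1, y1, color=(255, 0, 0), thickness=2):
    """Return a copy of an HxWx3 uint8 image with a rectangle outline drawn."""
    out = img.copy()
    out[y0:y0 + thickness, x0:x1] = color  # top edge
    out[y1 - thickness:y1, x0:x1] = color  # bottom edge
    out[y0:y1, x0:x0 + thickness] = color  # left edge
    out[y0:y1, x1 - thickness:x1] = color  # right edge
    return out

# In the competition stack the annotated frame would then be converted
# (e.g. with cv_bridge) to sensor_msgs/Image and published on
# /red/crack_image_annotated -- that step needs a running ROS node.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
annotated = draw_bbox(frame, 100, 120, 220, 260)
```

The copy keeps the original camera frame untouched, so the same frame can still feed the detector while the annotated version goes to the proof-of-detection topic.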