Submission Guide

Submission Workflow

Step by step
  1. Go to the relevant leaderboard page and use the built-in scorer on the website to check your CSV before submission (see Step 1 below).
  2. Clone or fork the repository and prepare exactly one submission file for the correct benchmark folder:
    In-distribution -> submissions/iid/team-name.csv
    Out-of-distribution -> submissions/ood/team-name.csv
    Zero-shot -> submissions/zero-shot/team-name.csv
  3. Open a pull request against main and fill in the metadata block completely so the leaderboard can display the method name, authors, affiliation, and paper or code links.
  4. Wait for the scoring workflow to validate the submission and post the preview score as a comment. After review and merge, the leaderboard workflow updates the published benchmark results.

Step 1: Score On The Website

Use the scorer available on the relevant leaderboard before opening a pull request. This lets you verify that the file format is correct and that the score looks reasonable.

Step 2: Clone Or Fork And Place The CSV

Clone or fork the repository, then add your CSV.

git clone git@github.com:accidentbench/accidentbench.github.io.git

Add exactly one CSV file under the matching folder.

In-distribution -> submissions/iid/
Out-of-distribution -> submissions/ood/
Zero-shot -> submissions/zero-shot/

Step 3: Add Metadata

Complete the PR metadata block so the leaderboard can show your method and reference links in a standardized way. You can review the required fields in the submission PR template.

  • Include: method name, authors, affiliation, benchmark, contact email, paper link, code link, and a short description.
  • Use: paper_url: N/A or code_url: N/A if a link is not public yet.
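As a quick self-check before opening the PR, the required fields above can be verified with a small sketch. It assumes the metadata block is plain `key: value` lines, as in the reference example at the end of this guide; the function name is illustrative, not part of any provided tooling.

```python
# Sketch: check that a PR metadata block contains the required fields.
# Assumes simple "key: value" lines, matching the PR template example.
REQUIRED = {
    "method_name", "authors", "affiliation", "benchmark",
    "contact_email", "paper_url", "code_url", "short_description",
}

def missing_fields(block: str) -> list[str]:
    """Return the required field names absent from the metadata block."""
    present = {
        line.split(":", 1)[0].strip()   # key before the first colon
        for line in block.splitlines()
        if ":" in line
    }
    return sorted(REQUIRED - present)
```

An empty result means every required field is present; otherwise the returned names are the ones still to fill in (remember that `N/A` is acceptable for `paper_url` and `code_url`).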

Step 4: Wait For Validation

Open the pull request against main and let the workflows handle the rest. The submission workflow checks the CSV and posts a score preview; after merge, the leaderboard pipeline can update the published results.

  • Before merge: the PR is validated and scored.
  • After merge: leaderboard data can be refreshed from the merged submission and metadata.

Reference examples

Example CSV Format

Your CSV must contain one row per evaluated clip and use the following columns.

path,accident_time,center_x,center_y,type
videos/Z4kg2Ev3vhk_00.mp4,14.9835,0.5,0.5,single
videos/unS0-TLF1ao_00.mp4,2.96,0.5,0.5,t-bone
videos/UarP8qU1S-c_00.mp4,7.6665,0.5,0.5,single
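A minimal sketch for checking this format before submission is shown below. It only verifies the header, the column count, and that the numeric columns parse as numbers; the function name and any stricter checks (e.g. value ranges) are assumptions, not part of the official scorer.

```python
import csv
import io

# Column order required by the submission format.
EXPECTED = ["path", "accident_time", "center_x", "center_y", "type"]

def check_submission(text: str) -> list[str]:
    """Return a list of problems found in a submission CSV (empty = ok)."""
    problems = []
    rows = list(csv.reader(io.StringIO(text)))
    if not rows or rows[0] != EXPECTED:
        problems.append("header must be " + ",".join(EXPECTED))
        return problems
    for i, row in enumerate(rows[1:], start=2):
        if len(row) != len(EXPECTED):
            problems.append(f"line {i}: expected {len(EXPECTED)} columns, got {len(row)}")
            continue
        _path, t, cx, cy, _type = row
        try:
            # accident_time, center_x, and center_y must be numeric
            float(t), float(cx), float(cy)
        except ValueError:
            problems.append(f"line {i}: accident_time/center_x/center_y must be numeric")
    return problems
```

Running it on the example rows above should report no problems; the website scorer (Step 1) remains the authoritative check.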

Example PR Metadata

Use the PR template and complete the metadata block so the leaderboard can display your submission consistently.

method_name: Molmo-7B End-to-End
authors: Jane Doe, John Smith
affiliation: Example Lab, Example University
paper_url: https://arxiv.org/abs/2604.09819
code_url: https://github.com/example/accident-method
contact_email: jane.doe@example.edu
benchmark: zero-shot
short_description: End-to-end zero-shot accident understanding baseline using prompt-based temporal, spatial, and collision-type reasoning.