This dataset was introduced in [MapEval: A Map-Based Evaluation of Geo-Spatial Reasoning in Foundation Models](https://arxiv.org/abs/2501.00316).
## Example

![Image](example.jpg)

### Query

I am presently visiting Mount Royal Park. Could you please inform me about the nearby historical landmark?

### Options

1. Circle Stone
2. Secret pool
3. Maison William Caldwell Cottingham
4. Poste de cavalerie du Service de police de la Ville de Montreal

### Correct Option

1. Circle Stone
## Prerequisite

Download [Vdata.zip](https://huggingface.co/datasets/MapEval/MapEval-Visual/resolve/main/Vdata.zip?download=true) and extract it into your working directory. This directory contains all of the dataset's images.
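The download step can also be scripted with the `huggingface_hub` client instead of fetching the archive by hand. This is a sketch, not part of the dataset card: it assumes `huggingface_hub` is installed and that there is enough disk space for the archive; `fetch_vdata` is a hypothetical helper name.

```python
import zipfile
from huggingface_hub import hf_hub_download


def fetch_vdata(target_dir: str = ".") -> str:
    """Download Vdata.zip from the dataset repo and extract it into target_dir."""
    # Resolves and caches the file from the MapEval/MapEval-Visual dataset repo.
    zip_path = hf_hub_download(
        repo_id="MapEval/MapEval-Visual",
        filename="Vdata.zip",
        repo_type="dataset",
    )
    # Extract the images next to your code so relative paths resolve.
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target_dir)
    return zip_path


if __name__ == "__main__":
    fetch_vdata()
```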
 
## Usage

```python
from datasets import load_dataset
import PIL.Image

# Load the dataset from the Hugging Face Hub
ds = load_dataset("MapEval/MapEval-Visual")

for item in ds["test"]:
    # The original snippet was truncated in the diff; the field names below
    # are assumptions. Inspect ds["test"].features for the exact schema.
    prompt = item["question"]              # assumed: the query text
    img = PIL.Image.open(item["context"])  # assumed: a path into Vdata/
    print([prompt, img]) # Replace with your processing logic
```

## Leaderboard

| Model | Overall | Place Info | Nearby | Routing | Counting | Unanswerable |
|---------------------------|:-------:|:----------:|:------:|:-------:|:--------:|:------------:|
| Llava-1.5-7B-hf | 20.05 | 22.31 | 18.89 | 13.75 | 28.41 | 0.00 |
| Human | 82.23 | 81.67 | 82.42 | 85.18 | 78.41 | 65.00 |

## Citation

If you use this dataset, please cite the original paper:
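A minimal BibTeX sketch built from the arXiv link above; the citation key is arbitrary, and the author list is omitted here, so take the complete entry from the arXiv page:

```bibtex
@article{mapeval2025,
  title   = {MapEval: A Map-Based Evaluation of Geo-Spatial Reasoning in Foundation Models},
  journal = {arXiv preprint arXiv:2501.00316},
  year    = {2025}
}
```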