bhavitvyamalik committed • 306db2e
1 Parent(s): 68fa3ca
add bias, new images
- misc/examples/female_biker_resized.jpg (ADDED)
- misc/examples/female_dev_1.jpg (ADDED)
- misc/examples/female_dev_2.jpg (ADDED)
- misc/examples/female_dev_4.jpg (ADDED)
- misc/examples/female_doctor.jpg (ADDED)
- misc/examples/female_doctor_1.jpg (ADDED)
- misc/examples/women_cricket.jpg (ADDED)
- sections/bias.md (ADDED)
@@ -0,0 +1,4 @@
+## Bias Analysis
+Due to the gender bias present in the data, gender identification by an image captioning model suffers. The gender-activity bias, owing to word-by-word prediction, also influences the other words in the predicted caption, resulting in the well-known problem of label bias.
+
+One of the reasons we chose Conceptual 12M over the COCO captioning dataset for training our multilingual image captioning model was that in the former, all named entities of type Person were substituted with a special `<PERSON>` token. As a result, gendered terms in our captions became quite infrequent. Below we present a few captions from our model to analyse how it performs on images for which pre-trained image captioning models usually exhibit gender-prediction biases.
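The Person-entity substitution described in the added section can be sketched as follows. The `anonymize_persons` helper and its span-based input format are illustrative assumptions for this sketch, not the actual Conceptual 12M preprocessing code:

```python
def anonymize_persons(caption, entities):
    """Replace each named entity labelled PERSON with the <PERSON> token.

    `entities` is a list of (start, end, label) character spans, e.g. as
    produced by an off-the-shelf NER model (this input format is an
    assumption made for the sketch).
    """
    out, last = [], 0
    for start, end, label in sorted(entities):
        # Skip overlapping spans and non-Person entities.
        if label == "PERSON" and start >= last:
            out.append(caption[last:start])
            out.append("<PERSON>")
            last = end
    out.append(caption[last:])
    return "".join(out)

print(anonymize_persons("Serena Williams serves during the final match",
                        [(0, 15, "PERSON")]))
# → <PERSON> serves during the final match
```

Because the substitution happens before training, the model rarely sees gendered given names, which is what makes gendered terms infrequent in the generated captions.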