const english_data = {
"toolName": "EDIA: Stereotypes and Discrimination in Artificial Intelligence",
"toolSubname": "Stereotypes and Discrimination in Artificial Intelligence",
"introduction_1": "Language models and word representations obtained with machine learning contain discriminatory stereotypes. Here we present the EDIA project (Stereotypes and Discrimination in Artificial Intelligence). This project aimed to design and evaluate a methodology that allows social scientists and domain experts in Latin America to explore biases and discriminatory stereotypes present in word embeddings (WE) and language models (LM). It also allowed them to define the type of bias to explore and do an intersectional analysis using two binary dimensions (for example, <i>female-male</i> intersected with <i>fat-skinny</i>).",
"introduction_2": "EDIA contains several functions that serve to detect and inspect biases in natural language processing systems based on language models or word embeddings. We have models in Spanish and English to work with and explore biases in different languages ​​at the user's request. Each of the following spaces contains different functions that bring us closer to a particular aspect of the problem of bias and they allow us to understand different but complementary parts of it.",
"wordBias": {
"title": "Biases in words",
"description": "Based on a technique to detect biases in WE, this function allows us to visualize the distribution of words in 2D space and thus observe the distance between them. The more occurrence contexts they share, the closer they will be, and the fewer occurrence contexts they share, the further they will be. This usually makes words with a similar meaning appear close. From the creation of word lists to define semantic fields, we will be able to observe biases and explore neighboring words between those meanings.",
"tutorial": "Tutorial: Word lists exploration",
"manual-1": "Handbook:<br>Word exploration",
"manual-2":"Handbook:<br>Word bias"
},
"phraseBias": {
"title": "Sentence biases",
"description": "Here we deploy a tool that uses language models to reveal sentence biases, allowing us to work with non-binary biases (such as female-male) and avoid ambiguities (product of polysemy). From sentences where one of them contains <i>a) stereotype</i> and the other <i>b) anti-stereotype</i> (example: <i>a)</i> <i>Homosexual</i> couples should not be allowed to get married, <i>b)</i> <i>Heterosexual</i> couples should not be allowed to get married). We seek to define the preferences of a pre-trained language model when producing language. If the model were unbiased, both would have the same level of preference, but if the model were biased, one would have a higher preference.",
"tutorial-1": "Tutorial:<br>Phrase bias",
"manual-1": "Handbook:<br>Phrase bias",
"tutorial-2": "Tutorial:<br>Crows - Pairs",
"manual-2": "Handbook:<br>Crows - Pairs"
},
"dataBias": {
"title": "Data",
"description": "This tool shows additional information about the word, such as frequency and context of occurrence within the training corpus used to explain and interpret unexpected behaviors in other tabs as a result of polysemy or the infrequency of words, and from this exploration, to be able to make pertinent modifications in our lists of words and phrases.",
"tutorial": "Tutorial: Words data",
"manual": "Handbook"
},
"our-pages-title": "Where you can find EDIA:",
"footer": "IMPORTANT CLARIFICATIONS: Queries made when using this software are automatically registered in our system. We declare that the information collected is anonymous, confidential and that it will only be used for research purposes. <i>To perform the explorations of the dimensions of analysis, such as gender, we need to simplify it to a binary phenomenon; We understand that it is an oversimplification, it is a first approximation and we are aware of this limitation while representing complex phenomena within social constructs.</i>",
"hf_btn": "Try it out in HuggingFace🤗!",
"ccad_btn": "Try it out in CCAD!",
"tutorial_btn": "EDIA video presentation"
};
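
// The sketch below is illustrative only; it is not part of the EDIA interface
// strings nor of its actual implementation. With hand-made placeholder vectors,
// it shows the kind of comparison behind the "Biases in words" space: words
// that share more occurrence contexts end up closer in the embedding space,
// here measured with cosine similarity.
const cosineSimilarity = (a, b) => {
  const dot = a.reduce((sum, ai, i) => sum + ai * b[i], 0);
  const norm = (v) => Math.sqrt(v.reduce((sum, vi) => sum + vi * vi, 0));
  return dot / (norm(a) * norm(b));
};

// Toy vectors (hypothetical, not real EDIA embeddings): "nurse" lies closer to
// the female pole than to the male pole, the kind of asymmetry the 2D plot
// makes visible.
const toyEmbeddings = {
  female: [0.9, 0.1, 0.2],
  male: [0.1, 0.9, 0.2],
  nurse: [0.8, 0.2, 0.3],
};
console.log(cosineSimilarity(toyEmbeddings.nurse, toyEmbeddings.female)); // ≈ 0.98
console.log(cosineSimilarity(toyEmbeddings.nurse, toyEmbeddings.male));   // ≈ 0.39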
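
// Likewise illustrative only: the "Sentence biases" space compares how strongly
// a language model prefers a stereotyped sentence over its anti-stereotyped
// counterpart. The scores are assumed to come from some external scoring
// function (e.g. a pseudo-log-likelihood), which is not implemented here.
const compareSentencePair = (stereotypeScore, antiStereotypeScore) => {
  // An unbiased model would assign roughly equal scores to both sentences;
  // a positive gap means the stereotyped sentence is preferred.
  const gap = stereotypeScore - antiStereotypeScore;
  return { preferred: gap > 0 ? "stereotype" : "anti-stereotype", gap };
};

// Hypothetical scores for the example pair in the description above: the
// stereotyped sentence gets the higher (less negative) score, so it is preferred.
console.log(compareSentencePair(-12.3, -15.8));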