suyash2102 committed on
Commit 8047ea5
1 Parent(s): 9851a31

Update README.md

Files changed (1)
  1. README.md +8 -6
README.md CHANGED
@@ -13,25 +13,27 @@ model-index:
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
+# Introduction
+
+I have built an interactive Gradio app that translates English sentences into French. To do this, I fine-tuned a pre-trained model from Hugging Face.
 
 # model-en-to-fr
 
 This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
+I have used this model to translate English sentences into French.
 
 ## Model description
 
-More information needed
-
-## Intended uses & limitations
-
-More information needed
+I have used the built-in features of the Transformers library to build this model. The model is loaded with AutoModelForSeq2SeqLM, and I tokenized the dataset according to the pre-trained model's tokenizer.
 
 ## Training and evaluation data
 
-More information needed
+I have used the SacreBLEU metric to evaluate the model, as it is widely used in machine translation. It compares the words shared by the predicted and reference translations and reports a score for how close they are.
 
 ## Training procedure
 
+I have used the Seq2SeqTrainer class to train the pre-trained model on the dataset. The specific parameters I used are given below.
+
 ### Training hyperparameters
 
 The following hyperparameters were used during training: