mllmTeam committed (verified)
Commit 4871958 · Parent(s): 7ca7943

Update README.md
Files changed (1): README.md (+9 -1)
README.md CHANGED

@@ -1,3 +1,11 @@
+---
+license: apache-2.0
+datasets:
+- mlfoundations/dclm-baseline-1.0
+language:
+- en
+library_name: transformers
+---
 PhoneLM-0.5B is a 0.5 billion parameter decoder-only language model pre-trained on 1.1 trillion tokens.
 
 ## Usage
@@ -60,4 +68,4 @@ The training dataset PhoneLM used is comprised of a filtered mixture of open-sou
 | Cerebras-GPT-590M | 32.3 | 49.8 | 62.8 | 68.2 | 59.2 | 41.2 | 23.5 | 48.14 |
 
 ## LICENSE
-* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-0.5B/LICENSE) License.
+* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-0.5B/LICENSE) License.
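
The added front matter declares `library_name: transformers`, so the model is intended to be loaded through the Transformers library. The repository's actual `## Usage` section is elided from this diff, so the following is only a minimal sketch using the standard `AutoTokenizer`/`AutoModelForCausalLM` entry points; `trust_remote_code=True` is an assumption in case the checkpoint ships custom modeling code.

```python
# Hypothetical usage sketch, not the repository's documented example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mllmTeam/PhoneLM-0.5B"

# trust_remote_code=True is an assumption; drop it if the architecture
# is supported natively by transformers.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation from a prompt.
prompt = "Machine learning is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Refer to the repository's own Usage section for the authoritative loading instructions.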