gugarosa mojanjp committed on
Commit 1dc35eb
1 Parent(s): 41217aa

Update README.md (#69)


- Update README.md (8584061b4d9f189aea26e170cb1c285a22fe731d)


Co-authored-by: Mojan Javaheripi <[email protected]>

Files changed (1)
  1. README.md +4 -4
README.md CHANGED

@@ -1,7 +1,7 @@
 ---
 inference: false
 license: other
-license_name: microsoft-research-license
+license_name: mit
 license_link: https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx
 language:
 - en
@@ -76,8 +76,8 @@ def print_prime(n):
 where the model generates the text after the comments.
 
 **Notes:**
-* Phi-1.5 is intended for research purposes. The model-generated text/code should be treated as a starting point rather than a definitive solution for potential use cases. Users should be cautious when employing these models in their applications.
-* Direct adoption for production tasks is out of the scope of this research project. As a result, Phi-1.5 has not been tested to ensure that it performs adequately for any production-level application. Please refer to the limitation sections of this document for more details.
+* Phi-1.5-generated text/code should be treated as a starting point rather than a definitive solution for potential use cases. Users should be cautious when employing these models in their applications.
+* Phi-1.5 has not been tested to ensure that it performs adequately for any production-level application. Please refer to the limitation sections of this document for more details.
 * If you are using `transformers>=4.36.0`, always load the model with `trust_remote_code=True` to prevent side-effects.
 
 ## Sample Code
@@ -151,7 +151,7 @@ Furthermore, in the forward pass of the model, we currently do not support outpu
 * [Flash-Attention](https://github.com/HazyResearch/flash-attention)
 
 ### License
-The model is licensed under the [Research License](https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx).
+The model is licensed under the [MIT License](https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx).
 
 ### Citation
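For reference, a minimal sketch of the loading pattern that the updated note describes: with `transformers>=4.36.0`, the model should be loaded with `trust_remote_code=True`. The prompt and generation settings below are illustrative assumptions and are not part of this commit.

```python
# Illustrative sketch only: the model id comes from the diff above;
# the prompt and max_length are assumptions for demonstration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-1_5"

# Per the updated note: with transformers>=4.36.0, always pass
# trust_remote_code=True to avoid side-effects.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = 'def print_prime(n):\n   """\n   Print all primes between 1 and n\n   """'
inputs = tokenizer(prompt, return_tensors="pt", return_attention_mask=False)

# The model continues the code after the docstring.
outputs = model.generate(**inputs, max_length=200)
print(tokenizer.batch_decode(outputs)[0])
```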