Move sentence from inside code block to outside
README.md CHANGED

@@ -12,9 +12,10 @@ GPT-J 162M was pre-trained on the Pile, a large-scale curated dataset created by
 Architext models learn an inner representation of the architectural design that can be used to generate a larger diversity of geometric designs and can be useful for many downstream design workflows and tasks. While it could be adapted to many different design outputs, the model is best at generating residential floor plans given a natural language prompt.
 
 # How to use
-```python
+
 This model can be easily loaded using the AutoModelForCausalLM functionality:
 
+```python
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
 tokenizer = AutoTokenizer.from_pretrained("architext/gptj-162M")
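The hunk ends at the tokenizer line, so the rest of the snippet is not visible in this diff. As a minimal sketch of the flow the README is describing, assuming the standard `transformers` API; the prompt text and sampling settings below are illustrative assumptions, not part of the diff:

```python
# Minimal sketch based on the snippet shown in the diff.
# The prompt and generation settings are illustrative assumptions,
# not taken from the README; check the model card for the exact
# prompting convention Architext expects.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("architext/gptj-162M")
model = AutoModelForCausalLM.from_pretrained("architext/gptj-162M")

prompt = "a house with two bedrooms and one bathroom"  # assumed example prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```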