HamidRezaAttar committed
Commit: ee9390c
Parent(s): 691a62f
UPDATE README.md

README.md CHANGED

---
language: en
tags:
- text-generation
license: apache-2.0
widget:
- text: "Maximize your bedroom space without sacrificing style with the storage bed."
- text: "Handcrafted of solid acacia in weathered gray, our round Jozy drop-leaf dining table is a space-saver."
- text: "Our plush and luxurious Emmett modular sofa brings custom comfort to your living space."
---

## HomeGPT2

This model is GPT-2 fine-tuned on Amazon product metadata.
It generates descriptions for your **home** products from a text prompt.

### Model description

[GPT-2](https://openai.com/blog/better-language-models/) is a large [transformer](https://arxiv.org/abs/1706.03762)-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text. The diversity of the dataset causes this simple goal to contain naturally occurring demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.

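As a quick illustration of that next-word objective (a minimal sketch, not code from this repository; the example sentence and the small public `gpt2` checkpoint are assumptions), passing `labels=input_ids` to a causal language model in `transformers` returns the cross-entropy of predicting each token from the ones before it:

```python
# Minimal sketch of the causal LM objective (illustrative only).
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The storage bed keeps your bedroom tidy.", return_tensors="pt")
# With labels == input_ids, the model shifts the targets one position internally,
# so the loss is exactly "predict the next word given all previous words".
outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss)  # average next-token cross-entropy
```
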
### How to use

For the best experience and clean outputs, please use the notebook in my [GitHub](https://github.com/HamidRezaAttar/gpt-2-home-product-description-generation) repository.

You can also use this model directly with a pipeline for text generation:
```python
>>> from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

>>> tokenizer = AutoTokenizer.from_pretrained("HamidRezaAttar/gpt2-product-description-generator")
>>> model = AutoModelForCausalLM.from_pretrained("HamidRezaAttar/gpt2-product-description-generator")

>>> generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
>>> # max_length is a generation argument, so pass it at call time
>>> # rather than through the pipeline's config parameter.
>>> generated_text = generator("This bed is very comfortable.", max_length=100)
```
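
If you want more control over decoding than the pipeline defaults, a sampling sketch is below, continuing from the snippet above; the `do_sample`/`top_k`/`top_p` values are illustrative assumptions, not settings from the original notebook:

```python
# Sampling sketch (illustrative values, not from the original notebook).
inputs = tokenizer("This bed is very comfortable.", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,  # sample instead of greedy decoding for more varied text
    top_k=50,        # restrict sampling to the 50 most likely next tokens
    top_p=0.95,      # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token of its own
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```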

### Citation info

```bibtex
@misc{HomeGPT2,
  author = {HamidReza Fatollah Zadeh Attar},
  title = {HomeGPT2 the English product description generator},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/HamidRezaAttar/gpt-2-home-product-description-generation}},
}
```