---
base_model: Qwen/Qwen2.5-7B-Instruct
language:
- zh
license: apache-2.0
license_link: https://huggingface.co./Qwen/Qwen2.5-7B-Instruct/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- pytorch
- Qwen
- Qwen2.5
- ContaLLM
- ContaAI
library_name: transformers
---
<img src="https://conta-ai-image.oss-cn-shanghai.aliyuncs.com/contaai/logo2.png" alt="ContaLLM" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# ContaLLM-Fashion-7B-Instruct
ContaLLM-Fashion-7B-Instruct is a large Chinese vertical-domain marketing model focused on the fashion industry. It generates customized marketing texts from a user's specific marketing needs: brand, product selection, content type, article length, topic, selling points, hashtags, scene, and so on. It leverages the LLM's capabilities and training on existing high-quality marketing materials to help companies generate diverse, high-quality marketing content and improve marketing conversion rates.
## Model description
- **Model type:** A model trained on a mix of publicly available, synthetic and human-annotated datasets.
- **Language(s) (NLP):** Primarily Chinese
- **Industry:** Fashion Makeup Industry Marketing
- **License:** apache-2.0
- **Finetuned from model:** Qwen/Qwen2.5-7B-Instruct
### Model Stage
| **Industry** | **Version** | **Qwen 2.5 7B** |
|--------------|-------------|------------------------------------------------------------------------------------------------------------|
| **Fashion** | **bf16** | [ContaAI/ContaLLM-Fashion-7B-Instruct](https://huggingface.co./ContaAI/ContaLLM-Fashion-7B-Instruct) |
| **Fashion** | **8bit** | [ContaAI/ContaLLM-Fashion-7B-Instruct-8bit](https://huggingface.co./ContaAI/ContaLLM-Fashion-7B-Instruct-8bit) |
| **Fashion** | **4bit** | [ContaAI/ContaLLM-Fashion-7B-Instruct-4bit](https://huggingface.co./ContaAI/ContaLLM-Fashion-7B-Instruct-4bit) |
## Using the model
### Loading with HuggingFace
To load the model with HuggingFace, use the following snippet:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("ContaAI/ContaLLM-Fashion-7B-Instruct")
tokenizer = AutoTokenizer.from_pretrained("ContaAI/ContaLLM-Fashion-7B-Instruct")
```
### System Prompt
The model is a Chinese fashion-marketing model, so we use this system prompt by default (roughly: "Please write a fashion-industry marketing post based on the marketing requirements and other information provided by the user"):
```python
system_prompt = '请根据用户提供的营销需求和其他信息写一篇时尚行业的营销推文。'
```
### User Prompt
Users enter their marketing needs (the only required field); the optional fields are brand, product selection, content type, topic, selling point, hashtag, scene, and content length. Content length has three settings: 较短 (shorter), 中等 (medium), and 较长 (longer). The details are as follows:
| Parameter name | Required | Meaning and optional range |
|-------------------|-----------------------|------------------------------------------------------------------------------------------------------|
| **营销需求** | required | Fill in your marketing requirements, cannot be blank |
| **品牌** | optional | Fill in your marketing brand, or remove this row from the prompt |
| **选品** | optional | Fill in your product selection, or remove this row from the prompt |
| **内容类型** | optional | Fill in the article type, or remove this row from the prompt |
| **内容长度** | optional | choices=['较长', '中等', '较短'], choose what you need, or remove this row from the prompt |
| **话题** | optional | Fill in your marketing topic, or remove this row from the prompt |
| **卖点** | optional | Fill in the selling point for your marketing needs, or remove this row from the prompt |
| **标签** | optional | Fill in the hashtag, or remove this row from the prompt |
| **场景** | optional | Fill in the scenes for your marketing needs, or remove this row from the prompt |
Example:
```python
user_prompt = """营销需求:秋冬大包包推荐
品牌:Celine
选品:CELINE托特包
内容类型:产品种草与测评
内容长度:较短
话题:CELINE托特包、秋冬大包包、托特包用途
卖点:慵懒设计、大容量、新款限定设计
标签:CELINE、托特包、新品
场景:日常通勤、妈咪包使用、秋冬搭配"""
```
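The optional rows can also be assembled programmatically. Below is a hypothetical helper (not part of any official ContaAI tooling; the field names follow the table above) that builds the user prompt from a dict and drops optional fields that are missing or empty:

```python
# Hypothetical helper: assemble the user prompt from a field dict,
# omitting optional fields, as the table above suggests ("remove this row").
FIELD_ORDER = ["营销需求", "品牌", "选品", "内容类型", "内容长度", "话题", "卖点", "标签", "场景"]

def build_user_prompt(fields):
    if not fields.get("营销需求"):
        raise ValueError("营销需求 (marketing requirement) is required and cannot be blank")
    lines = []
    for name in FIELD_ORDER:
        value = fields.get(name)
        if value:  # skip optional fields that are absent or empty
            lines.append(f"{name}:{value}")
    return "\n".join(lines)

user_prompt = build_user_prompt({
    "营销需求": "秋冬大包包推荐",
    "品牌": "Celine",
    "内容长度": "较短",
})
print(user_prompt)
```

Note that the helper uses the full-width colon (:) to match the formatting of the example prompt above.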
### Use example (with template)
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "ContaAI/ContaLLM-Fashion-7B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name)

system_prompt = '请根据用户提供的营销需求和其他信息写一篇时尚行业的营销推文。'

user_prompt = """营销需求:秋冬大包包推荐
品牌:Celine
选品:CELINE托特包
内容类型:产品种草与测评
内容长度:较短
话题:CELINE托特包、秋冬大包包、托特包用途
卖点:慵懒设计、大容量、新款限定设计
标签:CELINE、托特包、新品
场景:日常通勤、妈咪包使用、秋冬搭配"""

# ChatML-style template used by the Qwen 2.5 family
prompt_template = '''<|im_start|>system
{}<|im_end|>
<|im_start|>user
{}<|im_end|>
<|im_start|>assistant
'''
prompt = prompt_template.format(system_prompt, user_prompt)

tokenized_message = tokenizer(
    prompt,
    max_length=1024,
    truncation=True,
    return_tensors="pt",
    add_special_tokens=False,
).to(model.device)  # move the inputs to the same device as the model

response_token_ids = model.generate(
    **tokenized_message,
    max_new_tokens=1024,
    do_sample=True,
    top_p=1.0,
    temperature=0.5,
    top_k=50,
    repetition_penalty=1.2,
    use_cache=True,
)

# Decode only the newly generated tokens, skipping the prompt
generated_tokens = response_token_ids[0, tokenized_message['input_ids'].shape[-1]:]
generated_text = tokenizer.decode(generated_tokens, skip_special_tokens=True)
print(generated_text)
```
### Bias, Risks, and Limitations
The ContaLLM models applied safety techniques during data generation and training, but unlike ChatGPT they are not deployed with in-the-loop response filtering at inference time, so the model can produce problematic outputs (especially when prompted to do so).
The size and composition of the corpus used to train the base Qwen2.5 models are also unknown, though it likely included a mix of web data and technical sources such as books and code.
Use of the models is at your own risk. You may need to monitor the model's outputs and take appropriate action, such as content filtering, if necessary.
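As a minimal illustration of such post-hoc filtering, the sketch below checks generated text against a hand-maintained blocklist. The blocklist contents here are placeholders, and a production system would instead use a proper moderation classifier or service:

```python
# Minimal post-hoc filtering sketch (assumption: a hand-maintained blocklist;
# the terms below are hypothetical placeholders, not a real moderation list).
BLOCKLIST = ["违禁词示例"]

def passes_filter(text, blocklist=BLOCKLIST):
    """Return True if none of the blocked terms appear in the generated text."""
    return not any(term in text for term in blocklist)

text = "CELINE托特包,秋冬通勤必备"
if passes_filter(text):
    print(text)  # only surface outputs that pass the check
```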
## License and use
All Qwen 2.5 ContaAI models are released under Qwen's [Apache 2.0 license](https://huggingface.co./Qwen/Qwen2.5-7B-Instruct/blob/main/LICENSE).