AI-DrivenExploitGeneration / tokenizer.json
Create tokenizer.json
611e778 verified
from transformers import BertTokenizerFast

# Use the fast tokenizer class: its save_pretrained writes a tokenizer.json
# (the slow BertTokenizer only writes vocab.txt and the config files).
tokenizer = BertTokenizerFast.from_pretrained('bert-base-uncased')
tokenizer.save_pretrained('./hf_model')
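A quick way to verify the saved files is to reload the tokenizer from the output directory and round-trip a sentence through it. This is a minimal sketch, assuming network access to download `bert-base-uncased`; the sample string is a hypothetical placeholder.

```python
from transformers import AutoTokenizer

# Download the tokenizer, save it, then reload from disk to confirm
# the saved directory is usable on its own.
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
tokenizer.save_pretrained('./hf_model')

reloaded = AutoTokenizer.from_pretrained('./hf_model')
enc = reloaded('hello world')

# The encoding should be framed by the [CLS] and [SEP] special tokens,
# and decoding (minus special tokens) should recover the input text.
assert enc['input_ids'][0] == reloaded.cls_token_id
assert enc['input_ids'][-1] == reloaded.sep_token_id
print(reloaded.decode(enc['input_ids'], skip_special_tokens=True))
```

`AutoTokenizer` is used here so the reload works whether the directory was written by the fast or the slow tokenizer class.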