BERT for Patents

BERT for Patents is a model trained by Google on 100M+ patents (not just US patents). It is based on BERT-Large.

If you want to learn more about the model, check out the blog post, white paper and GitHub page containing the original TensorFlow checkpoint.
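As a quick illustration, here is a minimal sketch of how the checkpoint could be loaded with the Hugging Face `transformers` library and used to embed a patent snippet. The model ID comes from this page; the example text and the choice of the [CLS] embedding are illustrative, not prescribed by the original card.

```python
# A minimal usage sketch (assumptions: transformers + torch installed,
# the example claim text and the [CLS] pooling choice are illustrative only).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("anferico/bert-for-patents")
model = AutoModel.from_pretrained("anferico/bert-for-patents")
model.eval()

text = "A method for manufacturing a semiconductor device, comprising ..."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)

# One simple sentence-level representation: the [CLS] token embedding.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 1024]) for a BERT-Large-sized encoder
```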


Projects using this model (or variants of it):
