FinancialBERT is a BERT model pre-trained on a large corpus of financial texts. Its purpose is to advance NLP research and practice in the financial domain, so that financial practitioners and researchers can benefit from it without needing the significant computational resources required to train such a model from scratch.

The model was trained on a large corpus of financial texts:

  • TRC2-financial: 1.8M news articles that were published by Reuters between 2008 and 2010.
  • Bloomberg News: 400,000 articles between 2006 and 2013.
  • Corporate Reports: 192,000 documents (10-K & 10-Q filings).
  • Earnings Calls: 42,156 documents.

More details on FinancialBERT can be found at: https://www.researchgate.net/publication/358284785_FinancialBERT_-_A_Pretrained_Language_Model_for_Financial_Text_Mining
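
As a quick-start reference, the snippet below sketches how the checkpoint could be loaded with the Hugging Face Transformers library. The model identifier `ahmedrachid/FinancialBERT` is taken from this page; that the checkpoint exposes a masked-language-modeling head (standard for a pre-trained BERT) is an assumption rather than something stated above.

```python
from transformers import pipeline

# Minimal sketch: load FinancialBERT for masked-token prediction.
# Assumption: "ahmedrachid/FinancialBERT" is published on the Hub with an
# MLM head, as is typical for a pre-trained BERT checkpoint.
fill_mask = pipeline("fill-mask", model="ahmedrachid/FinancialBERT")

# Illustrative financial sentence with one masked token ([MASK] is the
# standard BERT mask token).
results = fill_mask("The company reported a quarterly [MASK] of $2.3 billion.")
for prediction in results:
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```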

Created by Ahmed Rachid Hazourli
