Update README.md
This repository accompanies the paper: Hechtner, F., Schmidt, L., Seebeck, A., & Weiß, M. (2025). How to design and employ specialized large language models for accounting and tax research: The example of TaxBERT.
TaxBERT is a domain-adapted RoBERTa model specifically designed to analyze qualitative corporate tax disclosures.
In the future, we will add the following features:
- Tax Sentence Recognition
- Tax Risk Sentiment
**SSRN**: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5146523
The paper provides an ‘A-to-Z’ description of how to design and employ specialized Bidirectional Encoder Representations from Transformers (BERT) models that are environmentally sustainable and practically feasible for accounting and tax researchers.