Relationship with dbmdz/distilbert-base-german-europeana-cased?
Is this model the same as https://huggingface.co./dbmdz/distilbert-base-german-europeana-cased?
Following up on @Marissa's question, I wondered if there is more info on the data that were used for the initial training of distilbert-base-german-cased?
Thanks!
cc'ing @stefan-it just in case :)
Is there any further information about the training of this model? It's hard to compare without it.
Hi guys!
More information can be found here:
https://github.com/huggingface/transformers/pull/1873
So there's no connection to the Europeana DistilBERT: this model was trained and distilled from the (DBMDZ) German BERT model, whereas the Europeana DistilBERT uses the Europeana BERT model as teacher (Europeana BERT was trained on historic German newspapers).
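For readers unfamiliar with what "distilled from a teacher" means here: distillation (as in the linked PR) trains the smaller student model to match the teacher's temperature-softened output distribution. A minimal sketch of the standard distillation loss (Hinton et al., 2015) in plain Python, with toy logits that are purely illustrative, not the actual training code:

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits: a higher temperature yields a flatter distribution.
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the teacher's softened distribution to the
    # student's, scaled by T^2 as in Hinton et al. (2015).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# Toy example: the closer the student's logits track the teacher's,
# the smaller the loss.
teacher = [2.0, 0.5, -1.0]
student = [1.8, 0.6, -0.9]
print(distillation_loss(student, teacher))
```

The student is trained to minimize this loss (usually combined with the regular masked-language-modeling loss), which is why the choice of teacher, here DBMDZ German BERT rather than Europeana BERT, determines what the resulting DistilBERT learns.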
@Marissa I'm sorry for the late reply!
Thanks so much, @stefan-it . That helps a lot! ☺️ Do you know whether it's possible to also update the model card with this relevant information (to guide future users)?