BioQwen: A Small-Parameter, High-Performance Bilingual Model for Biomedical Multi-Tasks
This repository contains the quantized weights for the 1.8B version of BioQwen, produced with the MLC-LLM project. When you download BioQwen.apk via this link, the app automatically fetches these files from this repository, so there is generally no need to download or use them separately.