This repository contains the full weights and LoRA weights for Zh-MT-LLM v1.0, fine-tuned from ChatGLM3-6b-base.

### Zh-MT-LLM

Zheng He Maritime Large Language Model (Zh-MT-LLM) is a vertical-domain maritime large language model developed by the Intelligent Technology Laboratory of Dalian Maritime University for practitioners, trainers, and students in the maritime field. It provides Q&A on maritime laws and regulations, maritime education and training, and maritime expertise. Corresponding to these three areas, the model has three main characteristics:

- Maritime Laws and Regulations Q&A: the model is trained on a wide range of maritime laws and regulations, providing consulting services for people in the maritime field.
- Maritime Education and Training: the model learns from maritime professional test questions, vocational examination syllabi, and high-quality crew Q&A to provide training knowledge.
- Maritime Expertise Q&A: the model covers ship maintenance, safety management, port operations, maritime logistics, navigation technology, marine environmental protection, and scientific research, answering questions for maritime industry practitioners.

### Zh-MT-SFT Dataset

The statistics of the dataset used for the training described above are as follows:
| Service | Subtask | Dataset | Data volume |
|---|---|---|---|
| Maritime Laws and Regulations Q&A | Maritime Legal Advice | CrimeKgAssitant | 18,279 |
| | | Zh-law-qa | 59,244 |
| | Court Views | Zh-law-court | 2,684 |
| | Sentence Prediction | Zh-law-predict | 3,004 |
| Maritime Education and Training | Maritime Education Counseling | Zh-edu-qa | 41,052 |
| | Maritime Specialization Question Bank | Zh-edu-qb | 23,531 |
| Maritime Expertise Q&A | Ship Knowledge | Zh-mt-qa | 46,759 |
| | Navigational Knowledge | | |
| | Port Knowledge | | |
| | Marine Knowledge | | |
| Generic Dialogue | | moss-003-sft-data | 300,000 |
| Total | | | 494,553 |
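
### Loading the Weights

Since this repository ships both full weights and a LoRA adapter built on ChatGLM3-6b-base, the sketch below shows the usual way such weights are loaded with `transformers` (and `peft` for the LoRA variant). The repository paths and the sample prompt are placeholders, not part of this release; adjust them to the actual file locations.

```python
# A minimal loading sketch, assuming the Hugging Face `transformers` and `peft`
# libraries. The paths below are placeholders -- substitute the actual locations
# of this repository's full weights and LoRA weights.
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

FULL_WEIGHTS = "path/to/Zh-MT-LLM-full"   # placeholder: full fine-tuned weights
LORA_ADAPTER = "path/to/Zh-MT-LLM-lora"   # placeholder: LoRA adapter weights

# Option 1: load the full weights directly (ChatGLM3 needs trust_remote_code).
tokenizer = AutoTokenizer.from_pretrained(FULL_WEIGHTS, trust_remote_code=True)
model = AutoModel.from_pretrained(FULL_WEIGHTS, trust_remote_code=True).half().cuda()

# Option 2: load ChatGLM3-6b-base and apply the LoRA adapter with peft.
# tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm3-6b-base", trust_remote_code=True)
# base = AutoModel.from_pretrained("THUDM/chatglm3-6b-base", trust_remote_code=True).half().cuda()
# model = PeftModel.from_pretrained(base, LORA_ADAPTER)

model = model.eval()

# ChatGLM3's remote modeling code exposes a chat() helper; the prompt is only an example.
response, history = model.chat(
    tokenizer,
    "What pre-departure safety checks are required for a cargo ship?",
    history=[],
)
print(response)
```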