# Open-Source Model Deployment: LoRA Fine-Tuning DeepSeek-R1-Distill-Qwen-7B with LLaMA-Factory on a Single Machine and Single V100 (Part 1)

*Published 2025-05-07 · https://www.wsisp.com/helps/35989.html*

## I. Preface

The large language model field is bustling these days, with new models emerging constantly. DeepSeek-R1-Distill-Qwen-7B has drawn the attention of many developers with its strong results and performance, and LLaMA-Factory, a powerful fine-tuning tool, makes it possible to adapt a model to personalized needs.

This article explores how to use LLaMA-Factory to fine-tune the DeepSeek-R1-Distill-Qwen-7B model, so that the model serves our own use cases better.

*(Screenshot: article overview image)*

---

## II. Terminology

### 2.1. LoRA fine-tuning

LoRA (Low-Rank Adaptation) is used for fine-tuning large language models (LLMs). It is an effective adaptation strategy: it introduces no additional inference latency, and it drastically reduces the number of trainable parameters for downstream tasks while preserving model quality.

### 2.2. Parameter-Efficient Fine-Tuning (PEFT)

PEFT fine-tunes only a small number of (extra) model parameters while freezing most parameters of the pretrained LLM, greatly reducing compute and storage costs.

### 2.3. LLaMA-Factory

An open-source framework for unified, efficient fine-tuning of large language models (despite the name, it supports far more than Meta's LLaMA family). It bundles data processing, model configuration, training monitoring, and related functionality, helping researchers and developers train, fine-tune, and deploy LLMs more efficiently.

The list of models supported by LLaMA-Factory:

*(Screenshots: supported-model tables from the LLaMA-Factory README)*
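To make the parameter savings concrete, here is a minimal counting sketch. A LoRA adapter of rank r on a `d_out × d_in` linear layer adds `r * (d_in + d_out)` trainable parameters. The layer shapes below are assumptions taken from the Qwen2-7B config printed later in this article's training log (hidden size 3584, intermediate size 18944, 4 KV heads of head dim 128, 28 layers):

```python
# Rough count of LoRA trainable parameters when adapting every linear
# projection of a Qwen2-7B-style decoder layer with rank r = 8.
RANK = 8
HIDDEN = 3584   # hidden_size
INTER = 18944   # intermediate_size
KV_DIM = 512    # num_key_value_heads (4) * head_dim (128)
LAYERS = 28     # num_hidden_layers

# (d_in, d_out) for each adapted projection in one decoder layer
projections = {
    "q_proj": (HIDDEN, HIDDEN),
    "k_proj": (HIDDEN, KV_DIM),
    "v_proj": (HIDDEN, KV_DIM),
    "o_proj": (HIDDEN, HIDDEN),
    "gate_proj": (HIDDEN, INTER),
    "up_proj": (HIDDEN, INTER),
    "down_proj": (INTER, HIDDEN),
}

def lora_params(d_in: int, d_out: int, rank: int) -> int:
    # A is (rank x d_in), B is (d_out x rank); both are trainable
    return rank * (d_in + d_out)

per_layer = sum(lora_params(i, o, RANK) for i, o in projections.values())
total = per_layer * LAYERS
print(total)  # 20185088
```

This matches the `trainable params: 20,185,088` figure that appears in the training log below, about 0.26% of the 7.6B total parameters.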
### 2.4. DeepSeek-R1-Distill-Qwen-7B

A model released by DeepSeek. It is produced by distillation: the reasoning capability of the much larger DeepSeek-R1 is distilled into a Qwen 7B base model, yielding a compact model that fits smaller deployments.

---

## III. Prerequisites

### 3.1. Base environment and prerequisites

1. Operating system: CentOS 7
2. NVIDIA Tesla V100 32 GB, CUDA Version: 12.2

*(Screenshot: nvidia-smi output)*

3. Download the DeepSeek-R1-Distill-Qwen-7B model in advance, from either of the two sources below; ModelScope is recommended first.

Hugging Face:

https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-7B/tree/main

*(Screenshot: model files on Hugging Face)*

ModelScope: the model page on the ModelScope community

*(Screenshot: model page on ModelScope)*

Download via the SDK or Git, as you prefer.

*(Screenshot: ModelScope download options)*
Example of downloading with git-lfs:

*(Screenshot: git-lfs clone output)*

### 3.2. Installing Anaconda

1. Update the system

```shell
sudo yum update -y
sudo yum upgrade -y
```

2. Download Anaconda

```shell
wget https://repo.anaconda.com/archive/Anaconda3-2022.10-Linux-x86_64.sh
```

3. Verify data integrity

```shell
sha256sum Anaconda3-2022.10-Linux-x86_64.sh
```

4. Run the Anaconda installation script

```shell
bash Anaconda3-2022.10-Linux-x86_64.sh
```

Install directory: /opt/anaconda3

Note: the install location can be specified directly when running the script:

```shell
bash Anaconda3-2022.10-Linux-x86_64.sh -p /opt/anaconda3
```

When asked "Do you wish the installer to initialize Anaconda3 by running conda init?", answer yes.

If initialization was skipped, run: `/opt/anaconda3/bin/conda init`

Note: initialization writes configuration into `~/.bashrc`, so run:

```shell
source ~/.bashrc
```

5. Verify the installation

```shell
conda --version
```

6. Configure mirror sources

```shell
conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/
conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main/
conda config --set show_channel_urls yes
```
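The three `conda config` commands above write to `~/.condarc`. The result should look roughly like the fragment below (an assumption based on conda's documented behavior: each `--add channels` prepends, so the channel added last is searched first, and the built-in `defaults` entry remains):

```yaml
channels:
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main/
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/
  - defaults
show_channel_urls: true
```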
### 3.3. Downloading LLaMA-Factory

Option 1: download directly

Address: GitHub – hiyouga/LLaMA-Factory: Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)

*(Screenshot: LLaMA-Factory GitHub page)*

Option 2: clone the project with git

```shell
git clone --depth 1 https://github.com/hiyouga/LLaMA-Factory.git
```

*(Screenshot: git clone output)*

Place the downloaded project under the /data/service directory.

### 3.4. Installing dependencies

```shell
conda create --name llama_factory python=3.10
conda activate llama_factory
cd /data/service/LLaMA-Factory
pip install -e ".[torch,metrics]" -i https://pypi.tuna.tsinghua.edu.cn/simple
```

PS: hardware and software requirements:

*(Screenshots: LLaMA-Factory hardware/software requirement tables)*

---

## IV. Implementation

### 4.1. Data preparation

Two dataset formats are available: alpaca and sharegpt.

alpaca example format:

```json
[
  {
    "instruction": "human instruction (required)",
    "input": "human input (optional)",
    "output": "model response (required)",
    "system": "system prompt (optional)",
    "history": [
      ["first-round instruction (optional)", "first-round response (optional)"],
      ["second-round instruction (optional)", "second-round response (optional)"]
    ]
  }
]
```

For data in this format, the dataset description in dataset_info.json should be:

```json
"dataset name": {
  "file_name": "data.json",
  "columns": {
    "prompt": "instruction",
    "query": "input",
    "response": "output",
    "system": "system",
    "history": "history"
  }
}
```

sharegpt example format:

- Compared with alpaca-format datasets, the sharegpt format supports more role types, such as human, gpt, observation, function, and so on. They form a list of objects presented in the `conversations` column.
- Note that human and observation must appear at odd positions, while gpt and function must appear at even positions.
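The odd/even role-ordering rule above can be checked mechanically before training. A minimal sketch (positions counted from 1, as in the rule; the `from`/`value` keys and role names follow the sharegpt layout used in this section, with `function_call` as the function role name, matching the example below):

```python
# Validate the sharegpt ordering rule: "human"/"observation" must sit at
# odd positions (1st, 3rd, ...) and "gpt"/"function_call" at even positions.
ODD_ROLES = {"human", "observation"}
EVEN_ROLES = {"gpt", "function_call"}

def valid_sharegpt(conversations: list) -> bool:
    for idx, turn in enumerate(conversations, start=1):
        role = turn["from"]
        if idx % 2 == 1 and role not in ODD_ROLES:
            return False
        if idx % 2 == 0 and role not in EVEN_ROLES:
            return False
    return True

# Mirrors the four-turn example in this section:
sample = [
    {"from": "human", "value": "human instruction"},        # position 1 (odd)
    {"from": "function_call", "value": "tool arguments"},   # position 2 (even)
    {"from": "observation", "value": "tool result"},        # position 3 (odd)
    {"from": "gpt", "value": "model response"},             # position 4 (even)
]
print(valid_sharegpt(sample))  # True
```

A conversation that opens with a gpt turn, for example, would be rejected by this check.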
```json
[
  {
    "conversations": [
      {
        "from": "human",
        "value": "human instruction"
      },
      {
        "from": "function_call",
        "value": "tool arguments"
      },
      {
        "from": "observation",
        "value": "tool result"
      },
      {
        "from": "gpt",
        "value": "model response"
      }
    ],
    "system": "system prompt (optional)",
    "tools": "tool description (optional)"
  }
]
```

For data in this format, the dataset description in dataset_info.json should be:

```json
"dataset name": {
  "file_name": "data.json",
  "formatting": "sharegpt",
  "columns": {
    "messages": "conversations",
    "system": "system",
    "tools": "tools"
  }
}
```

The sharegpt format is used below. Upload the test data to the /data/service/LLaMA-Factory/data/ directory with the file name `ds_zh_demo.json`. Example test data:

```json
[
  {
    "conversations": [
      {
        "from": "user",
        "value": "What is deep learning, and how does it differ from traditional machine learning?"
      },
      {
        "from": "assistant",
        "value": "Deep learning is a subfield of machine learning that models data with artificial neural networks, especially deep neural networks. Compared with traditional machine-learning methods, deep learning can extract features automatically, without manual feature engineering, and is well suited to large datasets and complex patterns."
      }
    ]
  },
  {
    "conversations": [
      {
        "from": "user",
        "value": "How do you evaluate the performance of a machine-learning model?"
      },
      {
        "from": "assistant",
        "value": "Model performance is usually evaluated with several metrics, including accuracy, precision, recall, F1-score, the ROC curve, and the AUC value. The appropriate metric depends on the nature and goal of the specific task."
      }
    ]
  }
]
```

Edit the dataset description file dataset_info.json:

```shell
vi /data/service/LLaMA-Factory/data/dataset_info.json
```

Add the following entry:

```json
"ds_zh_demo": {
  "file_name": "ds_zh_demo.json",
  "formatting": "sharegpt",
  "columns": {
    "messages": "conversations"
  },
  "tags": {
    "role_tag": "from",
    "content_tag": "value",
    "user_tag": "user",
    "assistant_tag": "assistant"
  }
}
```

### 4.2. Preparing the configuration file

1) Back up the original configuration file

```shell
cp /data/service/LLaMA-Factory/examples/train_lora/llama3_lora_sft.yaml /data/service/LLaMA-Factory/examples/train_lora/llama3_lora_sft.yaml.bak
```

2) Create the new configuration file

```shell
mv /data/service/LLaMA-Factory/examples/train_lora/llama3_lora_sft.yaml /data/service/LLaMA-Factory/examples/train_lora/ds_qwen7b_lora_sft.yaml
```

3) Edit the configuration file

```shell
vi /data/service/LLaMA-Factory/examples/train_lora/ds_qwen7b_lora_sft.yaml
```

Contents:

```yaml
### model
model_name_or_path: /data/model/DeepSeek-R1-Distill-Qwen-7B
trust_remote_code: true

### method
stage: sft
do_train: true
finetuning_type: lora
lora_rank: 8
lora_target: all

### dataset
dataset: ds_zh_demo
template: deepseek3
cutoff_len: 4096
max_samples: 4019
overwrite_cache: true
preprocessing_num_workers: 16

### output
output_dir: /data/model/sft/DeepSeek-R1-Distill-Qwen-7B
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true

### train
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 1.0e-4
num_train_epochs: 1.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: true
ddp_timeout: 180000000
```
```yaml
### eval
val_size: 0.1
per_device_eval_batch_size: 1
eval_strategy: steps
eval_steps: 500
```

Pay particular attention to the following parameters:

- model_name_or_path: path to the base model
- dataset: dataset name, corresponding to the `ds_zh_demo` entry declared above
- template: prompt template
- cutoff_len: maximum length of the input sequence
- output_dir: where the fine-tuned weights are saved
- gradient_accumulation_steps: number of gradient-accumulation steps; reduce this value when GPU resources are insufficient
- num_train_epochs: number of training epochs

### 4.3. Launching the fine-tuning run

```shell
conda activate llama_factory
cd /data/service/LLaMA-Factory
llamafactory-cli train /data/service/LLaMA-Factory/examples/train_lora/ds_qwen7b_lora_sft.yaml

# run in the background
nohup llamafactory-cli train /data/service/LLaMA-Factory/examples/train_lora/ds_qwen7b_lora_sft.yaml > output.log 2>&1 &
```

*(Screenshot: training progress output)*

### 4.4. Fine-tuning results

```
[INFO|configuration_utils.py:1052] 2025-02-18 16:39:55,400 >> loading configuration file /data/model/DeepSeek-R1-Distill-Qwen-7B/generation_config.json
[INFO|configuration_utils.py:1099] 2025-02-18 16:39:55,400 >> Generate config GenerationConfig {
  "bos_token_id": 151646,
  "do_sample": true,
  "eos_token_id": 151643,
  "temperature": 0.6,
  "top_p": 0.95
}

[INFO|2025-02-18 16:39:55] llamafactory.model.model_utils.checkpointing:157 >> Gradient checkpointing enabled.
[INFO|2025-02-18 16:39:55] llamafactory.model.model_utils.attention:157 >> Using torch SDPA for faster training and inference.
[INFO|2025-02-18 16:39:55] llamafactory.model.adapter:157 >> Upcasting trainable params to float32.
[INFO|2025-02-18 16:39:55] llamafactory.model.adapter:157 >> Fine-tuning method: LoRA
[INFO|2025-02-18 16:39:55] llamafactory.model.model_utils.misc:157 >> Found linear modules: down_proj,o_proj,up_proj,k_proj,v_proj,q_proj,gate_proj
[INFO|2025-02-18 16:39:55] llamafactory.model.loader:157 >> trainable params: 20,185,088 || all params: 7,635,801,600 || trainable%: 0.2643
Detected kernel version 4.18.0, which is below the recommended minimum of 5.5.0; this can cause the process to hang. It is recommended to upgrade the kernel to the minimum version or higher.
[INFO|trainer.py:667] 2025-02-18 16:39:55,807 >> Using auto half precision backend
[INFO|trainer.py:2243] 2025-02-18 16:39:56,634 >> ***** Running training *****
[INFO|trainer.py:2244] 2025-02-18 16:39:56,634 >>   Num examples = 3,617
[INFO|trainer.py:2245] 2025-02-18 16:39:56,634 >>   Num Epochs = 1
[INFO|trainer.py:2246] 2025-02-18 16:39:56,634 >>   Instantaneous batch size per device = 1
[INFO|trainer.py:2249] 2025-02-18 16:39:56,634 >>   Total train batch size (w. parallel, distributed & accumulation) = 8
[INFO|trainer.py:2250] 2025-02-18 16:39:56,634 >>   Gradient Accumulation steps = 8
[INFO|trainer.py:2251] 2025-02-18 16:39:56,634 >>   Total optimization steps = 452
[INFO|trainer.py:2252] 2025-02-18 16:39:56,638 >>   Number of trainable parameters = 20,185,088
  0%|          | 0/452 [00:00<?, ?it/s]/usr/local/miniconda3/envs/llama_factory/lib/python3.10/site-packages/torch/utils/checkpoint.py:295: FutureWarning: `torch.cpu.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cpu', args...)` instead.
  with torch.enable_grad(), device_autocast_ctx, torch.cpu.amp.autocast(**ctx.cpu_autocast_kwargs):  # type: ignore[attr-defined]
100%|██████████| 452/452 [4:06:28<00:00, 31.87s/it][INFO|trainer.py:3705] 2025-02-18 20:46:24,795 >> Saving model checkpoint to /data/model/sft/DeepSeek-R1-Distill-Qwen-7B/checkpoint-452
[INFO|configuration_utils.py:670] 2025-02-18 20:46:24,819 >> loading configuration file /data/model/DeepSeek-R1-Distill-Qwen-7B/config.json
[INFO|configuration_utils.py:739] 2025-02-18 20:46:24,820 >> Model config Qwen2Config {
  "architectures": [
    "Qwen2ForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 151643,
  "eos_token_id": 151643,
  "hidden_act": "silu",
  "hidden_size": 3584,
  "initializer_range": 0.02,
  "intermediate_size": 18944,
  "max_position_embeddings": 131072,
  "max_window_layers": 28,
  "model_type": "qwen2",
  "num_attention_heads": 28,
  "num_hidden_layers": 28,
  "num_key_value_heads": 4,
  "rms_norm_eps": 1e-06,
  "rope_scaling": null,
  "rope_theta": 10000,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.45.0",
  "use_cache": true,
  "use_mrope": false,
  "use_sliding_window": false,
  "vocab_size": 152064
}

[INFO|tokenization_utils_base.py:2649] 2025-02-18 20:46:25,042 >> tokenizer config file saved in /data/model/sft/DeepSeek-R1-Distill-Qwen-7B/checkpoint-452/tokenizer_config.json
[INFO|tokenization_utils_base.py:2658] 2025-02-18 20:46:25,043 >> Special tokens file saved in /data/model/sft/DeepSeek-R1-Distill-Qwen-7B/checkpoint-452/special_tokens_map.json
[INFO|trainer.py:2505] 2025-02-18 20:46:25,377 >>

Training completed. Do not forget to share your model on huggingface.co/models =)

100%|██████████| 452/452 [4:06:28<00:00, 32.72s/it]
[INFO|trainer.py:3705] 2025-02-18 20:46:25,379 >> Saving model checkpoint to /data/model/sft/DeepSeek-R1-Distill-Qwen-7B
[INFO|configuration_utils.py:670] 2025-02-18 20:46:25,401 >> loading configuration file /data/model/DeepSeek-R1-Distill-Qwen-7B/config.json
[INFO|configuration_utils.py:739] 2025-02-18 20:46:25,401 >> Model config Qwen2Config { ...identical to the Qwen2Config block above... }

[INFO|tokenization_utils_base.py:2649] 2025-02-18 20:46:25,556 >> tokenizer config file saved in /data/model/sft/DeepSeek-R1-Distill-Qwen-7B/tokenizer_config.json
[INFO|tokenization_utils_base.py:2658] 2025-02-18 20:46:25,556 >> Special tokens file saved in /data/model/sft/DeepSeek-R1-Distill-Qwen-7B/special_tokens_map.json
{'loss': 3.6592, 'grad_norm': 0.38773563504219055, 'learning_rate': 2.173913043478261e-05, 'epoch': 0.02}
{'loss': 3.667, 'grad_norm': 0.698821485042572, 'learning_rate': 4.347826086956522e-05, 'epoch': 0.04}
{'loss': 3.4784, 'grad_norm': 0.41371676325798035, 'learning_rate': 6.521739130434783e-05, 'epoch': 0.07}
{'loss': 3.2962, 'grad_norm': 0.4966348111629486, 'learning_rate': 8.695652173913044e-05, 'epoch': 0.09}
{'loss': 3.0158, 'grad_norm': 0.333425909280777, 'learning_rate': 9.997605179330019e-05, 'epoch': 0.11}
{'loss': 3.2221, 'grad_norm': 0.3786776065826416, 'learning_rate': 9.970689785771798e-05, 'epoch': 0.13}
{'loss': 2.8439, 'grad_norm': 0.3683229386806488, 'learning_rate': 9.914027086842322e-05, 'epoch': 0.15}
{'loss': 3.0528, 'grad_norm': 0.42745739221572876, 'learning_rate': 9.82795618288397e-05, 'epoch': 0.18}
{'loss': 2.9092, 'grad_norm': 0.45462721586227417, 'learning_rate': 9.712992168898436e-05, 'epoch': 0.2}
{'loss': 3.1055, 'grad_norm': 0.5547119379043579, 'learning_rate': 9.56982305193869e-05, 'epoch': 0.22}
{'loss': 2.9412, 'grad_norm': 0.5830215811729431, 'learning_rate': 9.399305633701373e-05, 'epoch': 0.24}
{'loss': 2.7873, 'grad_norm': 0.5862609148025513, 'learning_rate': 9.202460382960448e-05, 'epoch': 0.27}
{'loss': 2.8255, 'grad_norm': 0.5828853845596313, 'learning_rate': 8.980465328528219e-05, 'epoch': 0.29}
{'loss': 2.6266, 'grad_norm': 0.6733331084251404, 'learning_rate': 8.734649009291585e-05, 'epoch': 0.31}
{'loss': 2.8745, 'grad_norm': 0.6904928684234619, 'learning_rate': 8.46648252351431e-05, 'epoch': 0.33}
{'loss': 2.8139, 'grad_norm': 0.7874809503555298, 'learning_rate': 8.177570724986628e-05, 'epoch': 0.35}
{'loss': 2.7818, 'grad_norm': 0.8345168232917786, 'learning_rate': 7.86964261870916e-05, 'epoch': 0.38}
{'loss': 2.7198, 'grad_norm': 0.8806198239326477, 'learning_rate': 7.544541013588645e-05, 'epoch': 0.4}
{'loss': 2.7231, 'grad_norm': 0.9481658935546875, 'learning_rate': 7.204211494069292e-05, 'epoch': 0.42}
{'loss': 2.7371, 'grad_norm': 0.9718573093414307, 'learning_rate': 6.850690776699573e-05, 'epoch': 0.44}
{'loss': 2.6862, 'grad_norm': 1.2056019306182861, 'learning_rate': 6.486094521315022e-05, 'epoch': 0.46}
{'loss': 2.4661, 'grad_norm': 1.200085163116455, 'learning_rate': 6.112604669781572e-05, 'epoch': 0.49}
{'loss': 2.4841, 'grad_norm': 1.1310691833496094, 'learning_rate': 5.732456388071247e-05, 'epoch': 0.51}
{'loss': 2.3755, 'grad_norm': 1.1279083490371704, 'learning_rate': 5.3479246898159063e-05, 'epoch': 0.53}
{'loss': 2.5552, 'grad_norm': 1.2654848098754883, 'learning_rate': 4.96131082139099e-05, 'epoch': 0.55}
{'loss': 2.6197, 'grad_norm': 1.3887016773223877, 'learning_rate': 4.574928490008264e-05, 'epoch': 0.58}
{'loss': 2.3773, 'grad_norm': 1.3009178638458252, 'learning_rate': 4.1910900172361764e-05, 'epoch': 0.6}
{'loss': 2.3881, 'grad_norm': 1.346793532371521, 'learning_rate': 3.812092500812646e-05, 'epoch': 0.62}
{'loss': 2.4821, 'grad_norm': 1.7273674011230469, 'learning_rate': 3.440204067565511e-05, 'epoch': 0.64}
{'loss': 2.3563, 'grad_norm': 1.529177188873291, 'learning_rate': 3.077650299710653e-05, 'epoch': 0.66}
{'loss': 2.1308, 'grad_norm': 1.5957469940185547, 'learning_rate': 2.7266009157601224e-05, 'epoch': 0.69}
{'loss': 2.1709, 'grad_norm': 1.4444897174835205, 'learning_rate': 2.3891567857490372e-05, 'epoch': 0.71}
{'loss': 2.275, 'grad_norm': 1.5686719417572021, 'learning_rate': 2.067337358489085e-05, 'epoch': 0.73}
{'loss': 2.2075, 'grad_norm': 1.5931408405303955, 'learning_rate': 1.7630685760908622e-05, 'epoch': 0.75}
{'loss': 2.1727, 'grad_norm': 1.7681787014007568, 'learning_rate': 1.4781713480810184e-05, 'epoch': 0.77}
{'loss': 2.3562, 'grad_norm': 1.742925763130188, 'learning_rate': 1.2143506540914128e-05, 'epoch': 0.8}
{'loss': 2.1187, 'grad_norm': 1.6716198921203613, 'learning_rate': 9.731853403356705e-06, 'epoch': 0.82}
{'loss': 2.2564, 'grad_norm': 1.915489912033081, 'learning_rate': 7.561186709365653e-06, 'epoch': 0.84}
{'loss': 2.261, 'grad_norm': 2.132519245147705, 'learning_rate': 5.644496906502233e-06, 'epoch': 0.86}
{'loss': 2.1632, 'grad_norm': 1.591231107711792, 'learning_rate': 3.9932545067728366e-06, 'epoch': 0.88}
{'loss': 2.1266, 'grad_norm': 1.584917664527893, 'learning_rate': 2.6173414408598827e-06, 'epoch': 0.91}
{'loss': 2.2944, 'grad_norm': 1.5982666015625, 'learning_rate': 1.524991919285429e-06, 'epoch': 0.93}
{'loss': 2.3799, 'grad_norm': 2.1475727558135986, 'learning_rate': 7.227431544266194e-07, 'epoch': 0.95}
{'loss': 2.1196, 'grad_norm': 1.6714484691619873, 'learning_rate': 2.153962382888841e-07, 'epoch': 0.97}
{'loss': 2.1427, 'grad_norm': 1.7334465980529785, 'learning_rate': 5.987410165758656e-09, 'epoch': 1.0}
{'train_runtime': 14788.7396, 'train_samples_per_second': 0.245, 'train_steps_per_second': 0.031, 'train_loss': 2.6206856934370193, 'epoch': 1.0}
***** train metrics *****
  epoch                    =      0.9997
  total_flos               = 100517734GF
  train_loss               =      2.6207
  train_runtime            =  4:06:28.73
  train_samples_per_second =       0.245
  train_steps_per_second   =       0.031
Figure saved at: /data/model/sft/DeepSeek-R1-Distill-Qwen-7B/training_loss.png
[WARNING|2025-02-18 20:46:25] llamafactory.extras.ploting:162 >> No metric eval_loss to plot.
[WARNING|2025-02-18 20:46:25] llamafactory.extras.ploting:162 >> No metric eval_accuracy to plot.
```
2025-02-18 20:46:25,781 &gt;&gt;<br \/>\n***** Running Evaluation *****<br \/>\n[INFO|trainer.py:4023] 2025-02-18 20:46:25,781 &gt;&gt;   Num examples &#061; 402<br \/>\n[INFO|trainer.py:4026] 2025-02-18 20:46:25,781 &gt;&gt;   Batch size &#061; 1<br \/>\n100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 402\/402 [09:03&lt;00:00,  1.35s\/it]<br \/>\n[INFO|modelcard.py:449] 2025-02-18 20:55:30,409 &gt;&gt; Dropping the following result as it does not have all the necessary fields:<br \/>\n{&#039;task&#039;: {&#039;name&#039;: &#039;Causal Language Modeling&#039;, &#039;type&#039;: &#039;text-generation&#039;}}<br \/>\n***** eval metrics *****<br \/>\n  epoch                   &#061;     0.9997<br \/>\n  eval_loss               &#061;     2.2648<br \/>\n  eval_runtime            &#061; 0:09:04.62<br \/>\n  eval_samples_per_second &#061;      0.738<br \/>\n  eval_steps_per_second   &#061;      0.738 <\/p>\n<p>The generated weight files:<\/p>\n<p style=\"text-align:center\"><img loading=\"lazy\" decoding=\"async\" alt=\"\" height=\"291\" src=\"https:\/\/www.wsisp.com\/helps\/wp-content\/uploads\/2025\/05\/20250507011033-681ab309b9e60.png\" width=\"633\" \/><\/p>\n<hr \/>\n<h2>5. Additional Notes<\/h2>\n<h3>5.1. 
dataset_info.json<\/h3>\n<p>This file contains all of the available datasets. If you want to use a custom dataset, be sure to add a dataset description to the dataset_info.json file, then enable the dataset by setting the dataset: dataset_name configuration option.<\/p>\n<p>&#034;dataset_name&#034;: {<br \/>\n  &#034;hf_hub_url&#034;: &#034;URL of the dataset repository on Hugging Face (if specified, script_url and file_name are ignored)&#034;,<br \/>\n  &#034;ms_hub_url&#034;: &#034;URL of the dataset repository on ModelScope (if specified, script_url and file_name are ignored)&#034;,<br \/>\n  &#034;script_url&#034;: &#034;name of the local folder that contains the data loading script (if specified, file_name is ignored)&#034;,<br \/>\n  &#034;file_name&#034;: &#034;name of the dataset folder or file under this directory (required if none of the above are specified)&#034;,<br \/>\n  &#034;formatting&#034;: &#034;dataset format (optional, default: alpaca; can be alpaca or sharegpt)&#034;,<br \/>\n  &#034;ranking&#034;: &#034;whether this is a preference dataset (optional, default: False)&#034;,<br \/>\n  &#034;subset&#034;: &#034;name of the dataset subset (optional, default: None)&#034;,<br \/>\n  &#034;split&#034;: 
&#034;dataset split to use (optional, default: train)&#034;,<br \/>\n  &#034;folder&#034;: &#034;name of the folder in the Hugging Face repository (optional, default: None)&#034;,<br \/>\n  &#034;num_samples&#034;: &#034;number of samples to use from this dataset (optional, default: None)&#034;,<br \/>\n  &#034;columns (optional)&#034;: {<br \/>\n    &#034;prompt&#034;: &#034;name of the column that holds the prompt (default: instruction)&#034;,<br \/>\n    &#034;query&#034;: &#034;name of the column that holds the query (default: input)&#034;,<br \/>\n    &#034;response&#034;: &#034;name of the column that holds the response (default: output)&#034;,<br \/>\n    &#034;history&#034;: &#034;name of the column that holds the conversation history (default: None)&#034;,<br \/>\n    &#034;messages&#034;: &#034;name of the column that holds the message list (default: conversations)&#034;,<br \/>\n    &#034;system&#034;: &#034;name of the column that holds the system prompt (default: None)&#034;,<br \/>\n    &#034;tools&#034;: &#034;name of the column that holds the tool descriptions (default: None)&#034;,<br \/>\n    &#034;images&#034;: &#034;name of the column that holds the image inputs (default: None)&#034;,<br \/>\n    &#034;videos&#034;: 
&#034;name of the column that holds the video inputs (default: None)&#034;,<br \/>\n    &#034;audios&#034;: &#034;name of the column that holds the audio inputs (default: None)&#034;,<br \/>\n    &#034;chosen&#034;: &#034;name of the column that holds the preferred response (default: None)&#034;,<br \/>\n    &#034;rejected&#034;: &#034;name of the column that holds the rejected response (default: None)&#034;,<br \/>\n    &#034;kto_tag&#034;: &#034;name of the column that holds the KTO label (default: None)&#034;<br \/>\n  },<br \/>\n  &#034;tags (optional, used for the sharegpt format)&#034;: {<br \/>\n    &#034;role_tag&#034;: &#034;key in each message that identifies the sender (default: from)&#034;,<br \/>\n    &#034;content_tag&#034;: &#034;key in each message that holds the text content (default: value)&#034;,<br \/>\n    &#034;user_tag&#034;: &#034;role_tag value that marks the user (default: human)&#034;,<br \/>\n    &#034;assistant_tag&#034;: &#034;role_tag value that marks the assistant (default: gpt)&#034;,<br \/>\n    &#034;observation_tag&#034;: &#034;role_tag value that marks tool results (default: observation)&#034;,<br \/>\n    &#034;function_tag&#034;: &#034;role_tag value that marks tool calls (default: function_call)&#034;,<br \/>\n    &#034;system_tag&#034;: 
&#034;role_tag value that marks the system prompt (default: system; overrides the system column)&#034;<br \/>\n  }<br \/>\n} <\/p>\n<h3>5.2. Custom Chat Template<\/h3>\n<p>Add your own chat template in template.py.<\/p>\n<p>https:\/\/github.com\/hiyouga\/LLaMA-Factory\/blob\/main\/src\/llamafactory\/data\/template.py<\/p>\n<p># Copyright 2025 the LlamaFactory team.<br \/>\n#<br \/>\n# Licensed under the Apache License, Version 2.0 (the &#034;License&#034;);<br \/>\n# you may not use this file except in compliance with the License.<br \/>\n# You may obtain a copy of the License at<br \/>\n#<br \/>\n#     http:\/\/www.apache.org\/licenses\/LICENSE-2.0<br \/>\n#<br \/>\n# Unless required by applicable law or agreed to in writing, software<br \/>\n# distributed under the License is distributed on an &#034;AS IS&#034; BASIS,<br \/>\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.<br \/>\n# See the License for the specific language governing permissions and<br \/>\n# limitations under the License.<\/p>\n<p>from dataclasses import dataclass<br \/>\nfrom typing import TYPE_CHECKING, Dict, List, Optional, Sequence, Tuple, Type, Union<\/p>\n<p>from typing_extensions import override<\/p>\n<p>from ..extras import logging<br \/>\nfrom ..extras.misc import check_version<br \/>\nfrom .data_utils import Role<br \/>\nfrom .formatter import EmptyFormatter, FunctionFormatter, StringFormatter, ToolFormatter<br \/>\nfrom .mm_plugin import get_mm_plugin<\/p>\n<p>if TYPE_CHECKING:<br \/>\n    from transformers import PreTrainedTokenizer<\/p>\n<p>    from ..hparams import DataArguments<br \/>\n    from .formatter import SLOTS, Formatter<br \/>\n    from .mm_plugin import BasePlugin<br \/>\n    from .tool_utils import FunctionCall<\/p>\n<p>logger &#061; logging.get_logger(__name__)<\/p>\n<p>&#064;dataclass<br \/>\nclass Template:<br \/>\n  
  format_user: &#034;Formatter&#034;<br \/>\n    format_assistant: &#034;Formatter&#034;<br \/>\n    format_system: &#034;Formatter&#034;<br \/>\n    format_function: &#034;Formatter&#034;<br \/>\n    format_observation: &#034;Formatter&#034;<br \/>\n    format_tools: &#034;Formatter&#034;<br \/>\n    format_prefix: &#034;Formatter&#034;<br \/>\n    default_system: str<br \/>\n    stop_words: List[str]<br \/>\n    thought_words: Tuple[str, str]<br \/>\n    efficient_eos: bool<br \/>\n    replace_eos: bool<br \/>\n    replace_jinja_template: bool<br \/>\n    mm_plugin: &#034;BasePlugin&#034;<\/p>\n<p>    def encode_oneturn(<br \/>\n        self,<br \/>\n        tokenizer: &#034;PreTrainedTokenizer&#034;,<br \/>\n        messages: Sequence[Dict[str, str]],<br \/>\n        system: Optional[str] &#061; None,<br \/>\n        tools: Optional[str] &#061; None,<br \/>\n    ) -&gt; Tuple[List[int], List[int]]:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Returns a single pair of token ids representing prompt and response respectively.<br \/>\n        &#034;&#034;&#034;<br \/>\n        encoded_messages &#061; self._encode(tokenizer, messages, system, tools)<br \/>\n        prompt_ids &#061; []<br \/>\n        for encoded_ids in encoded_messages[:-1]:<br \/>\n            prompt_ids &#043;&#061; encoded_ids<\/p>\n<p>        response_ids &#061; encoded_messages[-1]<br \/>\n        return prompt_ids, response_ids<\/p>\n<p>    def encode_multiturn(<br \/>\n        self,<br \/>\n        tokenizer: &#034;PreTrainedTokenizer&#034;,<br \/>\n        messages: Sequence[Dict[str, str]],<br \/>\n        system: Optional[str] &#061; None,<br \/>\n        tools: Optional[str] &#061; None,<br \/>\n    ) -&gt; List[Tuple[List[int], List[int]]]:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Returns multiple pairs of token ids representing prompts and responses respectively.<br \/>\n        &#034;&#034;&#034;<br \/>\n        encoded_messages &#061; self._encode(tokenizer, 
messages, system, tools)<br \/>\n        return [(encoded_messages[i], encoded_messages[i &#043; 1]) for i in range(0, len(encoded_messages), 2)]<\/p>\n<p>    def extract_tool(self, content: str) -&gt; Union[str, List[&#034;FunctionCall&#034;]]:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Extracts tool message.<br \/>\n        &#034;&#034;&#034;<br \/>\n        return self.format_tools.extract(content)<\/p>\n<p>    def get_stop_token_ids(self, tokenizer: &#034;PreTrainedTokenizer&#034;) -&gt; List[int]:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Returns stop token ids.<br \/>\n        &#034;&#034;&#034;<br \/>\n        stop_token_ids &#061; {tokenizer.eos_token_id}<br \/>\n        for token in self.stop_words:<br \/>\n            stop_token_ids.add(tokenizer.convert_tokens_to_ids(token))<\/p>\n<p>        return list(stop_token_ids)<\/p>\n<p>    def _convert_elements_to_ids(self, tokenizer: &#034;PreTrainedTokenizer&#034;, elements: &#034;SLOTS&#034;) -&gt; List[int]:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Converts elements to token ids.<br \/>\n        &#034;&#034;&#034;<br \/>\n        token_ids &#061; []<br \/>\n        for elem in elements:<br \/>\n            if isinstance(elem, str):<br \/>\n                if len(elem) !&#061; 0:<br \/>\n                    token_ids &#043;&#061; tokenizer.encode(elem, add_special_tokens&#061;False)<br \/>\n            elif isinstance(elem, dict):<br \/>\n                token_ids &#043;&#061; [tokenizer.convert_tokens_to_ids(elem.get(&#034;token&#034;))]<br \/>\n            elif isinstance(elem, set):<br \/>\n                if &#034;bos_token&#034; in elem and tokenizer.bos_token_id is not None:<br \/>\n                    token_ids &#043;&#061; [tokenizer.bos_token_id]<br \/>\n                elif &#034;eos_token&#034; in elem and tokenizer.eos_token_id is not None:<br \/>\n                    token_ids &#043;&#061; [tokenizer.eos_token_id]<br \/>\n            else:<br \/>\n                
raise ValueError(f&#034;Input must be string, set[str] or dict[str, str], got {type(elem)}&#034;)<\/p>\n<p>        return token_ids<\/p>\n<p>    def _encode(<br \/>\n        self,<br \/>\n        tokenizer: &#034;PreTrainedTokenizer&#034;,<br \/>\n        messages: Sequence[Dict[str, str]],<br \/>\n        system: Optional[str],<br \/>\n        tools: Optional[str],<br \/>\n    ) -&gt; List[List[int]]:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Encodes formatted inputs to pairs of token ids.<br \/>\n        Turn 0: prefix &#043; system &#043; query        resp<br \/>\n        Turn t: query                          resp<br \/>\n        &#034;&#034;&#034;<br \/>\n        system &#061; system or self.default_system<br \/>\n        encoded_messages &#061; []<br \/>\n        for i, message in enumerate(messages):<br \/>\n            elements &#061; []<\/p>\n<p>            if i &#061;&#061; 0:<br \/>\n                elements &#043;&#061; self.format_prefix.apply()<br \/>\n                if system or tools:<br \/>\n                    tool_text &#061; self.format_tools.apply(content&#061;tools)[0] if tools else &#034;&#034;<br \/>\n                    elements &#043;&#061; self.format_system.apply(content&#061;(system &#043; tool_text))<\/p>\n<p>            if message[&#034;role&#034;] &#061;&#061; Role.USER.value:<br \/>\n                elements &#043;&#061; self.format_user.apply(content&#061;message[&#034;content&#034;], idx&#061;str(i \/\/ 2))<br \/>\n            elif message[&#034;role&#034;] &#061;&#061; Role.ASSISTANT.value:<br \/>\n                elements &#043;&#061; self.format_assistant.apply(content&#061;message[&#034;content&#034;])<br \/>\n            elif message[&#034;role&#034;] &#061;&#061; Role.OBSERVATION.value:<br \/>\n                elements &#043;&#061; self.format_observation.apply(content&#061;message[&#034;content&#034;])<br \/>\n            elif message[&#034;role&#034;] &#061;&#061; Role.FUNCTION.value:<br \/>\n                
elements &#043;&#061; self.format_function.apply(content&#061;message[&#034;content&#034;])<br \/>\n            else:<br \/>\n                raise NotImplementedError(&#034;Unexpected role: {}&#034;.format(message[&#034;role&#034;]))<\/p>\n<p>            encoded_messages.append(self._convert_elements_to_ids(tokenizer, elements))<\/p>\n<p>        return encoded_messages<\/p>\n<p>    &#064;staticmethod<br \/>\n    def _add_or_replace_eos_token(tokenizer: &#034;PreTrainedTokenizer&#034;, eos_token: str) -&gt; None:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Adds or replaces eos token to the tokenizer.<br \/>\n        &#034;&#034;&#034;<br \/>\n        is_added &#061; tokenizer.eos_token_id is None<br \/>\n        num_added_tokens &#061; tokenizer.add_special_tokens({&#034;eos_token&#034;: eos_token})<\/p>\n<p>        if is_added:<br \/>\n            logger.info_rank0(f&#034;Add eos token: {tokenizer.eos_token}.&#034;)<br \/>\n        else:<br \/>\n            logger.info_rank0(f&#034;Replace eos token: {tokenizer.eos_token}.&#034;)<\/p>\n<p>        if num_added_tokens &gt; 0:<br \/>\n            logger.warning_rank0(&#034;New tokens have been added, make sure &#096;resize_vocab&#096; is True.&#034;)<\/p>\n<p>    def fix_special_tokens(self, tokenizer: &#034;PreTrainedTokenizer&#034;) -&gt; None:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Adds eos token and pad token to the tokenizer.<br \/>\n        &#034;&#034;&#034;<br \/>\n        stop_words &#061; self.stop_words<br \/>\n        if self.replace_eos:<br \/>\n            if not stop_words:<br \/>\n                raise ValueError(&#034;Stop words are required to replace the EOS token.&#034;)<\/p>\n<p>            self._add_or_replace_eos_token(tokenizer, eos_token&#061;stop_words[0])<br \/>\n            stop_words &#061; stop_words[1:]<\/p>\n<p>        if tokenizer.eos_token_id is None:<br \/>\n            self._add_or_replace_eos_token(tokenizer, 
eos_token&#061;&#034;&lt;|endoftext|&gt;&#034;)<\/p>\n<p>        if tokenizer.pad_token_id is None:<br \/>\n            tokenizer.pad_token &#061; tokenizer.eos_token<br \/>\n            logger.info_rank0(f&#034;Add pad token: {tokenizer.pad_token}&#034;)<\/p>\n<p>        if stop_words:<br \/>\n            num_added_tokens &#061; tokenizer.add_special_tokens(<br \/>\n                dict(additional_special_tokens&#061;stop_words), replace_additional_special_tokens&#061;False<br \/>\n            )<br \/>\n            logger.info_rank0(&#034;Add {} to stop words.&#034;.format(&#034;,&#034;.join(stop_words)))<br \/>\n            if num_added_tokens &gt; 0:<br \/>\n                logger.warning_rank0(&#034;New tokens have been added, make sure &#096;resize_vocab&#096; is True.&#034;)<\/p>\n<p>    &#064;staticmethod<br \/>\n    def _jinja_escape(content: str) -&gt; str:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Escape single quotes in content.<br \/>\n        &#034;&#034;&#034;<br \/>\n        return content.replace(&#034;&#039;&#034;, r&#034;\\&#039;&#034;)<\/p>\n<p>    &#064;staticmethod<br \/>\n    def _convert_slots_to_jinja(slots: &#034;SLOTS&#034;, tokenizer: &#034;PreTrainedTokenizer&#034;, placeholder: str &#061; &#034;content&#034;) -&gt; str:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Converts slots to jinja template.<br \/>\n        &#034;&#034;&#034;<br \/>\n        slot_items &#061; []<br \/>\n        for slot in slots:<br \/>\n            if isinstance(slot, str):<br \/>\n                slot_pieces &#061; slot.split(&#034;{{content}}&#034;)<br \/>\n                if slot_pieces[0]:<br \/>\n                    slot_items.append(&#034;&#039;&#034; &#043; Template._jinja_escape(slot_pieces[0]) &#043; &#034;&#039;&#034;)<br \/>\n                if len(slot_pieces) &gt; 1:<br \/>\n                    slot_items.append(placeholder)<br \/>\n                    if slot_pieces[1]:<br \/>\n                        
slot_items.append(&#034;&#039;&#034; &#043; Template._jinja_escape(slot_pieces[1]) &#043; &#034;&#039;&#034;)<br \/>\n            elif isinstance(slot, set):  # do not use {{ eos_token }} since it may be replaced<br \/>\n                if &#034;bos_token&#034; in slot and tokenizer.bos_token_id is not None:<br \/>\n                    slot_items.append(&#034;&#039;&#034; &#043; tokenizer.bos_token &#043; &#034;&#039;&#034;)<br \/>\n                elif &#034;eos_token&#034; in slot and tokenizer.eos_token_id is not None:<br \/>\n                    slot_items.append(&#034;&#039;&#034; &#043; tokenizer.eos_token &#043; &#034;&#039;&#034;)<br \/>\n            elif isinstance(slot, dict):<br \/>\n                raise ValueError(&#034;Dict is not supported.&#034;)<\/p>\n<p>        return &#034; &#043; &#034;.join(slot_items)<\/p>\n<p>    def _get_jinja_template(self, tokenizer: &#034;PreTrainedTokenizer&#034;) -&gt; str:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Returns the jinja template.<br \/>\n        &#034;&#034;&#034;<br \/>\n        prefix &#061; self._convert_slots_to_jinja(self.format_prefix.apply(), tokenizer)<br \/>\n        system &#061; self._convert_slots_to_jinja(self.format_system.apply(), tokenizer, placeholder&#061;&#034;system_message&#034;)<br \/>\n        user &#061; self._convert_slots_to_jinja(self.format_user.apply(), tokenizer)<br \/>\n        assistant &#061; self._convert_slots_to_jinja(self.format_assistant.apply(), tokenizer)<br \/>\n        jinja_template &#061; &#034;&#034;<br \/>\n        if prefix:<br \/>\n            jinja_template &#043;&#061; &#034;{{ &#034; &#043; prefix &#043; &#034; }}&#034;<\/p>\n<p>        if self.default_system:<br \/>\n            jinja_template &#043;&#061; &#034;{% set system_message &#061; &#039;&#034; &#043; self._jinja_escape(self.default_system) &#043; &#034;&#039; %}&#034;<\/p>\n<p>        jinja_template &#043;&#061; (<br \/>\n            &#034;{% if messages[0][&#039;role&#039;] 
&#061;&#061; &#039;system&#039; %}{% set loop_messages &#061; messages[1:] %}&#034;<br \/>\n            &#034;{% set system_message &#061; messages[0][&#039;content&#039;] %}{% else %}{% set loop_messages &#061; messages %}{% endif %}&#034;<br \/>\n            &#034;{% if system_message is defined %}{{ &#034; &#043; system &#043; &#034; }}{% endif %}&#034;<br \/>\n            &#034;{% for message in loop_messages %}&#034;<br \/>\n            &#034;{% set content &#061; message[&#039;content&#039;] %}&#034;<br \/>\n            &#034;{% if message[&#039;role&#039;] &#061;&#061; &#039;user&#039; %}&#034;<br \/>\n            &#034;{{ &#034; &#043; user &#043; &#034; }}&#034;<br \/>\n            &#034;{% elif message[&#039;role&#039;] &#061;&#061; &#039;assistant&#039; %}&#034;<br \/>\n            &#034;{{ &#034; &#043; assistant &#043; &#034; }}&#034;<br \/>\n            &#034;{% endif %}&#034;<br \/>\n            &#034;{% endfor %}&#034;<br \/>\n        )<br \/>\n        return jinja_template<\/p>\n<p>    def fix_jinja_template(self, tokenizer: &#034;PreTrainedTokenizer&#034;) -&gt; None:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Replaces the jinja template in the tokenizer.<br \/>\n        &#034;&#034;&#034;<br \/>\n        if tokenizer.chat_template is None or self.replace_jinja_template:<br \/>\n            try:<br \/>\n                tokenizer.chat_template &#061; self._get_jinja_template(tokenizer)<br \/>\n            except ValueError as e:<br \/>\n                logger.info_rank0(f&#034;Cannot add this chat template to tokenizer: {e}.&#034;)<\/p>\n<p>    &#064;staticmethod<br \/>\n    def _convert_slots_to_ollama(<br \/>\n        slots: &#034;SLOTS&#034;, tokenizer: &#034;PreTrainedTokenizer&#034;, placeholder: str &#061; &#034;content&#034;<br \/>\n    ) -&gt; str:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Converts slots to ollama template.<br \/>\n        &#034;&#034;&#034;<br \/>\n        slot_items &#061; []<br \/>\n        for slot in 
slots:<br \/>\n            if isinstance(slot, str):<br \/>\n                slot_pieces &#061; slot.split(&#034;{{content}}&#034;)<br \/>\n                if slot_pieces[0]:<br \/>\n                    slot_items.append(slot_pieces[0])<br \/>\n                if len(slot_pieces) &gt; 1:<br \/>\n                    slot_items.append(&#034;{{ &#034; &#043; placeholder &#043; &#034; }}&#034;)<br \/>\n                    if slot_pieces[1]:<br \/>\n                        slot_items.append(slot_pieces[1])<br \/>\n            elif isinstance(slot, set):  # do not use {{ eos_token }} since it may be replaced<br \/>\n                if &#034;bos_token&#034; in slot and tokenizer.bos_token_id is not None:<br \/>\n                    slot_items.append(tokenizer.bos_token)<br \/>\n                elif &#034;eos_token&#034; in slot and tokenizer.eos_token_id is not None:<br \/>\n                    slot_items.append(tokenizer.eos_token)<br \/>\n            elif isinstance(slot, dict):<br \/>\n                raise ValueError(&#034;Dict is not supported.&#034;)<\/p>\n<p>        return &#034;&#034;.join(slot_items)<\/p>\n<p>    def _get_ollama_template(self, tokenizer: &#034;PreTrainedTokenizer&#034;) -&gt; str:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Returns the ollama template.<br \/>\n        &#034;&#034;&#034;<br \/>\n        prefix &#061; self._convert_slots_to_ollama(self.format_prefix.apply(), tokenizer)<br \/>\n        system &#061; self._convert_slots_to_ollama(self.format_system.apply(), tokenizer, placeholder&#061;&#034;.System&#034;)<br \/>\n        user &#061; self._convert_slots_to_ollama(self.format_user.apply(), tokenizer, placeholder&#061;&#034;.Content&#034;)<br \/>\n        assistant &#061; self._convert_slots_to_ollama(self.format_assistant.apply(), tokenizer, placeholder&#061;&#034;.Content&#034;)<br \/>\n        return (<br \/>\n            f&#034;{prefix}{{{{ if .System }}}}{system}{{{{ end }}}}&#034;<br \/>\n            
f&#034;&#034;&#034;{{{{ range .Messages }}}}{{{{ if eq .Role &#034;user&#034; }}}}{user}&#034;&#034;&#034;<br \/>\n            f&#034;&#034;&#034;{{{{ else if eq .Role &#034;assistant&#034; }}}}{assistant}{{{{ end }}}}{{{{ end }}}}&#034;&#034;&#034;<br \/>\n        )<\/p>\n<p>    def get_ollama_modelfile(self, tokenizer: &#034;PreTrainedTokenizer&#034;) -&gt; str:<br \/>\n        r&#034;&#034;&#034;<br \/>\n        Returns the ollama modelfile.<\/p>\n<p>        TODO: support function calling.<br \/>\n        &#034;&#034;&#034;<br \/>\n        modelfile &#061; &#034;# ollama modelfile auto-generated by llamafactory\\n\\n&#034;<br \/>\n        modelfile &#043;&#061; f&#039;FROM .\\n\\nTEMPLATE &#034;&#034;&#034;{self._get_ollama_template(tokenizer)}&#034;&#034;&#034;\\n\\n&#039;<\/p>\n<p>        if self.default_system:<br \/>\n            modelfile &#043;&#061; f&#039;SYSTEM &#034;&#034;&#034;{self.default_system}&#034;&#034;&#034;\\n\\n&#039;<\/p>\n<p>        for stop_token_id in self.get_stop_token_ids(tokenizer):<br \/>\n            modelfile &#043;&#061; f&#039;PARAMETER stop &#034;{tokenizer.convert_ids_to_tokens(stop_token_id)}&#034;\\n&#039;<\/p>\n<p>        modelfile &#043;&#061; &#034;PARAMETER num_ctx 4096\\n&#034;<br \/>\n        return modelfile<\/p>\n<p>&#064;dataclass<br \/>\nclass Llama2Template(Template):<br \/>\n    &#064;override<br \/>\n    def _encode(<br \/>\n        self,<br \/>\n        tokenizer: &#034;PreTrainedTokenizer&#034;,<br \/>\n        messages: Sequence[Dict[str, str]],<br \/>\n        system: str,<br \/>\n        tools: str,<br \/>\n    ) -&gt; List[List[int]]:<br \/>\n        system &#061; system or self.default_system<br \/>\n        encoded_messages &#061; []<br \/>\n        for i, message in enumerate(messages):<br \/>\n            elements &#061; []<\/p>\n<p>            system_text &#061; &#034;&#034;<br \/>\n            if i &#061;&#061; 0:<br \/>\n                elements &#043;&#061; 
self.format_prefix.apply()
                if system or tools:
                    tool_text = self.format_tools.apply(content=tools)[0] if tools else ""
                    system_text = self.format_system.apply(content=(system + tool_text))[0]

            if message["role"] == Role.USER.value:
                elements += self.format_user.apply(content=system_text + message["content"])
            elif message["role"] == Role.ASSISTANT.value:
                elements += self.format_assistant.apply(content=message["content"])
            elif message["role"] == Role.OBSERVATION.value:
                elements += self.format_observation.apply(content=message["content"])
            elif message["role"] == Role.FUNCTION.value:
                elements += self.format_function.apply(content=message["content"])
            else:
                raise NotImplementedError("Unexpected role: {}".format(message["role"]))

            encoded_messages.append(self._convert_elements_to_ids(tokenizer, elements))

        return encoded_messages

    def _get_jinja_template(self, tokenizer: "PreTrainedTokenizer") -> str:
        prefix = self._convert_slots_to_jinja(self.format_prefix.apply(), tokenizer)
        system_message = self._convert_slots_to_jinja(
            self.format_system.apply(), tokenizer, placeholder="system_message"
        )
        user_message = self._convert_slots_to_jinja(self.format_user.apply(), tokenizer)
        assistant_message = self._convert_slots_to_jinja(self.format_assistant.apply(), tokenizer)
        jinja_template = ""
        if prefix:
            jinja_template += "{{ " + prefix + " }}"

        if self.default_system:
            jinja_template += "{% set system_message = '" + self._jinja_escape(self.default_system) + "' %}"

        jinja_template += (
            "{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}"
            "{% set system_message = messages[0]['content'] %}{% else %}{% set loop_messages = messages %}{% endif %}"
            "{% for message in loop_messages %}"
            "{% if loop.index0 == 0 and system_message is defined %}"
            "{% set content = " + system_message + " + message['content'] %}"
            "{% else %}{% set content = message['content'] %}{% endif %}"
            "{% if message['role'] == 'user' %}"
            "{{ " + user_message + " }}"
            "{% elif message['role'] == 'assistant' %}"
            "{{ " + assistant_message + " }}"
            "{% endif %}"
            "{% endfor %}"
        )
        return jinja_template


TEMPLATES: Dict[str, "Template"] = {}


def register_template(
    name: str,
    format_user: Optional["Formatter"] = None,
    format_assistant: Optional["Formatter"] = None,
    format_system: Optional["Formatter"] = None,
    format_function: Optional["Formatter"] = None,
    format_observation: Optional["Formatter"] = None,
    format_tools: Optional["Formatter"] = None,
    format_prefix: Optional["Formatter"] = None,
    default_system: str = "",
    stop_words: Optional[Sequence[str]] = None,
    thought_words: Optional[Tuple[str, str]] = None,
    efficient_eos: bool = False,
    replace_eos: bool = False,
    replace_jinja_template: bool = False,
    mm_plugin: "BasePlugin" = get_mm_plugin(name="base"),
    template_class: Type["Template"] = Template,
) -> None:
    r"""
    Registers a chat template.

    To add the following chat template:
    ```
    <s><user>user prompt here
    <model>model response here</s>
    <user>user prompt here
    <model>model response here</s>
    ```

    The corresponding code should be:
    ```
    register_template(
        name="custom",
        format_user=StringFormatter(slots=["<user>{{content}}\n<model>"]),
        format_assistant=StringFormatter(slots=["{{content}}</s>\n"]),
        format_prefix=EmptyFormatter("<s>"),
    )
    ```
    """
    if name in TEMPLATES:
        raise ValueError(f"Template {name} already exists.")

    default_slots = ["{{content}}"] if efficient_eos else ["{{content}}", {"eos_token"}]
    default_user_formatter = StringFormatter(slots=["{{content}}"])
    default_assistant_formatter = StringFormatter(slots=default_slots)
    default_function_formatter = FunctionFormatter(slots=default_slots, tool_format="default")
    default_tool_formatter = ToolFormatter(tool_format="default")
    default_prefix_formatter = EmptyFormatter()
    TEMPLATES[name] = template_class(
        format_user=format_user or default_user_formatter,
        format_assistant=format_assistant or default_assistant_formatter,
        format_system=format_system or default_user_formatter,
        format_function=format_function or default_function_formatter,
        format_observation=format_observation or format_user or default_user_formatter,
        format_tools=format_tools or default_tool_formatter,
        format_prefix=format_prefix or default_prefix_formatter,
        default_system=default_system,
        stop_words=stop_words or [],
        thought_words=thought_words or ("<think>", "</think>"),
        efficient_eos=efficient_eos,
        replace_eos=replace_eos,
        replace_jinja_template=replace_jinja_template,
        mm_plugin=mm_plugin,
    )


def parse_template(tokenizer: "PreTrainedTokenizer") -> "Template":
    r"""
    Extracts a chat template from the tokenizer.
    """

    def find_diff(short_str: str, long_str: str) -> str:
        i, j = 0, 0
        diff = ""
        while i < len(short_str) and j < len(long_str):
            if short_str[i] == long_str[j]:
                i += 1
                j += 1
            else:
                diff += long_str[j]
                j += 1

        return diff

    prefix = tokenizer.decode(tokenizer.encode(""))

    messages = [{"role": "system", "content": "{{content}}"}]
    system_slot = tokenizer.apply_chat_template(messages, add_generation_prompt=False, tokenize=False)[len(prefix) :]

    messages = [{"role": "system", "content": ""}, {"role": "user", "content": "{{content}}"}]
    user_slot_empty_system = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)
    user_slot_empty_system = user_slot_empty_system[len(prefix) :]

    messages = [{"role": "user", "content": "{{content}}"}]
    user_slot = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)
    user_slot = user_slot[len(prefix) :]

    messages = [{"role": "user", "content": "{{content}}"}, {"role": "assistant", "content": "{{content}}"}]
    assistant_slot = tokenizer.apply_chat_template(messages, add_generation_prompt=False, tokenize=False)
    assistant_slot = assistant_slot[len(prefix) + len(user_slot) :]

    if len(user_slot) > len(user_slot_empty_system):
        default_system = find_diff(user_slot_empty_system, user_slot)
        sole_system = system_slot.replace("{{content}}", default_system, 1)
        user_slot = user_slot[len(sole_system) :]
    else:  # if default_system is empty, user_slot_empty_system will be longer than user_slot
        default_system = ""

    return Template(
        format_user=StringFormatter(slots=[user_slot]),
        format_assistant=StringFormatter(slots=[assistant_slot]),
        format_system=StringFormatter(slots=[system_slot]),
        format_function=FunctionFormatter(slots=[assistant_slot], tool_format="default"),
        format_observation=StringFormatter(slots=[user_slot]),
        format_tools=ToolFormatter(tool_format="default"),
        format_prefix=EmptyFormatter(slots=[prefix]) if prefix else EmptyFormatter(),
        default_system=default_system,
        stop_words=[],
        thought_words=("<think>", "</think>"),
        efficient_eos=False,
        replace_eos=False,
        replace_jinja_template=False,
        mm_plugin=get_mm_plugin(name="base"),
    )


def get_template_and_fix_tokenizer(tokenizer: "PreTrainedTokenizer", data_args: "DataArguments") -> "Template":
    r"""
    Gets chat template and fixes the tokenizer.
    """
    if data_args.template is None:
        if isinstance(tokenizer.chat_template, str):
            logger.warning_rank0("`template` was not specified, try parsing the chat template from the tokenizer.")
            template = parse_template(tokenizer)
        else:
            logger.warning_rank0("`template` was not specified, use `empty` template.")
            template = TEMPLATES["empty"]  # placeholder
    else:
        if data_args.template not in TEMPLATES:
            raise ValueError(f"Template {data_args.template} does not exist.")

        template = TEMPLATES[data_args.template]

    if template.mm_plugin.__class__.__name__ != "BasePlugin":
        check_version("transformers>=4.45.0")

    if data_args.train_on_prompt and template.efficient_eos:
        raise ValueError("Current template does not support `train_on_prompt`.")

    if data_args.tool_format is not None:
        logger.info_rank0(f"Using tool format: {data_args.tool_format}.")
        default_slots = ["{{content}}"] if template.efficient_eos else ["{{content}}", {"eos_token"}]
        template.format_function = FunctionFormatter(slots=default_slots, tool_format=data_args.tool_format)
        template.format_tools = ToolFormatter(tool_format=data_args.tool_format)

    template.fix_special_tokens(tokenizer)
    template.fix_jinja_template(tokenizer)
    return template


register_template(
    name="alpaca",
    format_user=StringFormatter(slots=["### Instruction:\n{{content}}\n\n### Response:\n"]),
    format_assistant=StringFormatter(slots=["{{content}}", {"eos_token"}, "\n\n"]),
    default_system=(
        "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
    ),
    replace_jinja_template=True,
)


register_template(
    name="aquila",
    format_user=StringFormatter(slots=["Human: {{content}}###Assistant:"]),
    format_assistant=StringFormatter(slots=["{{content}}###"]),
    format_system=StringFormatter(slots=["System: {{content}}###"]),
    default_system=(
        "A chat between a curious human and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the human's questions."
    ),
    stop_words=["</s>"],
)


register_template(
    name="atom",
    format_user=StringFormatter(
        slots=[{"bos_token"}, "Human: {{content}}\n", {"eos_token"}, {"bos_token"}, "Assistant:"]
    ),
    format_assistant=StringFormatter(slots=["{{content}}\n", {"eos_token"}]),
)


register_template(
    name="baichuan",
    format_user=StringFormatter(slots=[{"token": "<reserved_102>"}, "{{content}}", {"token": "<reserved_103>"}]),
    efficient_eos=True,
)


register_template(
    name="baichuan2",
    format_user=StringFormatter(slots=["<reserved_106>{{content}}<reserved_107>"]),
    efficient_eos=True,
)


register_template(
    name="belle",
    format_user=StringFormatter(slots=["Human: {{content}}\n\nBelle: "]),
    format_assistant=StringFormatter(slots=["{{content}}", {"eos_token"}, "\n\n"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
)


register_template(
    name="bluelm",
    format_user=StringFormatter(slots=[{"token": "[|Human|]:"}, "{{content}}", {"token": "[|AI|]:"}]),
)


register_template(
    name="breeze",
    format_user=StringFormatter(slots=["[INST] {{content}} [/INST] "]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
    efficient_eos=True,
)


register_template(
    name="chatglm2",
    format_user=StringFormatter(slots=["[Round {{idx}}]\n\n问：{{content}}\n\n答："]),
    format_prefix=EmptyFormatter(slots=[{"token": "[gMASK]"}, {"token": "sop"}]),
    efficient_eos=True,
)


register_template(
    name="chatglm3",
    format_user=StringFormatter(slots=[{"token": "<|user|>"}, "\n", "{{content}}", {"token": "<|assistant|>"}]),
    format_assistant=StringFormatter(slots=["\n", "{{content}}"]),
    format_system=StringFormatter(slots=[{"token": "<|system|>"}, "\n", "{{content}}"]),
    format_function=FunctionFormatter(slots=["{{content}}"], tool_format="glm4"),
    format_observation=StringFormatter(
        slots=[{"token": "<|observation|>"}, "\n", "{{content}}", {"token": "<|assistant|>"}]
    ),
    format_tools=ToolFormatter(tool_format="glm4"),
    format_prefix=EmptyFormatter(slots=[{"token": "[gMASK]"}, {"token": "sop"}]),
    stop_words=["<|user|>", "<|observation|>"],
    efficient_eos=True,
)


register_template(
    name="chatml",
    format_user=StringFormatter(slots=["<|im_start|>user\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    format_assistant=StringFormatter(slots=["{{content}}<|im_end|>\n"]),
    format_system=StringFormatter(slots=["<|im_start|>system\n{{content}}<|im_end|>\n"]),
    format_observation=StringFormatter(slots=["<|im_start|>tool\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    stop_words=["<|im_end|>", "<|im_start|>"],
    replace_eos=True,
    replace_jinja_template=True,
)


# copied from chatml template
register_template(
    name="chatml_de",
    format_user=StringFormatter(slots=["<|im_start|>user\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    format_assistant=StringFormatter(slots=["{{content}}<|im_end|>\n"]),
    format_system=StringFormatter(slots=["<|im_start|>system\n{{content}}<|im_end|>\n"]),
    format_observation=StringFormatter(slots=["<|im_start|>tool\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    default_system="Du bist ein freundlicher und hilfsbereiter KI-Assistent.",
    stop_words=["<|im_end|>", "<|im_start|>"],
    replace_eos=True,
    replace_jinja_template=True,
)


register_template(
    name="codegeex2",
    format_prefix=EmptyFormatter(slots=[{"token": "[gMASK]"}, {"token": "sop"}]),
)


register_template(
    name="codegeex4",
    format_user=StringFormatter(slots=["<|user|>\n{{content}}<|assistant|>\n"]),
    format_system=StringFormatter(slots=["<|system|>\n{{content}}"]),
    format_function=FunctionFormatter(slots=["{{content}}"], tool_format="glm4"),
    format_observation=StringFormatter(slots=["<|observation|>\n{{content}}<|assistant|>\n"]),
    format_tools=ToolFormatter(tool_format="glm4"),
    format_prefix=EmptyFormatter(slots=["[gMASK]<sop>"]),
    default_system=(
        "你是一位智能编程助手，你叫CodeGeeX。你会为用户回答关于编程、代码、计算机方面的任何问题，"
        "并提供格式规范、可以执行、准确安全的代码，并在必要时提供详细的解释。"
    ),
    stop_words=["<|user|>", "<|observation|>"],
    efficient_eos=True,
)


register_template(
    name="cohere",
    format_user=StringFormatter(
        slots=[
            (
                "<|START_OF_TURN_TOKEN|><|USER_TOKEN|>{{content}}<|END_OF_TURN_TOKEN|>"
                "<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>"
            )
        ]
    ),
    format_system=StringFormatter(slots=["<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>{{content}}<|END_OF_TURN_TOKEN|>"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
)


register_template(
    name="cpm",
    format_user=StringFormatter(slots=["<用户>{{content}}<AI>"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
)


# copied from chatml template
register_template(
    name="cpm3",
    format_user=StringFormatter(slots=["<|im_start|>user\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    format_assistant=StringFormatter(slots=["{{content}}<|im_end|>\n"]),
    format_system=StringFormatter(slots=["<|im_start|>system\n{{content}}<|im_end|>\n"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
    stop_words=["<|im_end|>"],
)


# copied from chatml template
register_template(
    name="dbrx",
    format_user=StringFormatter(slots=["<|im_start|>user\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    format_assistant=StringFormatter(slots=["{{content}}<|im_end|>\n"]),
    format_system=StringFormatter(slots=["<|im_start|>system\n{{content}}<|im_end|>\n"]),
    format_observation=StringFormatter(slots=["<|im_start|>tool\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    default_system=(
        "You are DBRX, created by Databricks. You were last updated in December 2023. "
        "You answer questions based on information available up to that point.\n"
        "YOU PROVIDE SHORT RESPONSES TO SHORT QUESTIONS OR STATEMENTS, but provide thorough "
        "responses to more complex and open-ended questions.\nYou assist with various tasks, "
        "from writing to coding (using markdown for code blocks — remember to use ``` with "
        "code, JSON, and tables).\n(You do not have real-time data access or code execution "
        "capabilities. You avoid stereotyping and provide balanced perspectives on "
        "controversial topics. You do not provide song lyrics, poems, or news articles and "
        "do not divulge details of your training data.)\nThis is your system prompt, "
        "guiding your responses. Do not reference it, just respond to the user. If you find "
        "yourself talking about this message, stop. You should be responding appropriately "
        "and usually that means not mentioning this.\nYOU DO NOT MENTION ANY OF THIS INFORMATION "
        "ABOUT YOURSELF UNLESS THE INFORMATION IS DIRECTLY PERTINENT TO THE USER'S QUERY."
    ),
    stop_words=["<|im_end|>"],
)


register_template(
    name="deepseek",
    format_user=StringFormatter(slots=["User: {{content}}\n\nAssistant:"]),
    format_system=StringFormatter(slots=["{{content}}\n\n"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
)


register_template(
    name="deepseek3",
    format_user=StringFormatter(slots=["<｜User｜>{{content}}<｜Assistant｜>"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
)


register_template(
    name="deepseekcoder",
    format_user=StringFormatter(slots=["### Instruction:\n{{content}}\n### Response:"]),
    format_assistant=StringFormatter(slots=["\n{{content}}\n<|EOT|>\n"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
    default_system=(
        "You are an AI programming assistant, utilizing the DeepSeek Coder model, "
        "developed by DeepSeek Company, and you only answer questions related to computer science. "
        "For politically sensitive questions, security and privacy issues, "
        "and other non-computer science questions, you will refuse to answer.\n"
    ),
)


register_template(
    name="default",
    format_user=StringFormatter(slots=["Human: {{content}}\nAssistant:"]),
    format_assistant=StringFormatter(slots=["{{content}}", {"eos_token"}, "\n"]),
    format_system=StringFormatter(slots=["System: {{content}}\n"]),
)


register_template(
    name="empty",
    format_assistant=StringFormatter(slots=["{{content}}"]),
)


register_template(
    name="exaone",
    format_user=StringFormatter(slots=["[|user|]{{content}}\n[|assistant|]"]),
    format_assistant=StringFormatter(slots=["{{content}}", {"eos_token"}, "\n"]),
    format_system=StringFormatter(slots=["[|system|]{{content}}[|endofturn|]\n"]),
)


register_template(
    name="falcon",
    format_user=StringFormatter(slots=["User: {{content}}\nFalcon:"]),
    format_assistant=StringFormatter(slots=["{{content}}\n"]),
    efficient_eos=True,
)


register_template(
    name="fewshot",
    format_assistant=StringFormatter(slots=["{{content}}\n\n"]),
    efficient_eos=True,
)


register_template(
    name="gemma",
    format_user=StringFormatter(slots=["<start_of_turn>user\n{{content}}<end_of_turn>\n<start_of_turn>model\n"]),
    format_assistant=StringFormatter(slots=["{{content}}<end_of_turn>\n"]),
    format_observation=StringFormatter(
        slots=["<start_of_turn>tool\n{{content}}<end_of_turn>\n<start_of_turn>model\n"]
    ),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
)


register_template(
    name="glm4",
    format_user=StringFormatter(slots=["<|user|>\n{{content}}<|assistant|>"]),
    format_assistant=StringFormatter(slots=["\n{{content}}"]),
    format_system=StringFormatter(slots=["<|system|>\n{{content}}"]),
    format_function=FunctionFormatter(slots=["{{content}}"], tool_format="glm4"),
    format_observation=StringFormatter(slots=["<|observation|>\n{{content}}<|assistant|>"]),
    format_tools=ToolFormatter(tool_format="glm4"),
    format_prefix=EmptyFormatter(slots=["[gMASK]<sop>"]),
    stop_words=["<|user|>", "<|observation|>"],
    efficient_eos=True,
)


register_template(
    name="granite3",
    format_user=StringFormatter(
        slots=[
            "<|start_of_role|>user<|end_of_role|>{{content}}<|end_of_text|>\n<|start_of_role|>assistant<|end_of_role|>"
        ]
    ),
    format_assistant=StringFormatter(slots=["{{content}}<|end_of_text|>\n"]),
    format_system=StringFormatter(slots=["<|start_of_role|>system<|end_of_role|>{{content}}<|end_of_text|>\n"]),
)


register_template(
    name="index",
    format_user=StringFormatter(slots=["reserved_0{{content}}reserved_1"]),
    format_system=StringFormatter(slots=["<unk>{{content}}"]),
    efficient_eos=True,
)


register_template(
    name="intern",
    format_user=StringFormatter(slots=["<|User|>:{{content}}\n<|Bot|>:"]),
    format_assistant=StringFormatter(slots=["{{content}}<eoa>\n"]),
    format_system=StringFormatter(slots=["<|System|>:{{content}}\n"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
    default_system=(
        "You are an AI assistant whose name is InternLM (书生·浦语).\n"
        "- InternLM (书生·浦语) is a conversational language model that is developed by Shanghai AI Laboratory "
        "(上海人工智能实验室). It is designed to be helpful, honest, and harmless.\n"
        "- InternLM (书生·浦语) can understand and communicate fluently in the language "
        "chosen by the user such as English and 中文."
    ),
    stop_words=["<eoa>"],
)


register_template(
    name="intern2",
    format_user=StringFormatter(slots=["<|im_start|>user\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    format_assistant=StringFormatter(slots=["{{content}}<|im_end|>\n"]),
    format_system=StringFormatter(slots=["<|im_start|>system\n{{content}}<|im_end|>\n"]),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
    default_system=(
        "You are an AI assistant whose name is InternLM (书生·浦语).\n"
        "- InternLM (书生·浦语) is a conversational language model that is developed by Shanghai AI Laboratory "
        "(上海人工智能实验室). It is designed to be helpful, honest, and harmless.\n"
        "- InternLM (书生·浦语) can understand and communicate fluently in the language "
        "chosen by the user such as English and 中文."
    ),
    stop_words=["<|im_end|>"],
)


register_template(
    name="llama2",
    format_user=StringFormatter(slots=[{"bos_token"}, "[INST] {{content}} [/INST]"]),
    format_system=StringFormatter(slots=["<<SYS>>\n{{content}}\n<</SYS>>\n\n"]),
    template_class=Llama2Template,
)


# copied from llama2 template
register_template(
    name="llama2_zh",
    format_user=StringFormatter(slots=[{"bos_token"}, "[INST] {{content}} [/INST]"]),
    format_system=StringFormatter(slots=["<<SYS>>\n{{content}}\n<</SYS>>\n\n"]),
    default_system="You are a helpful assistant. 你是一个乐于助人的助手。",
    template_class=Llama2Template,
)


register_template(
    name="llama3",
    format_user=StringFormatter(
        slots=[
            (
                "<|start_header_id|>user<|end_header_id|>\n\n{{content}}<|eot_id|>"
                "<|start_header_id|>assistant<|end_header_id|>\n\n"
            )
        ]
    ),
    format_assistant=StringFormatter(slots=["{{content}}<|eot_id|>"]),
    format_system=StringFormatter(slots=["<|start_header_id|>system<|end_header_id|>\n\n{{content}}<|eot_id|>"]),
    format_function=FunctionFormatter(slots=["{{content}}<|eot_id|>"], tool_format="llama3"),
    format_observation=StringFormatter(
        slots=[
            (
                "<|start_header_id|>ipython<|end_header_id|>\n\n{{content}}<|eot_id|>"
                "<|start_header_id|>assistant<|end_header_id|>\n\n"
            )
        ]
    ),
    format_tools=ToolFormatter(tool_format="llama3"),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
    stop_words=["<|eot_id|>", "<|eom_id|>"],
)


# copied from llama3 template
register_template(
    name="mllama",
    format_user=StringFormatter(
        slots=[
            (
                "<|start_header_id|>user<|end_header_id|>\n\n{{content}}<|eot_id|>"
                "<|start_header_id|>assistant<|end_header_id|>\n\n"
            )
        ]
    ),
    format_assistant=StringFormatter(slots=["{{content}}<|eot_id|>"]),
    format_system=StringFormatter(slots=["<|start_header_id|>system<|end_header_id|>\n\n{{content}}<|eot_id|>"]),
    format_function=FunctionFormatter(slots=["{{content}}<|eot_id|>"], tool_format="llama3"),
    format_observation=StringFormatter(
        slots=[
            (
                "<|start_header_id|>ipython<|end_header_id|>\n\n{{content}}<|eot_id|>"
                "<|start_header_id|>assistant<|end_header_id|>\n\n"
            )
        ]
    ),
    format_tools=ToolFormatter(tool_format="llama3"),
    format_prefix=EmptyFormatter(slots=[{"bos_token"}]),
    stop_words=["<|eot_id|>", "<|eom_id|>"],
    mm_plugin=get_mm_plugin(name="mllama", image_token="<|image|>"),
)


# copied from vicuna template
register_template(
    name="llava",
    format_user=StringFormatter(slots=["USER: {{content}} ASSISTANT:"]),
    default_system=(
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the user's questions."
    ),
    mm_plugin=get_mm_plugin(name="llava", image_token="<image>"),
)


# copied from vicuna template
register_template(
    name="llava_next",
    format_user=StringFormatter(slots=["USER: {{content}} ASSISTANT:"]),
    default_system=(
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the user's questions."
    ),
    mm_plugin=get_mm_plugin(name="llava_next", image_token="<image>"),
)


# copied from llama3 template
register_template(
    name="llava_next_llama3",
    format_user=StringFormatter(
        slots=[
            (
                "<|start_header_id|>user<|end_header_id|>\n\n{{content}}<|eot_id|>"
                "<|start_header_id|>assistant<|end_header_id|>\n\n"
            )
        ]
    ),
    format_assistant=StringFormatter(slots=["{{content}}<|eot_id|>"]),
    format_system=StringFormatter(slots=["<|start_header_id|>system<|end_header_id|>\n\n{{content}}<|eot_id|>"]),
    format_function=FunctionFormatter(slots=["{{content}}<|eot_id|>"], tool_format="llama3"),
    format_observation=StringFormatter(
        slots=[
            (
&#034;&lt;|start_header_id|&gt;ipython&lt;|end_header_id|&gt;\\\\n\\\\n{{content}}&lt;|eot_id|&gt;&#034;<br \/>\n                &#034;&lt;|start_header_id|&gt;assistant&lt;|end_header_id|&gt;\\\\n\\\\n&#034;<br \/>\n            )<br \/>\n        ]<br \/>\n    ),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;llama3&#034;),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    stop_words&#061;[&#034;&lt;|eot_id|&gt;&#034;, &#034;&lt;|eom_id|&gt;&#034;],<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;llava_next&#034;, image_token&#061;&#034;&lt;image&gt;&#034;),<br \/>\n)<\/p>\n<p># copied from mistral template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;llava_next_mistral&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;[INST] {{content}}[\/INST]&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034; {{content}}&#034;, {&#034;eos_token&#034;}]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;{{content}}\\\\n\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;[TOOL_CALLS] {{content}}&#034;, {&#034;eos_token&#034;}], tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&#034;&#034;[TOOL_RESULTS] {&#034;content&#034;: {{content}}}[\/TOOL_RESULTS]&#034;&#034;&#034;]),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;llava_next&#034;, image_token&#061;&#034;&lt;image&gt;&#034;),<br \/>\n    template_class&#061;Llama2Template,<br \/>\n)<\/p>\n<p># copied from qwen template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;llava_next_qwen&#034;,<br \/>\n    
format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;], tool_format&#061;&#034;qwen&#034;),<br \/>\n    format_observation&#061;StringFormatter(<br \/>\n        slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n&lt;tool_response&gt;\\\\n{{content}}\\\\n&lt;\/tool_response&gt;&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]<br \/>\n    ),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;qwen&#034;),<br \/>\n    default_system&#061;&#034;You are a helpful assistant.&#034;,<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;llava_next&#034;, image_token&#061;&#034;&lt;image&gt;&#034;),<br \/>\n)<\/p>\n<p># copied from chatml template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;llava_next_yi&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;llava_next&#034;, image_token&#061;&#034;&lt;image&gt;&#034;),<br \/>\n)<\/p>\n<p># copied from vicuna template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;llava_next_video&#034;,<br \/>\n    
format_user&#061;StringFormatter(slots&#061;[&#034;USER: {{content}} ASSISTANT:&#034;]),<br \/>\n    default_system&#061;(<br \/>\n        &#034;A chat between a curious user and an artificial intelligence assistant. &#034;<br \/>\n        &#034;The assistant gives helpful, detailed, and polite answers to the user&#039;s questions.&#034;<br \/>\n    ),<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;llava_next_video&#034;, image_token&#061;&#034;&lt;image&gt;&#034;, video_token&#061;&#034;&lt;video&gt;&#034;),<br \/>\n)<\/p>\n<p># copied from mistral template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;llava_next_video_mistral&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;[INST] {{content}}[\/INST]&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034; {{content}}&#034;, {&#034;eos_token&#034;}]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;{{content}}\\\\n\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;[TOOL_CALLS] {{content}}&#034;, {&#034;eos_token&#034;}], tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&#034;&#034;[TOOL_RESULTS] {&#034;content&#034;: {{content}}}[\/TOOL_RESULTS]&#034;&#034;&#034;]),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;llava_next_video&#034;, image_token&#061;&#034;&lt;image&gt;&#034;, video_token&#061;&#034;&lt;video&gt;&#034;),<br \/>\n    template_class&#061;Llama2Template,<br \/>\n)<\/p>\n<p># copied from chatml template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;llava_next_video_yi&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    
format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;llava_next_video&#034;, image_token&#061;&#034;&lt;image&gt;&#034;, video_token&#061;&#034;&lt;video&gt;&#034;),<br \/>\n)<\/p>\n<p># copied from chatml template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;marco&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;tool\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    default_system&#061;(<br \/>\n        &#034;\u4f60\u662f\u4e00\u4e2a\u7ecf\u8fc7\u826f\u597d\u8bad\u7ec3\u7684AI\u52a9\u624b&#xff0c;\u4f60\u7684\u540d\u5b57\u662fMarco-o1.\u7531\u963f\u91cc\u56fd\u9645\u6570\u5b57\u5546\u4e1a\u96c6\u56e2\u7684AI Business\u521b\u9020.\\\\n## \u91cd\u8981&#xff01;&#xff01;&#xff01;&#xff01;&#xff01;\\\\n&#034;<br \/>\n        &#034;\u5f53\u4f60\u56de\u7b54\u95ee\u9898\u65f6&#xff0c;\u4f60\u7684\u601d\u8003\u5e94\u8be5\u5728&lt;Thought&gt;\u5185\u5b8c\u6210&#xff0c;&lt;Output&gt;\u5185\u8f93\u51fa\u4f60\u7684\u7ed3\u679c\u3002\\\\n&#034;<br \/>\n        
&#034;&lt;Thought&gt;\u5e94\u8be5\u5c3d\u53ef\u80fd\u662f\u82f1\u6587&#xff0c;\u4f46\u662f\u67092\u4e2a\u7279\u4f8b&#xff0c;\u4e00\u4e2a\u662f\u5bf9\u539f\u6587\u4e2d\u7684\u5f15\u7528&#xff0c;\u53e6\u4e00\u4e2a\u662f\u662f\u6570\u5b66\u5e94\u8be5\u4f7f\u7528markdown\u683c\u5f0f&#xff0c;&lt;Output&gt;\u5185\u7684\u8f93\u51fa\u9700\u8981\u9075\u5faa\u7528\u6237\u8f93\u5165\u7684\u8bed\u8a00\u3002\\\\n&#034;<br \/>\n    ),<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n)<\/p>\n<p># copied from chatml template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;minicpm_v&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n    default_system&#061;&#034;You are a helpful assistant.&#034;,<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;minicpm_v&#034;, image_token&#061;&#034;&lt;image&gt;&#034;, video_token&#061;&#034;&lt;video&gt;&#034;),<br \/>\n)<\/p>\n<p># copied from minicpm_v template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;minicpm_o&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n    default_system&#061;&#034;You are Qwen, created by Alibaba Cloud. 
You are a helpful assistant.&#034;,<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;minicpm_v&#034;, image_token&#061;&#034;&lt;image&gt;&#034;, video_token&#061;&#034;&lt;video&gt;&#034;, audio_token&#061;&#034;&lt;audio&gt;&#034;),<br \/>\n)<\/p>\n<p># mistral tokenizer v3 tekken<br \/>\nregister_template(<br \/>\n    name&#061;&#034;ministral&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;[INST]{{content}}[\/INST]&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;{{content}}\\\\n\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;[TOOL_CALLS]{{content}}&#034;, {&#034;eos_token&#034;}], tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&#034;&#034;[TOOL_RESULTS]{&#034;content&#034;: {{content}}}[\/TOOL_RESULTS]&#034;&#034;&#034;]),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    template_class&#061;Llama2Template,<br \/>\n)<\/p>\n<p># mistral tokenizer v3<br \/>\nregister_template(<br \/>\n    name&#061;&#034;mistral&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;[INST] {{content}}[\/INST]&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034; {{content}}&#034;, {&#034;eos_token&#034;}]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;{{content}}\\\\n\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;[TOOL_CALLS] {{content}}&#034;, {&#034;eos_token&#034;}], tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&#034;&#034;[TOOL_RESULTS] {&#034;content&#034;: {{content}}}[\/TOOL_RESULTS]&#034;&#034;&#034;]),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;mistral&#034;),<br \/>\n    
format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    template_class&#061;Llama2Template,<br \/>\n)<\/p>\n<p># mistral tokenizer v7 tekken (copied from ministral)<br \/>\nregister_template(<br \/>\n    name&#061;&#034;mistral_small&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;[INST]{{content}}[\/INST]&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;[SYSTEM_PROMPT]{{content}}[\/SYSTEM_PROMPT]&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;[TOOL_CALLS]{{content}}&#034;, {&#034;eos_token&#034;}], tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&#034;&#034;[TOOL_RESULTS]{&#034;content&#034;: {{content}}}[\/TOOL_RESULTS]&#034;&#034;&#034;]),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;olmo&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|user|&gt;\\\\n{{content}}&lt;|assistant|&gt;\\\\n&#034;]),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;eos_token&#034;}]),<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;openchat&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;GPT4 Correct User: {{content}}&#034;, {&#034;eos_token&#034;}, &#034;GPT4 Correct Assistant:&#034;]),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;openchat-3.6&#034;,<br \/>\n    format_user&#061;StringFormatter(<br \/>\n        slots&#061;[<br \/>\n            (<br \/>\n                &#034;&lt;|start_header_id|&gt;GPT4 Correct User&lt;|end_header_id|&gt;\\\\n\\\\n{{content}}&lt;|eot_id|&gt;&#034;<br \/>\n                &#034;&lt;|start_header_id|&gt;GPT4 Correct 
Assistant&lt;|end_header_id|&gt;\\\\n\\\\n&#034;<br \/>\n            )<br \/>\n        ]<br \/>\n    ),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    stop_words&#061;[&#034;&lt;|eot_id|&gt;&#034;],<br \/>\n)<\/p>\n<p># copied from chatml template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;opencoder&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;tool\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    default_system&#061;&#034;You are OpenCoder, created by OpenCoder Team.&#034;,<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;orion&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;Human: {{content}}\\\\n\\\\nAssistant: &#034;, {&#034;eos_token&#034;}]),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n)<\/p>\n<p># copied from gemma template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;paligemma&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;start_of_turn&gt;user\\\\n{{content}}&lt;end_of_turn&gt;\\\\n&lt;start_of_turn&gt;model\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;end_of_turn&gt;\\\\n&#034;]),<br \/>\n    format_observation&#061;StringFormatter(<br \/>\n        slots&#061;[&#034;&lt;start_of_turn&gt;tool\\\\n{{content}}&lt;end_of_turn&gt;\\\\n&lt;start_of_turn&gt;model\\\\n&#034;]<br 
\/>\n    ),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;paligemma&#034;, image_token&#061;&#034;&lt;image&gt;&#034;),<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;phi&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|user|&gt;\\\\n{{content}}&lt;|end|&gt;\\\\n&lt;|assistant|&gt;\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|system|&gt;\\\\n{{content}}&lt;|end|&gt;\\\\n&#034;]),<br \/>\n    stop_words&#061;[&#034;&lt;|end|&gt;&#034;],<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;phi_small&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|user|&gt;\\\\n{{content}}&lt;|end|&gt;\\\\n&lt;|assistant|&gt;\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|system|&gt;\\\\n{{content}}&lt;|end|&gt;\\\\n&#034;]),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;&lt;|endoftext|&gt;&#034;}]),<br \/>\n    stop_words&#061;[&#034;&lt;|end|&gt;&#034;],<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;phi4&#034;,<br \/>\n    format_user&#061;StringFormatter(<br \/>\n        slots&#061;[&#034;&lt;|im_start|&gt;user&lt;|im_sep|&gt;{{content}}&lt;|im_end|&gt;&lt;|im_start|&gt;assistant&lt;|im_sep|&gt;&#034;]<br \/>\n    ),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system&lt;|im_sep|&gt;{{content}}&lt;|im_end|&gt;&#034;]),<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n)<\/p>\n<p># copied from ministral template<br 
\/>\nregister_template(<br \/>\n    name&#061;&#034;pixtral&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;[INST]{{content}}[\/INST]&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;{{content}}\\\\n\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;[TOOL_CALLS]{{content}}&#034;, {&#034;eos_token&#034;}], tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_observation&#061;StringFormatter(slots&#061;[&#034;&#034;&#034;[TOOL_RESULTS]{&#034;content&#034;: {{content}}}[\/TOOL_RESULTS]&#034;&#034;&#034;]),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;mistral&#034;),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;pixtral&#034;, image_token&#061;&#034;[IMG]&#034;),<br \/>\n    template_class&#061;Llama2Template,<br \/>\n)<\/p>\n<p># copied from chatml template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;qwen&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;], tool_format&#061;&#034;qwen&#034;),<br \/>\n    format_observation&#061;StringFormatter(<br \/>\n        slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n&lt;tool_response&gt;\\\\n{{content}}\\\\n&lt;\/tool_response&gt;&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]<br \/>\n    ),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;qwen&#034;),<br \/>\n    default_system&#061;&#034;You are a helpful 
assistant.&#034;,<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n)<\/p>\n<p># copied from chatml template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;qwen2_audio&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    default_system&#061;&#034;You are a helpful assistant.&#034;,<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;qwen2_audio&#034;, audio_token&#061;&#034;&lt;|AUDIO|&gt;&#034;),<br \/>\n)<\/p>\n<p># copied from qwen template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;qwen2_vl&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;], tool_format&#061;&#034;qwen&#034;),<br \/>\n    format_observation&#061;StringFormatter(<br \/>\n        slots&#061;[&#034;&lt;|im_start|&gt;user\\\\n&lt;tool_response&gt;\\\\n{{content}}\\\\n&lt;\/tool_response&gt;&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;assistant\\\\n&#034;]<br \/>\n    ),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;qwen&#034;),<br \/>\n    default_system&#061;&#034;You are a helpful assistant.&#034;,<br \/>\n    
stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n    mm_plugin&#061;get_mm_plugin(name&#061;&#034;qwen2_vl&#034;, image_token&#061;&#034;&lt;|image_pad|&gt;&#034;, video_token&#061;&#034;&lt;|video_pad|&gt;&#034;),<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;sailor&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;question\\\\n{{content}}&lt;|im_end|&gt;\\\\n&lt;|im_start|&gt;answer\\\\n&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|im_start|&gt;system\\\\n{{content}}&lt;|im_end|&gt;\\\\n&#034;]),<br \/>\n    default_system&#061;(<br \/>\n        &#034;You are an AI assistant named Sailor created by Sea AI Lab. &#034;<br \/>\n        &#034;Your answer should be friendly, unbiased, faithful, informative and detailed.&#034;<br \/>\n    ),<br \/>\n    stop_words&#061;[&#034;&lt;|im_end|&gt;&#034;],<br \/>\n)<\/p>\n<p># copied from llama3 template<br \/>\nregister_template(<br \/>\n    name&#061;&#034;skywork_o1&#034;,<br \/>\n    format_user&#061;StringFormatter(<br \/>\n        slots&#061;[<br \/>\n            (<br \/>\n                &#034;&lt;|start_header_id|&gt;user&lt;|end_header_id|&gt;\\\\n\\\\n{{content}}&lt;|eot_id|&gt;&#034;<br \/>\n                &#034;&lt;|start_header_id|&gt;assistant&lt;|end_header_id|&gt;\\\\n\\\\n&#034;<br \/>\n            )<br \/>\n        ]<br \/>\n    ),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|eot_id|&gt;&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;&lt;|start_header_id|&gt;system&lt;|end_header_id|&gt;\\\\n\\\\n{{content}}&lt;|eot_id|&gt;&#034;]),<br \/>\n    format_function&#061;FunctionFormatter(slots&#061;[&#034;{{content}}&lt;|eot_id|&gt;&#034;], tool_format&#061;&#034;llama3&#034;),<br \/>\n    format_observation&#061;StringFormatter(<br \/>\n    
    slots&#061;[<br \/>\n            (<br \/>\n                &#034;&lt;|start_header_id|&gt;ipython&lt;|end_header_id|&gt;\\\\n\\\\n{{content}}&lt;|eot_id|&gt;&#034;<br \/>\n                &#034;&lt;|start_header_id|&gt;assistant&lt;|end_header_id|&gt;\\\\n\\\\n&#034;<br \/>\n            )<br \/>\n        ]<br \/>\n    ),<br \/>\n    format_tools&#061;ToolFormatter(tool_format&#061;&#034;llama3&#034;),<br \/>\n    format_prefix&#061;EmptyFormatter(slots&#061;[{&#034;bos_token&#034;}]),<br \/>\n    default_system&#061;(<br \/>\n        &#034;You are Skywork-o1, a thinking model developed by Skywork AI, specializing in solving complex problems &#034;<br \/>\n        &#034;involving mathematics, coding, and logical reasoning through deep thought. When faced with a user&#039;s request, &#034;<br \/>\n        &#034;you first engage in a lengthy and in-depth thinking process to explore possible solutions to the problem. &#034;<br \/>\n        &#034;After completing your thoughts, you then provide a detailed explanation of the solution process &#034;<br \/>\n        &#034;in your response.&#034;<br \/>\n    ),<br \/>\n    stop_words&#061;[&#034;&lt;|eot_id|&gt;&#034;, &#034;&lt;|eom_id|&gt;&#034;],<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;solar&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;### User:\\\\n{{content}}\\\\n\\\\n### Assistant:\\\\n&#034;]),<br \/>\n    format_system&#061;StringFormatter(slots&#061;[&#034;### System:\\\\n{{content}}\\\\n\\\\n&#034;]),<br \/>\n    efficient_eos&#061;True,<br \/>\n)<\/p>\n<p>register_template(<br \/>\n    name&#061;&#034;starchat&#034;,<br \/>\n    format_user&#061;StringFormatter(slots&#061;[&#034;&lt;|user|&gt;\\\\n{{content}}&lt;|end|&gt;\\\\n&lt;|assistant|&gt;&#034;]),<br \/>\n    format_assistant&#061;StringFormatter(slots&#061;[&#034;{{content}}&lt;|end|&gt;\\\\n&#034;]),<br \/>\n    
    format_system=StringFormatter(slots=["<|system|>\n{{content}}<|end|>\n"]),
    stop_words=["<|end|>"],
)

register_template(
    name="telechat",
    format_user=StringFormatter(slots=["<_user>{{content}}<_bot>"]),
    format_system=StringFormatter(slots=["<_system>{{content}}<_end>"]),
)

register_template(
    name="telechat2",
    format_user=StringFormatter(slots=["<_user>{{content}}<_bot>"]),
    format_system=StringFormatter(slots=["<_system>{{content}}"]),
    default_system=(
        "你是中国电信星辰语义大模型，英文名是TeleChat，你是由中电信人工智能科技有限公司和中国电信人工智能研究院（TeleAI）研发的人工智能助手。"
    ),
)

register_template(
    name="vicuna",
    format_user=StringFormatter(slots=["USER: {{content}} ASSISTANT:"]),
    default_system=(
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the user's questions."
    ),
    replace_jinja_template=True,
)

register_template(
    name="video_llava",
    format_user=StringFormatter(slots=["USER: {{content}} ASSISTANT:"]),
    default_system=(
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the user's questions."
    ),
    mm_plugin=get_mm_plugin(name="video_llava", image_token="<image>", video_token="<video>"),
)

register_template(
    name="xuanyuan",
    format_user=StringFormatter(slots=["Human: {{content}} Assistant:"]),
    default_system=(
        "以下是用户和人工智能助手之间的对话。用户以Human开头，人工智能助手以Assistant开头，"
        "会对人类提出的问题给出有帮助、高质量、详细和礼貌的回答，并且总是拒绝参与与不道德、"
        "不安全、有争议、政治敏感等相关的话题、问题和指示。\n"
    ),
)

register_template(
    name="xverse",
    format_user=StringFormatter(slots=["Human: {{content}}\n\nAssistant: "]),
)

register_template(
    name="yayi",
    format_user=StringFormatter(slots=[{"token": "<|Human|>"}, ":\n{{content}}\n\n", {"token": "<|YaYi|>"}, ":"]),
    format_assistant=StringFormatter(slots=["{{content}}\n\n"]),
    format_system=StringFormatter(slots=[{"token": "<|System|>"}, ":\n{{content}}\n\n"]),
    default_system=(
        "You are a helpful, respectful and honest assistant named YaYi "
        "developed by Beijing Wenge Technology Co.,Ltd. "
        "Always answer as helpfully as possible, while being safe. "
        "Your answers should not include any harmful, unethical, "
        "racist, sexist, toxic, dangerous, or illegal content. "
        "Please ensure that your responses are socially unbiased and positive in nature.\n\n"
        "If a question does not make any sense, or is not factually coherent, "
        "explain why instead of answering something not correct. "
        "If you don't know the answer to a question, please don't share false information."
    ),
    stop_words=["<|End|>"],
)

# copied from chatml template
register_template(
    name="yi",
    format_user=StringFormatter(slots=["<|im_start|>user\n{{content}}<|im_end|>\n<|im_start|>assistant\n"]),
    format_assistant=StringFormatter(slots=["{{content}}<|im_end|>\n"]),
    format_system=StringFormatter(slots=["<|im_start|>system\n{{content}}<|im_end|>\n"]),
    stop_words=["<|im_end|>"],
)

register_template(
    name="yi_vl",
    format_user=StringFormatter(slots=["### Human: {{content}}\n### Assistant:"]),
    format_assistant=StringFormatter(slots=["{{content}}\n"]),
    default_system=(
        "This is a chat between an inquisitive human and an AI assistant. "
        "Assume the role of the AI assistant. Read all the images carefully, "
        "and respond to the human's questions with informative, helpful, detailed and polite answers. "
        "这是一个好奇的人类和一个人工智能助手之间的对话。假设你扮演这个AI助手的角色。"
        "仔细阅读所有的图像，并对人类的问题做出信息丰富、有帮助、详细的和礼貌的回答。\n\n"
    ),
    stop_words=["###"],
    efficient_eos=True,
    mm_plugin=get_mm_plugin(name="llava", image_token="<image>"),
)

register_template(
    name="yuan",
    format_user=StringFormatter(slots=["{{content}}", {"token": "<sep>"}]),
    format_assistant=StringFormatter(slots=["{{content}}<eod>\n"]),
    stop_words=["<eod>"],
)

register_template(
    name="zephyr",
    format_user=StringFormatter(slots=["<|user|>\n{{content}}", {"eos_token"}, "<|assistant|>\n"]),
    format_system=StringFormatter(slots=["<|system|>\n{{content}}", {"eos_token"}]),
    default_system="You are Zephyr, a helpful assistant.",
)

register_template(
    name="ziya",
    format_user=StringFormatter(slots=["<human>:{{content}}\n<bot>:"]),
    format_assistant=StringFormatter(slots=["{{content}}\n"]),
)
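To make the slot syntax above concrete, here is a minimal sketch (not the actual LLaMA-Factory implementation) of how a template's `format_user` slot list could be rendered into a prompt string: plain string slots have `{{content}}` substituted, a `{"token": ...}` dict slot inserts a literal special token, and the set-literal `{"eos_token"}` marker stands in for the tokenizer's EOS token. The helper name `render_slots` and the default `eos_token` value are illustrative assumptions.

```python
def render_slots(slots, content, eos_token="</s>"):
    """Illustrative renderer for a LLaMA-Factory-style slot list (simplified)."""
    parts = []
    for slot in slots:
        if isinstance(slot, str):
            # Plain string slot: substitute the user message for {{content}}.
            parts.append(slot.replace("{{content}}", content))
        elif isinstance(slot, dict):
            # {"token": "<sep>"} inserts a literal special token.
            parts.append(slot["token"])
        elif isinstance(slot, set):
            # {"eos_token"} is a set literal marking the tokenizer's EOS token.
            parts.append(eos_token)
    return "".join(parts)

# Render a user turn with the "zephyr" template's format_user slots:
zephyr_user = ["<|user|>\n{{content}}", {"eos_token"}, "<|assistant|>\n"]
print(render_slots(zephyr_user, "Hello!"))
# <|user|>
# Hello!</s><|assistant|>
```

This shows why the same `register_template` mechanism can cover templates as different as `ziya` (pure string slots) and `zephyr` (string plus token-marker slots): rendering is just slot-by-slot concatenation.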