{"id":61768,"date":"2026-01-18T15:05:38","date_gmt":"2026-01-18T07:05:38","guid":{"rendered":"https:\/\/www.wsisp.com\/helps\/61768.html"},"modified":"2026-01-18T15:05:38","modified_gmt":"2026-01-18T07:05:38","slug":"llama-factory%e4%bd%bf%e7%94%a8","status":"publish","type":"post","link":"https:\/\/www.wsisp.com\/helps\/61768.html","title":{"rendered":"Using LLaMA-Factory"},"content":{"rendered":"<h4>Contents<\/h4>\n<ul>\n<li>I. Introduction to LLaMA-Factory<\/li>\n<li>II. Installing LLaMA-Factory<\/li>\n<li>III. Preparing the training data<\/li>\n<li>IV. Model training<\/li>\n<li>\n<ul>\n<li>1. Downloading the model<\/li>\n<li>2. Full-parameter fine-tuning<\/li>\n<li>3. LoRA fine-tuning<\/li>\n<li>4. QLoRA fine-tuning<\/li>\n<\/ul>\n<\/li>\n<li>V. Merging model weights<\/li>\n<li>\n<ul>\n<li>1. Merging the model<\/li>\n<li>2. Testing<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<h2>I. Introduction to LLaMA-Factory<\/h2>\n<p>LLaMA-Factory is an easy-to-use, efficient training framework for large language models that supports over a hundred models. Its main features include:<\/p>\n<ul>\n<li>Model families: LLaMA, LLaVA, Mistral, Mixtral-MoE, Qwen, Yi, Gemma, Baichuan, ChatGLM, Phi, and more.<\/li>\n<li>Training methods: (continual) pre-training, (multimodal) supervised instruction fine-tuning, reward model training, PPO training, DPO training, KTO training, ORPO training, and more.<\/li>\n<li>Precision options: 16-bit full-parameter fine-tuning, freeze tuning, LoRA fine-tuning, and 2\/3\/4\/5\/6\/8-bit QLoRA fine-tuning based on AQLM\/AWQ\/GPTQ\/LLM.int8\/HQQ\/EETQ.<\/li>\n<li>Advanced algorithms: GaLore, BAdam, DoRA, LongLoRA, LLaMA Pro, Mixture-of-Depths, LoRA+, LoftQ, and PiSSA.<\/li>\n<li>Speed-up kernels: FlashAttention-2 and Unsloth.<\/li>\n<li>Inference engines: Transformers and vLLM.<\/li>\n<li>Experiment trackers: LlamaBoard, TensorBoard, Wandb, MLflow, and more.<\/li>\n<\/ul>\n<p>This article shows how to fine-tune the Qwen2.5 family of models with LLaMA-Factory (the Qwen1.5 family works the same way). For more features, see https:\/\/github.com\/hiyouga\/LlamaFactory<\/p>\n<h2>II. Installing LLaMA-Factory<\/h2>\n<p>The GitHub repository for LLaMA-Factory is https:\/\/github.com\/hiyouga\/LLaMA-Factory . To avoid version incompatibilities introduced by upstream updates, we install a pinned historical version below.<\/p>\n<ul>\n<li>Cloning Git repositories on AutoDL can be slow; you can run the following command first.<\/li>\n<\/ul>\n<p>source \/etc\/network_turbo<\/p>\n<ul>\n<li>Download and install LLaMA-Factory:<\/li>\n<\/ul>\n<p>cd \/root\/autodl-tmp<br \/>\ngit clone --depth 1 https:\/\/github.com\/Jiangnanjiezi\/LLaMA-Factory.git<br \/>\ncd LLaMA-Factory<br \/>\npip install -e &#034;.[torch,metrics]&#034; -i https:\/\/mirrors.aliyun.com\/pypi\/simple\/<\/p>\n<ul>\n<li>After the installation completes, run llamafactory-cli 
version; if the following output appears, the installation succeeded: <img decoding=\"async\" src=\"https:\/\/www.wsisp.com\/helps\/wp-content\/uploads\/2026\/01\/20260118070537-696c8641213c7.png\" alt=\"llamafactory-cli version output\" \/><\/li>\n<\/ul>\n<h2>III. Preparing the training data<\/h2>\n<p>The training data should be saved as a JSON file named qwen_dataset.json and placed under autodl-tmp\/LLaMA-Factory\/data.<\/p>\n<p>Its contents look like this:<\/p>\n<p>[<br \/>\n  {<br \/>\n    &#034;instruction&#034;: &#034;Extract the key points from the following text&#034;,<br \/>\n    &#034;input&#034;: &#034;Five ways to stay healthy:\\\\n\\\\n1. Drink at least 8 glasses of water a day to support your metabolism\\\\n2. Do 150 minutes of moderate exercise per week, such as brisk walking or swimming\\\\n3. Get 7-9 hours of quality sleep and avoid staying up late\\\\n4. Eat more vegetables and fruit and less fried food\\\\n5. Get regular checkups to monitor blood pressure, blood sugar, and other indicators&#034;,<br \/>\n    &#034;output&#034;: &#034;Drink more water, exercise regularly, sleep enough, eat a balanced diet, get regular checkups&#034;<br \/>\n  },<br \/>\n  {<br \/>\n    &#034;instruction&#034;: &#034;Extract the key points from the following text&#034;,<br \/>\n    &#034;input&#034;: &#034;Three tips for studying more efficiently:\\\\n\\\\n1. Use the Pomodoro technique: focus for 25 minutes, then rest for 5\\\\n2. Build mind maps to organize what you learn\\\\n3. Review the key material before bed to reinforce memory&#034;,<br \/>\n    &#034;output&#034;: &#034;Pomodoro technique, mind maps, pre-sleep review&#034;<br \/>\n  },<br \/>\n  {<br \/>\n    &#034;instruction&#034;: &#034;Extract the key points from the following text&#034;,<br \/>\n    &#034;input&#034;: &#034;Travel packing checklist:\\\\n1. Passport\/ID originals and copies\\\\n2. Power bank and plug adapters\\\\n3. Common medicines (fever reducers, band-aids)\\\\n4. A light folding umbrella\\\\n5. Travel-size toiletries&#034;,<br \/>\n    &#034;output&#034;: &#034;Documents, charging gear, medicines, rain gear, toiletry kit&#034;<br \/>\n  },<br \/>\n  {<br \/>\n    &#034;instruction&#034;: &#034;Extract the key points from the following text&#034;,<br \/>\n    &#034;input&#034;: &#034;Four principles of workplace communication:\\\\n\u2460 Set a clear communication goal\\\\n\u2461 Use a pyramid structure when speaking\\\\n\u2462 Watch non-verbal signals (eye contact\/posture)\\\\n\u2463 Confirm in time that the message was understood&#034;,<br \/>\n    &#034;output&#034;: &#034;Clear goals, structured expression, non-verbal communication, confirmation&#034;<br \/>\n  }<br \/>\n]<\/p>\n<p>Register the custom training data in the data\/dataset_info.json file under the LLaMA-Factory folder by adding the following entry:<\/p>\n<p>  &#034;qwen_dataset&#034;: {<br \/>\n    &#034;file_name&#034;: &#034;qwen_dataset.json&#034;<br \/>\n  },<\/p>\n<h2>IV. Model training<\/h2>\n<h3>1. Downloading the model<\/h3>\n<ul>\n<li>Install modelscope<\/li>\n<\/ul>\n<p>pip install modelscope<\/p>\n<ul>\n<li>Download Qwen2.5<\/li>\n<\/ul>\n<p>mkdir -p \/root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2.5-7B<br \/>\ncd \/root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2.5-7B<\/p>\n<p># Download the model<br \/>\n# modelscope download --model Qwen\/Qwen2.5-7B --local_dir .\/<br \/>\n# The 7B model is slow to download and fine-tuning it needs a lot of VRAM, so we use the 1.5B model for this demo<br \/>\nmodelscope download --model Qwen\/Qwen2.5-1.5B --local_dir .\/<\/p>\n<h3>2. 
Full-parameter fine-tuning<\/h3>\n<p>In the LLaMA-Factory folder, create a qwen2.5-7b-full-sft.yaml configuration file that holds the settings for full-parameter training.<\/p>\n<p>### Model configuration<br \/>\n# Local path of the pretrained model, or a HuggingFace model ID<br \/>\nmodel_name_or_path: \/root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2.5-7B<br \/>\n# Must be enabled to load models that ship custom code (such as Qwen\/ChatGLM)<br \/>\ntrust_remote_code: true  <\/p>\n<p>### Method configuration<br \/>\n# Fine-tuning stage: supervised fine-tuning (SFT)<br \/>\nstage: sft<br \/>\n# Whether to run the training phase<br \/>\ndo_train: true<br \/>\n# Fine-tuning type: full-parameter fine-tuning (options: full\/freeze\/lora)<br \/>\nfinetuning_type: full<br \/>\n# Path to the DeepSpeed configuration file (using the ZeRO Stage 3 strategy)<br \/>\ndeepspeed: \/root\/autodl-tmp\/LLaMA-Factory\/examples\/deepspeed\/ds_z3_config.json<\/p>\n<p>### Dataset configuration<br \/>\n# Name of the dataset to use (must match a name registered under the data directory)<br \/>\ndataset: qwen_dataset<br \/>\n# Prompt template to use (must match the model architecture)<br \/>\ntemplate: qwen<br \/>\n# Maximum input sequence length (in tokens)<br \/>\ncutoff_len: 1024<br \/>\n# Whether to overwrite existing cache files (recommended after changing the dataset)<br \/>\noverwrite_cache: true<br \/>\n# Number of parallel preprocessing workers (50-70% of CPU cores is a good default)<br \/>\npreprocessing_num_workers: 16<\/p>\n<p>### Output configuration<br \/>\n# Output directory for checkpoints and logs<br \/>\noutput_dir: saves\/qwen2.5-7b\/full<br \/>\n# Log every this many training steps<br \/>\nlogging_steps: 10<br \/>\n# Save a checkpoint every this many training steps<br \/>\nsave_steps: 100<br \/>\n# Whether to plot the training-loss curve<br \/>\nplot_loss: true<br \/>\n# Whether to overwrite an existing output directory (recommended for a fresh run)<br \/>\noverwrite_output_dir: true<\/p>\n<p>### Training parameters<br \/>\n# Per-GPU batch size (effective batch_size = this value * gradient_accumulation_steps * number of GPUs)<br \/>\nper_device_train_batch_size: 1<br \/>\n# Gradient accumulation steps (used to simulate a larger batch size)<br \/>\ngradient_accumulation_steps: 16<br \/>\n# Initial learning rate (a typical value for 7B-scale models)<br \/>\nlearning_rate: 1.0e-5<br \/>\n# Total number of training epochs<br \/>\nnum_train_epochs: 1.0<br \/>\n# Learning-rate schedule (cosine annealing)<br \/>\nlr_scheduler_type: cosine<br \/>\n# Warmup ratio (the first 10% of steps use linear warmup)<br \/>\nwarmup_ratio: 0.1<br \/>\n# Enable BF16 mixed-precision training (requires an Ampere-or-newer GPU)<br \/>\nbf16: true<br \/>\n# Distributed-training timeout, in seconds (set very large to avoid timeouts during long preprocessing)<br \/>\nddp_timeout: 180000000<\/p>\n<p>### Evaluation configuration<br \/>\n# Fraction of the training set held out for validation<br \/>\nval_size: 0.1<br \/>\n# Per-GPU batch size during evaluation<br \/>\nper_device_eval_batch_size: 1<br \/>\n# Evaluation strategy (evaluate at fixed step intervals)<br \/>\neval_strategy: steps<br \/>\n# Run evaluation every this many training steps<br \/>\neval_steps: 500<\/p>\n<p>The DeepSpeed configuration:<\/p>\n<p>{<br \/>\n  \/\/ Global train batch size (computed automatically as micro_batch * gpu_num * gradient_accumulation)<br \/>\n  &#034;train_batch_size&#034;: &#034;auto&#034;,<\/p>\n<p>  \/\/ Per-GPU micro batch size (tuned automatically to available VRAM)<br \/>\n  &#034;train_micro_batch_size_per_gpu&#034;: &#034;auto&#034;,<\/p>\n<p>  \/\/ Gradient accumulation steps (matched automatically to the micro-batch setting)<br \/>\n  &#034;gradient_accumulation_steps&#034;: &#034;auto&#034;,<\/p>\n<p>  \/\/ Gradient-clipping threshold (auto: disabled or the default of 1.0)<br \/>\n  <span class=\"token 
string\">&#034;gradient_clipping&#034;<\/span>: <span class=\"token string\">&#034;auto&#034;<\/span><span class=\"token punctuation\">,<\/span><\/p>\n<p>  <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u5141\u8bb8\u672a\u7ecf\u5b98\u65b9\u6d4b\u8bd5\u7684\u4f18\u5316\u5668&#xff08;\u9700\u8c28\u614e\u5f00\u542f&#xff09;<br \/>\n  <span class=\"token string\">&#034;zero_allow_untested_optimizer&#034;<\/span>: true<span class=\"token punctuation\">,<\/span><\/p>\n<p>  <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> FP16\u6df7\u5408\u7cbe\u5ea6\u914d\u7f6e<br \/>\n  <span class=\"token string\">&#034;fp16&#034;<\/span>: <span class=\"token punctuation\">{<\/span><br \/>\n    <span class=\"token string\">&#034;enabled&#034;<\/span>: <span class=\"token string\">&#034;auto&#034;<\/span><span class=\"token punctuation\">,<\/span>        <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u81ea\u52a8\u6839\u636e\u786c\u4ef6\u517c\u5bb9\u6027\u542f\u7528<br \/>\n    <span class=\"token string\">&#034;loss_scale&#034;<\/span>: 0<span class=\"token punctuation\">,<\/span>          <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u52a8\u6001\u635f\u5931\u7f29\u653e&#xff08;0\u8868\u793a\u81ea\u52a8\u8c03\u6574&#xff09;<br \/>\n    <span class=\"token string\">&#034;loss_scale_window&#034;<\/span>: 1000<span class=\"token punctuation\">,<\/span><span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u7f29\u653e\u8c03\u6574\u7a97\u53e3\u5927\u5c0f&#xff08;1000\u6b21\u8fed\u4ee3&#xff09;<br \/>\n    <span class=\"token string\">&#034;initial_scale_power&#034;<\/span>: 16<span class=\"token punctuation\">,<\/span><span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u521d\u59cb\u7f29\u653e\u6bd4\u4f8b2^16<br \/>\n    <span class=\"token string\">&#034;hysteresis&#034;<\/span>: 2<span 
class=\"token punctuation\">,<\/span>          <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u7f29\u653e\u5bb9\u5dee&#xff08;\u9632\u6b62\u9891\u7e41\u8c03\u6574&#xff09;<br \/>\n    <span class=\"token string\">&#034;min_loss_scale&#034;<\/span>: 1       <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u6700\u5c0f\u7f29\u653e\u6bd4\u4f8b<br \/>\n  <span class=\"token punctuation\">}<\/span><span class=\"token punctuation\">,<\/span><\/p>\n<p>  <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> BF16\u6df7\u5408\u7cbe\u5ea6\u914d\u7f6e&#xff08;\u4e0eFP16\u4e8c\u9009\u4e00&#xff09;<br \/>\n  <span class=\"token string\">&#034;bf16&#034;<\/span>: <span class=\"token punctuation\">{<\/span><br \/>\n    <span class=\"token string\">&#034;enabled&#034;<\/span>: <span class=\"token string\">&#034;auto&#034;<\/span>         <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u5728\u652f\u6301BF16\u7684GPU\u4e0a\u81ea\u52a8\u542f\u7528<br \/>\n  <span class=\"token punctuation\">}<\/span><span class=\"token punctuation\">,<\/span><\/p>\n<p>  <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> ZeRO\u4f18\u5316\u7b56\u7565&#xff08;Stage3\u5b8c\u6574\u914d\u7f6e&#xff09;<br \/>\n  <span class=\"token string\">&#034;zero_optimization&#034;<\/span>: <span class=\"token punctuation\">{<\/span><br \/>\n    <span class=\"token string\">&#034;stage&#034;<\/span>: 3<span class=\"token punctuation\">,<\/span>               <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u6700\u9ad8\u4f18\u5316\u7b49\u7ea7&#xff08;\u53c2\u6570<span class=\"token operator\">\/<\/span>\u68af\u5ea6<span class=\"token operator\">\/<\/span>\u4f18\u5316\u5668\u72b6\u6001\u5206\u7247&#xff09;<\/p>\n<p>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> 
\u4f18\u5316\u5668\u72b6\u6001\u5378\u8f7d\u5230CPU<br \/>\n    <span class=\"token string\">&#034;offload_optimizer&#034;<\/span>: <span class=\"token punctuation\">{<\/span><br \/>\n      <span class=\"token string\">&#034;device&#034;<\/span>: <span class=\"token string\">&#034;cpu&#034;<\/span><span class=\"token punctuation\">,<\/span>        <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u5378\u8f7d\u5230CPU\u5185\u5b58<br \/>\n      <span class=\"token string\">&#034;pin_memory&#034;<\/span>: true      <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u4f7f\u7528\u9501\u9875\u5185\u5b58\u52a0\u901f\u4f20\u8f93<br \/>\n    <span class=\"token punctuation\">}<\/span><span class=\"token punctuation\">,<\/span><\/p>\n<p>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u6a21\u578b\u53c2\u6570\u5378\u8f7d\u5230CPU<br \/>\n    <span class=\"token string\">&#034;offload_param&#034;<\/span>: <span class=\"token punctuation\">{<\/span><br \/>\n      <span class=\"token string\">&#034;device&#034;<\/span>: <span class=\"token string\">&#034;cpu&#034;<\/span><span class=\"token punctuation\">,<\/span>        <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u53c2\u6570\u5b58\u50a8\u5230CPU\u5185\u5b58<br \/>\n      <span class=\"token string\">&#034;pin_memory&#034;<\/span>: true      <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u4f7f\u7528DMA\u52a0\u901f\u6570\u636e\u4f20\u8f93<br \/>\n    <span class=\"token punctuation\">}<\/span><span class=\"token punctuation\">,<\/span><\/p>\n<p>    <span class=\"token string\">&#034;overlap_comm&#034;<\/span>: false<span class=\"token punctuation\">,<\/span>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u7981\u7528\u901a\u4fe1\u8ba1\u7b97\u91cd\u53e0&#xff08;\u63d0\u5347\u7a33\u5b9a\u6027&#xff09;<br \/>\n    
<span class=\"token string\">&#034;contiguous_gradients&#034;<\/span>: true<span class=\"token punctuation\">,<\/span> <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u4fdd\u6301\u68af\u5ea6\u5185\u5b58\u8fde\u7eed&#xff08;\u4f18\u5316\u663e\u5b58&#xff09;<\/p>\n<p>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u53c2\u6570\u5206\u7ec4\u914d\u7f6e<br \/>\n    <span class=\"token string\">&#034;sub_group_size&#034;<\/span>: 1e9<span class=\"token punctuation\">,<\/span>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u5355\u53c2\u6570\u7ec4\u6700\u5927\u5c3a\u5bf8&#xff08;\u9ed8\u8ba41B\u9632\u6b62\u5206\u7ec4&#xff09;<\/p>\n<p>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u901a\u4fe1\u7f13\u51b2\u533a\u81ea\u52a8\u8c03\u6574<br \/>\n    <span class=\"token string\">&#034;reduce_bucket_size&#034;<\/span>: <span class=\"token string\">&#034;auto&#034;<\/span><span class=\"token punctuation\">,<\/span>      <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> AllReduce\u7f13\u51b2\u533a\u5927\u5c0f<br \/>\n    <span class=\"token string\">&#034;stage3_prefetch_bucket_size&#034;<\/span>: <span class=\"token string\">&#034;auto&#034;<\/span><span class=\"token punctuation\">,<\/span> <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u53c2\u6570\u9884\u53d6\u7f13\u51b2\u533a<\/p>\n<p>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u53c2\u6570\u6301\u4e45\u5316\u9608\u503c<br \/>\n    <span class=\"token string\">&#034;stage3_param_persistence_threshold&#034;<\/span>: <span class=\"token string\">&#034;auto&#034;<\/span><span class=\"token punctuation\">,<\/span> <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u53c2\u6570\u9a7b\u7559GPU\u7684\u9608\u503c<\/p>\n<p>    <span 
class=\"token string\">&#034;stage3_max_live_parameters&#034;<\/span>: 1e9<span class=\"token punctuation\">,<\/span> <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u6700\u5927\u9a7b\u7559\u53c2\u6570\u6570\u91cf<br \/>\n    <span class=\"token string\">&#034;stage3_max_reuse_distance&#034;<\/span>: 1e9<span class=\"token punctuation\">,<\/span> <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u53c2\u6570\u91cd\u7528\u8ddd\u79bb\u9608\u503c<\/p>\n<p>    <span class=\"token operator\">\/<\/span><span class=\"token operator\">\/<\/span> \u6a21\u578b\u4fdd\u5b58\u65f6\u6536\u96c616\u4f4d\u6743\u91cd<br \/>\n    <span class=\"token string\">&#034;stage3_gather_16bit_weights_on_model_save&#034;<\/span>: true<br \/>\n  <span class=\"token punctuation\">}<\/span><br \/>\n<span class=\"token punctuation\">}<\/span><\/p>\n<p>\u5f00\u59cb\u8bad\u7ec3&#xff1a; \u5207\u6362\u5230qwen2.5-7b-full-sft.yaml\u6240\u5728\u7684\u8def\u5f84&#xff0c;\u6267\u884c\u4e0b\u9762\u7684\u547d\u4ee4\u3002<\/p>\n<p><span class=\"token comment\"># \u5f3a\u5236\u4f7f\u7528torchrun\u8fdb\u884c\u5206\u5e03\u5f0f\u8bad\u7ec3\u521d\u59cb\u5316&#xff08;\u9002\u7528\u4e8e\u591aGPU\/TPU\u73af\u5883&#xff09;<\/span><br \/>\n<span class=\"token comment\"># \u73af\u5883\u53d8\u91cf\u8bf4\u660e&#xff1a;<\/span><br \/>\n<span class=\"token comment\">#   &#8211; FORCE_TORCHRUN&#061;1 : \u5f3a\u5236\u4f7f\u7528PyTorch\u7684torchrun\u547d\u4ee4\u6765\u542f\u52a8\u5206\u5e03\u5f0f\u8bad\u7ec3<\/span><br \/>\n<span class=\"token comment\">#                         &#xff08;\u5f53\u81ea\u52a8\u68c0\u6d4b\u5931\u8d25\u6216\u9700\u8981\u663e\u5f0f\u63a7\u5236\u5206\u5e03\u5f0f\u8bad\u7ec3\u65f6\u4f7f\u7528&#xff09;<\/span><br \/>\n<span class=\"token comment\">#                         &#xff08;\u9700\u786e\u4fdd\u5df2\u6b63\u786e\u5b89\u88c5torch&gt;&#061;1.8.0&#xff09;<\/span><\/p>\n<p><span class=\"token comment\"># \u6267\u884cLLaMA 
Factory\u8bad\u7ec3\u6d41\u7a0b<\/span><br \/>\n<span class=\"token comment\"># \u547d\u4ee4\u7ed3\u6784&#xff1a;<\/span><br \/>\n<span class=\"token comment\">#   llamafactory-cli : \u4e3b\u7a0b\u5e8f\u5165\u53e3&#xff08;\u57fa\u4e8ePython Fire\u7684CLI\u5de5\u5177&#xff09;<\/span><br \/>\n<span class=\"token comment\">#   train            : \u5b50\u547d\u4ee4&#xff0c;\u6307\u5b9a\u6267\u884c\u8bad\u7ec3\u4efb\u52a1<\/span><br \/>\n<span class=\"token comment\">#   qwen2.5-7b-full-sft.yaml : \u8bad\u7ec3\u914d\u7f6e\u6587\u4ef6\u8def\u5f84&#xff08;\u5305\u542b\u6a21\u578b\/\u6570\u636e\/\u8bad\u7ec3\u53c2\u6570&#xff09;<\/span><br \/>\nFORCE_TORCHRUN&#061;1 llamafactory-<span class=\"token function\">cli<\/span> train qwen2<span class=\"token punctuation\">.<\/span>5-7b-full-sft<span class=\"token punctuation\">.<\/span>yaml<\/p>\n<p>\u8bad\u7ec3\u7ed3\u679c&#xff1a;<\/p>\n<p><span class=\"token namespace\">[INFO|trainer.py:2519]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>373 &gt;&gt; <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> Running training <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2520]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>373 &gt;&gt;   Num examples &#061; 54<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2521]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>373 &gt;&gt;   Num Epochs &#061; 1<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2522]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>373 &gt;&gt;   Instantaneous batch size per device &#061; 1<br \/>\n<span class=\"token 
namespace\">[INFO|trainer.py:2525]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>373 &gt;&gt;   Total train batch size <span class=\"token punctuation\">(<\/span>w<span class=\"token punctuation\">.<\/span> <span class=\"token keyword\">parallel<\/span><span class=\"token punctuation\">,<\/span> distributed &amp; accumulation<span class=\"token punctuation\">)<\/span> &#061; 32<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2526]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>373 &gt;&gt;   Gradient Accumulation steps &#061; 16<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2527]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>373 &gt;&gt;   Total optimization steps &#061; 2<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2528]<\/span> 2025-11-15 00:36:01<span class=\"token punctuation\">,<\/span>374 &gt;&gt;   Number of trainable parameters &#061; 1<span class=\"token punctuation\">,<\/span>543<span class=\"token punctuation\">,<\/span>714<span class=\"token punctuation\">,<\/span>304<br \/>\n100%<span class=\"token 
punctuation\">|<\/span>\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588<span class=\"token punctuation\">|<\/span> 2\/2 <span class=\"token punctuation\">[<\/span>00:24&lt;00:00<span class=\"token punctuation\">,<\/span> 11<span class=\"token punctuation\">.<\/span>65s\/it<span class=\"token punctuation\">]<\/span><span class=\"token namespace\">[INFO|trainer.py:4309]<\/span> 2025-11-15 00:36:28<span class=\"token punctuation\">,<\/span>898 &gt;&gt; Saving model checkpoint to saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:491]<\/span> 2025-11-15 00:36:28<span class=\"token punctuation\">,<\/span>901 &gt;&gt; Configuration saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:757]<\/span> 2025-11-15 00:36:28<span class=\"token punctuation\">,<\/span>902 &gt;&gt; Configuration 
saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/generation_config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|modeling_utils.py:4181]<\/span> 2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>246 &gt;&gt; Model weights saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/model<span class=\"token punctuation\">.<\/span>safetensors<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2421]<\/span> 2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>247 &gt;&gt; chat template saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/chat_template<span class=\"token punctuation\">.<\/span>jinja<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2590]<\/span> 2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>248 &gt;&gt; tokenizer config file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/tokenizer_config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2599]<\/span> 2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>248 &gt;&gt; Special tokens file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/special_tokens_map<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>422<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[logging.py:107:log_dist]<\/span> <span class=\"token namespace\">[Rank 0]<\/span> <span class=\"token namespace\">[Torch]<\/span> Checkpoint global_step2 is about to be saved!<br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:33<span class=\"token 
punctuation\">,<\/span>428<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[logging.py:107:log_dist]<\/span> <span class=\"token namespace\">[Rank 0]<\/span> Saving model checkpoint: saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/global_step2\/zero_pp_rank_0_mp_rank_00_model_states<span class=\"token punctuation\">.<\/span>pt<br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>428<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[torch_checkpoint_engine.py:21:save]<\/span> <span class=\"token namespace\">[Torch]<\/span> Saving saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/global_step2\/zero_pp_rank_0_mp_rank_00_model_states<span class=\"token punctuation\">.<\/span>pt<span class=\"token punctuation\">.<\/span><span class=\"token punctuation\">.<\/span><span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>438<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[torch_checkpoint_engine.py:23:save]<\/span> <span class=\"token namespace\">[Torch]<\/span> Saved saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/global_step2\/zero_pp_rank_0_mp_rank_00_model_states<span class=\"token punctuation\">.<\/span>pt<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:33<span class=\"token punctuation\">,<\/span>439<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[torch_checkpoint_engine.py:21:save]<\/span> <span class=\"token namespace\">[Torch]<\/span> Saving saves\/qwen2<span 
class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/global_step2\/bf16_zero_pp_rank_0_mp_rank_00_optim_states<span class=\"token punctuation\">.<\/span>pt<span class=\"token punctuation\">.<\/span><span class=\"token punctuation\">.<\/span><span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:47<span class=\"token punctuation\">,<\/span>668<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[torch_checkpoint_engine.py:23:save]<\/span> <span class=\"token namespace\">[Torch]<\/span> Saved saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/global_step2\/bf16_zero_pp_rank_0_mp_rank_00_optim_states<span class=\"token punctuation\">.<\/span>pt<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:47<span class=\"token punctuation\">,<\/span>669<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[engine.py:3701:_save_zero_checkpoint]<\/span> zero checkpoint saved saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/checkpoint-2\/global_step2\/bf16_zero_pp_rank_0_mp_rank_00_optim_states<span class=\"token punctuation\">.<\/span>pt<br \/>\n<span class=\"token punctuation\">[<\/span>2025-11-15 00:36:47<span class=\"token punctuation\">,<\/span>673<span class=\"token punctuation\">]<\/span> <span class=\"token namespace\">[INFO]<\/span> <span class=\"token namespace\">[torch_checkpoint_engine.py:33:commit]<\/span> <span class=\"token namespace\">[Torch]<\/span> Checkpoint global_step2 is ready now!<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2810]<\/span> 2025-11-15 00:36:47<span class=\"token punctuation\">,<\/span>675 &gt;&gt; <\/p>\n<p>Training completed<span class=\"token punctuation\">.<\/span> <span class=\"token keyword\">Do<\/span> not forget to 
share your model on huggingface<span class=\"token punctuation\">.<\/span>co\/models &#061;<span class=\"token punctuation\">)<\/span><\/p>\n<p><span class=\"token punctuation\">{<\/span><span class=\"token string\">&#039;train_runtime&#039;<\/span>: 46<span class=\"token punctuation\">.<\/span>3009<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;train_samples_per_second&#039;<\/span>: 1<span class=\"token punctuation\">.<\/span>166<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;train_steps_per_second&#039;<\/span>: 0<span class=\"token punctuation\">.<\/span>043<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;train_loss&#039;<\/span>: 3<span class=\"token punctuation\">.<\/span>873927593231201<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;epoch&#039;<\/span>: 1<span class=\"token punctuation\">.<\/span>0<span class=\"token punctuation\">}<\/span><br \/>\n100%<span class=\"token 
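punctuation\">|<\/span><\/p>\n<p>The throughput numbers in the summary dict above are simply counts divided by wall-clock time; a quick sanity check with the values copied from the log:<\/p>

```python
# Sanity-check the reported rates against the raw figures in the log.
train_runtime = 46.3009   # seconds, from 'train_runtime'
num_examples = 54         # from "Num examples"
optimization_steps = 2    # from "Total optimization steps"

samples_per_second = round(num_examples / train_runtime, 3)
steps_per_second = round(optimization_steps / train_runtime, 3)

print(samples_per_second)  # 1.166, matches 'train_samples_per_second'
print(steps_per_second)    # 0.043, matches 'train_steps_per_second'
```

<p>100%<span class=\"token 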
punctuation\">|<\/span>\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588<span class=\"token punctuation\">|<\/span> 2\/2 <span class=\"token punctuation\">[<\/span>00:46&lt;00:00<span class=\"token punctuation\">,<\/span> 23<span class=\"token punctuation\">.<\/span>15s\/it<span class=\"token punctuation\">]<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4309]<\/span> 2025-11-15 00:36:49<span class=\"token punctuation\">,<\/span>886 &gt;&gt; Saving model checkpoint to saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:491]<\/span> 2025-11-15 00:36:49<span class=\"token punctuation\">,<\/span>889 &gt;&gt; Configuration saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:757]<\/span> 2025-11-15 00:36:49<span class=\"token punctuation\">,<\/span>889 &gt;&gt; Configuration saved in 
saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/generation_config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|modeling_utils.py:4181]<\/span> 2025-11-15 00:36:52<span class=\"token punctuation\">,<\/span>910 &gt;&gt; Model weights saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/model<span class=\"token punctuation\">.<\/span>safetensors<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2421]<\/span> 2025-11-15 00:36:52<span class=\"token punctuation\">,<\/span>910 &gt;&gt; chat template saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/chat_template<span class=\"token punctuation\">.<\/span>jinja<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2590]<\/span> 2025-11-15 00:36:52<span class=\"token punctuation\">,<\/span>911 &gt;&gt; tokenizer config file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/tokenizer_config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2599]<\/span> 2025-11-15 00:36:52<span class=\"token punctuation\">,<\/span>911 &gt;&gt; Special tokens file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/full\/special_tokens_map<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> train metrics <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n  epoch                    &#061;        1<span class=\"token punctuation\">.<\/span>0<br \/>\n  total_flos               &#061;        4GF<br \/>\n  train_loss           
    &#061;     3<span class=\"token punctuation\">.<\/span>8739<br \/>\n  train_runtime            &#061; 0:00:46<span class=\"token punctuation\">.<\/span>30<br \/>\n  train_samples_per_second &#061;      1<span class=\"token punctuation\">.<\/span>166<br \/>\n  train_steps_per_second   &#061;      0<span class=\"token punctuation\">.<\/span>043<br \/>\n<span class=\"token namespace\">[WARNING|2025-11-15 00:36:53]<\/span> llamafactory<span class=\"token punctuation\">.<\/span>extras<span class=\"token punctuation\">.<\/span>ploting:148 &gt;&gt; No metric loss to plot<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token namespace\">[WARNING|2025-11-15 00:36:53]<\/span> llamafactory<span class=\"token punctuation\">.<\/span>extras<span class=\"token punctuation\">.<\/span>ploting:148 &gt;&gt; No metric eval_loss to plot<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token namespace\">[WARNING|2025-11-15 00:36:53]<\/span> llamafactory<span class=\"token punctuation\">.<\/span>extras<span class=\"token punctuation\">.<\/span>ploting:148 &gt;&gt; No metric eval_accuracy to plot<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4643]<\/span> 2025-11-15 00:36:53<span class=\"token punctuation\">,<\/span>090 &gt;&gt;<br \/>\n<span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> Running Evaluation <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4645]<\/span> 2025-11-15 00:36:53<span class=\"token punctuation\">,<\/span>090 &gt;&gt;   Num examples &#061; 7<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4648]<\/span> 2025-11-15 
00:36:53<span class=\"token punctuation\">,<\/span>090 &gt;&gt;   Batch size &#061; 1<br \/>\n100%<span class=\"token punctuation\">|<\/span>\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588<span class=\"token punctuation\">|<\/span> 4\/4 <span class=\"token punctuation\">[<\/span>00:00&lt;00:00<span class=\"token punctuation\">,<\/span>  9<span class=\"token punctuation\">.<\/span>30it\/s<span class=\"token punctuation\">]<\/span><br \/>\n<span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> eval metrics <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n  epoch                   &#061;        1<span class=\"token punctuation\">.<\/span>0<br \/>\n  eval_loss               &#061;     3<span class=\"token punctuation\">.<\/span>5243<br 
\/>\n  eval_runtime            &#061; 0:00:00<span class=\"token punctuation\">.<\/span>71<br \/>\n  eval_samples_per_second &#061;      9<span class=\"token punctuation\">.<\/span>774<br \/>\n  eval_steps_per_second   &#061;      5<span class=\"token punctuation\">.<\/span>585<br \/>\n<span class=\"token namespace\">[INFO|modelcard.py:456]<\/span> 2025-11-15 00:36:53<span class=\"token punctuation\">,<\/span>806 &gt;&gt; Dropping the following result as it does not have all the necessary fields:<br \/>\n<span class=\"token punctuation\">{<\/span><span class=\"token string\">&#039;task&#039;<\/span>: <span class=\"token punctuation\">{<\/span><span class=\"token string\">&#039;name&#039;<\/span>: <span class=\"token string\">&#039;Causal Language Modeling&#039;<\/span><span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;type&#039;<\/span>: <span class=\"token string\">&#039;text-generation&#039;<\/span><span class=\"token punctuation\">}<\/span><span class=\"token punctuation\">}<\/span><\/p>\n<h3>3. LoRA fine-tuning<\/h3>\n<p>In the LLaMA-Factory folder, create a qwen2.5-7b-lora-sft.yaml file to hold the LoRA fine-tuning configuration.<\/p>\n<p><span class=\"token comment\">### Model configuration<\/span><br \/>\n<span class=\"token comment\"># Local path to the pretrained model, or a HuggingFace model ID<\/span><br \/>\nmodel_name_or_path: <span class=\"token operator\">\/<\/span>root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2<span class=\"token punctuation\">.<\/span>5-7B<br \/>\n<span class=\"token comment\"># Must be enabled to load models that ship custom code (e.g. Qwen\/ChatGLM)<\/span><br \/>\ntrust_remote_code: true  <\/p>\n<p><span class=\"token comment\">### Training method<\/span><br \/>\n<span class=\"token comment\"># 
Training stage: supervised fine-tuning (SFT)<\/span><br \/>\nstage: sft<br \/>\n<span class=\"token comment\"># Whether to run training<\/span><br \/>\ndo_train: true<br \/>\n<span class=\"token comment\"># Fine-tuning type: LoRA (low-rank adaptation)<\/span><br \/>\nfinetuning_type: lora<br \/>\n<span class=\"token comment\"># Target modules for LoRA (all applies LoRA to every linear layer)<\/span><br \/>\nlora_target: all<br \/>\n<span class=\"token comment\"># LoRA rank (dimension of the low-rank decomposition)<\/span><br \/>\nlora_rank: 16<br \/>\n<span class=\"token comment\"># LoRA alpha (scaling factor, often set equal to the rank)<\/span><br \/>\nlora_alpha: 16<br \/>\n<span class=\"token comment\"># Dropout rate on the LoRA layers (guards against overfitting)<\/span><br \/>\nlora_dropout: 0<span class=\"token punctuation\">.<\/span>05<\/p>\n<p><span class=\"token comment\">### Dataset configuration<\/span><br \/>\n<span class=\"token comment\"># Dataset name (corresponds to a dataset under the data directory)<\/span><br \/>\ndataset: alpaca_zh_demo<br \/>\n<span class=\"token comment\"># Prompt template (must match the model, e.g. qwen\/llama\/chatglm)<\/span><br \/>\ntemplate: qwen<br \/>\n<span class=\"token comment\"># Maximum input sequence length (in tokens)<\/span><br \/>\ncutoff_len: 1024<br \/>\n<span class=\"token comment\"># Whether to overwrite the existing preprocessing cache<\/span><br \/>\noverwrite_cache: true<br \/>\n<span class=\"token comment\"># 
Number of parallel workers for data preprocessing (a common rule of thumb is 50-70% of the CPU core count)<\/span><br \/>\npreprocessing_num_workers: 16<\/p>\n<p><span class=\"token comment\">### Output configuration<\/span><br \/>\n<span class=\"token comment\"># Output directory for checkpoints and logs<\/span><br \/>\noutput_dir: saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft<br \/>\n<span class=\"token comment\"># Log every 100 training steps<\/span><br \/>\nlogging_steps: 100<br \/>\n<span class=\"token comment\"># Save a checkpoint every 100 training steps<\/span><br \/>\nsave_steps: 100<br \/>\n<span class=\"token comment\"># Whether to plot the training-loss curve<\/span><br \/>\nplot_loss: true<br \/>\n<span class=\"token comment\"># Whether to overwrite an existing output directory (recommended for a fresh run)<\/span><br \/>\noverwrite_output_dir: true<\/p>\n<p><span class=\"token comment\">### Training parameters<\/span><br \/>\n<span class=\"token comment\"># Batch size per GPU (effective total batch size &#061; this value * gradient_accumulation_steps * number of GPUs)<\/span><br \/>\nper_device_train_batch_size: 1<br \/>\n<span class=\"token comment\"># Gradient accumulation steps (simulates a larger batch; here the effective total batch size &#061; 16 * number of GPUs)<\/span><br \/>\ngradient_accumulation_steps: 16<br \/>\n<span class=\"token comment\"># Initial learning rate (a typical range for LoRA fine-tuning is 1e-4 to 5e-4)<\/span><br \/>\nlearning_rate: 1<span class=\"token punctuation\">.<\/span>0e-4<br \/>\n<span class=\"token comment\"># 
Total number of training epochs<\/span><br \/>\nnum_train_epochs: 1<span class=\"token punctuation\">.<\/span>0<br \/>\n<span class=\"token comment\"># Learning-rate schedule (cosine annealing)<\/span><br \/>\nlr_scheduler_type: cosine<br \/>\n<span class=\"token comment\"># Warmup ratio (the first 10% of steps use linear warmup)<\/span><br \/>\nwarmup_ratio: 0<span class=\"token punctuation\">.<\/span>1<br \/>\n<span class=\"token comment\"># Enable BF16 mixed precision (requires an Ampere-or-newer GPU, e.g. A100\/3090)<\/span><br \/>\nbf16: true<br \/>\n<span class=\"token comment\"># Distributed-training timeout in seconds (this value is effectively unlimited)<\/span><br \/>\nddp_timeout: 180000000<\/p>\n<p><span class=\"token comment\">### Evaluation configuration<\/span><br \/>\n<span class=\"token comment\"># Validation split (hold out 10% of the training set for validation)<\/span><br \/>\nval_size: 0<span class=\"token punctuation\">.<\/span>1<br \/>\n<span class=\"token comment\"># Per-GPU batch size during evaluation<\/span><br \/>\nper_device_eval_batch_size: 1<br \/>\n<span class=\"token comment\"># Evaluation strategy: evaluate at fixed step intervals<\/span><br \/>\neval_strategy: steps<br \/>\n<span class=\"token comment\"># Run validation every 500 training steps<\/span><br \/>\neval_steps: 500<\/p>\n<p>Start training:<\/p>\n<p><span class=\"token comment\">#   llamafactory-cli : main program entry point<\/span><br \/>\n<span class=\"token comment\">#   train           : subcommand that runs the training task<\/span><br \/>\n<span 
class=\"token comment\">#   qwen2.5-7b-lora-sft.yaml : YAML\u683c\u5f0f\u7684\u914d\u7f6e\u6587\u4ef6\u8def\u5f84&#xff08;\u5305\u542b\u5b8c\u6574\u7684\u8bad\u7ec3\u53c2\u6570&#xff09;<\/span><br \/>\nllamafactory-<span class=\"token function\">cli<\/span> train qwen2<span class=\"token punctuation\">.<\/span>5-7b-lora-sft<span class=\"token punctuation\">.<\/span>yaml<\/p>\n<p>\u8bad\u7ec3\u7ed3\u679c\u4e3a&#xff1a;<\/p>\n<p><span class=\"token namespace\">[INFO|trainer.py:2519]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>504 &gt;&gt; <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> Running training <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2520]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>504 &gt;&gt;   Num examples &#061; 900<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2521]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>504 &gt;&gt;   Num Epochs &#061; 1<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2522]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>504 &gt;&gt;   Instantaneous batch size per device &#061; 1<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2525]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>504 &gt;&gt;   Total train batch size <span class=\"token punctuation\">(<\/span>w<span class=\"token punctuation\">.<\/span> <span class=\"token keyword\">parallel<\/span><span class=\"token punctuation\">,<\/span> distributed &amp; accumulation<span class=\"token punctuation\">)<\/span> &#061; 32<br \/>\n<span class=\"token 
namespace\">[INFO|trainer.py:2526]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>504 &gt;&gt;   Gradient Accumulation steps &#061; 16<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2527]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>504 &gt;&gt;   Total optimization steps &#061; 29<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2528]<\/span> 2025-11-15 00:39:43<span class=\"token punctuation\">,<\/span>507 &gt;&gt;   Number of trainable parameters &#061; 18<span class=\"token punctuation\">,<\/span>464<span class=\"token punctuation\">,<\/span>768<br \/>\n100%<span class=\"token punctuation\">|<\/span>\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588<span class=\"token punctuation\">|<\/span> 29\/29 <span class=\"token punctuation\">[<\/span>01:18&lt;00:00<span class=\"token punctuation\">,<\/span>  1<span class=\"token punctuation\">.<\/span>99s\/it<span class=\"token punctuation\">]<\/span><span class=\"token namespace\">[INFO|trainer.py:4309]<\/span> 2025-11-15 00:41:02<span 
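[INFO|trainer.py:2526]">
class=\"token punctuation\"><\/span><\/p>\n<p>The 18,464,768 trainable parameters reported above can be reproduced from the Qwen2Config printed in the log. A minimal sketch, assuming lora_target: all adapts every linear projection and that head_dim &#061; 128 (inferred as 1536 \/ 12; not stated explicitly in the config dump):<\/p>

```python
# Reproduce "Number of trainable parameters = 18,464,768" for the LoRA run.
# Shapes come from the Qwen2Config printed in the log (hidden_size 1536,
# intermediate_size 8960, 28 layers, 12 attention heads, 2 KV heads);
# head_dim = 128 is an assumption (1536 / 12).
rank = 16          # lora_rank from the YAML config
hidden = 1536
intermediate = 8960
layers = 28
heads, kv_heads, head_dim = 12, 2, 128

# (d_in, d_out) of every linear layer LoRA adapts with lora_target: all
linears = [
    (hidden, heads * head_dim),      # q_proj
    (hidden, kv_heads * head_dim),   # k_proj
    (hidden, kv_heads * head_dim),   # v_proj
    (heads * head_dim, hidden),      # o_proj
    (hidden, intermediate),          # gate_proj
    (hidden, intermediate),          # up_proj
    (intermediate, hidden),          # down_proj
]

# Each adapter adds an (rank x d_in) matrix A and a (d_out x rank) matrix B.
per_layer = sum(rank * (d_in + d_out) for d_in, d_out in linears)
print(layers * per_layer)  # 18464768, matching the trainer log
```

<p>That is roughly 1.2% of the 1,543,714,304 parameters updated in the full-SFT run above, which is the main memory saving LoRA buys (note these dimensions correspond to a ~1.5B-parameter checkpoint, consistent with the full-SFT count).<\/p>\n<p><span 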
class=\"token punctuation\">,<\/span>102 &gt;&gt; Saving model checkpoint to saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft\/checkpoint-29<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:763]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>121 &gt;&gt; loading configuration file <span class=\"token operator\">\/<\/span>root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2<span class=\"token punctuation\">.<\/span>5-7B\/config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:839]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>122 &gt;&gt; Model config Qwen2Config <span class=\"token punctuation\">{<\/span><br \/>\n  <span class=\"token string\">&#034;architectures&#034;<\/span>: <span class=\"token punctuation\">[<\/span><br \/>\n    <span class=\"token string\">&#034;Qwen2ForCausalLM&#034;<\/span><br \/>\n  <span class=\"token punctuation\">]<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;attention_dropout&#034;<\/span>: 0<span class=\"token punctuation\">.<\/span>0<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;bos_token_id&#034;<\/span>: 151643<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;dtype&#034;<\/span>: <span class=\"token string\">&#034;bfloat16&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;eos_token_id&#034;<\/span>: 151643<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;hidden_act&#034;<\/span>: <span class=\"token string\">&#034;silu&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;hidden_size&#034;<\/span>: 1536<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token 
string\">&#034;initializer_range&#034;<\/span>: 0<span class=\"token punctuation\">.<\/span>02<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;intermediate_size&#034;<\/span>: 8960<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;layer_types&#034;<\/span>: <span class=\"token punctuation\">[<\/span><br \/>\n    <span class=\"token string\">&#034;full_attention&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n    <span class=\"token string\">&#034;full_attention&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n    <span class=\"token punctuation\">.<\/span><span class=\"token punctuation\">.<\/span><span class=\"token punctuation\">.<\/span><br \/>\n  <span class=\"token punctuation\">]<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;max_position_embeddings&#034;<\/span>: 131072<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;max_window_layers&#034;<\/span>: 28<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;model_type&#034;<\/span>: <span class=\"token string\">&#034;qwen2&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;num_attention_heads&#034;<\/span>: 12<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;num_hidden_layers&#034;<\/span>: 28<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;num_key_value_heads&#034;<\/span>: 2<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;rms_norm_eps&#034;<\/span>: 1e-06<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;rope_scaling&#034;<\/span>: null<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;rope_theta&#034;<\/span>: 1000000<span class=\"token 
punctuation\">.<\/span>0<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;sliding_window&#034;<\/span>: null<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;tie_word_embeddings&#034;<\/span>: true<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;transformers_version&#034;<\/span>: <span class=\"token string\">&#034;4.57.1&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;use_cache&#034;<\/span>: true<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;use_mrope&#034;<\/span>: false<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;use_sliding_window&#034;<\/span>: false<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;vocab_size&#034;<\/span>: 151936<br \/>\n<span class=\"token punctuation\">}<\/span><\/p>\n<p><span class=\"token namespace\">[INFO|tokenization_utils_base.py:2421]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>220 &gt;&gt; chat template saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft\/checkpoint-29\/chat_template<span class=\"token punctuation\">.<\/span>jinja<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2590]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>220 &gt;&gt; tokenizer config file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft\/checkpoint-29\/tokenizer_config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2599]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>221 &gt;&gt; Special tokens file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft\/checkpoint-29\/special_tokens_map<span class=\"token 
punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2810]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>515 &gt;&gt; <\/p>\n<p>Training completed<span class=\"token punctuation\">.<\/span> <span class=\"token keyword\">Do<\/span> not forget to share your model on huggingface<span class=\"token punctuation\">.<\/span>co\/models &#061;<span class=\"token punctuation\">)<\/span><\/p>\n<p><span class=\"token punctuation\">{<\/span><span class=\"token string\">&#039;train_runtime&#039;<\/span>: 79<span class=\"token punctuation\">.<\/span>008<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;train_samples_per_second&#039;<\/span>: 11<span class=\"token punctuation\">.<\/span>391<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;train_steps_per_second&#039;<\/span>: 0<span class=\"token punctuation\">.<\/span>367<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;train_loss&#039;<\/span>: 1<span class=\"token punctuation\">.<\/span>657024120462352<span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;epoch&#039;<\/span>: 1<span class=\"token punctuation\">.<\/span>0<span class=\"token punctuation\">}<\/span><br \/>\n100%<span class=\"token 
punctuation\">|<\/span>\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588<span class=\"token punctuation\">|<\/span> 29\/29 <span class=\"token punctuation\">[<\/span>01:18&lt;00:00<span class=\"token punctuation\">,<\/span>  2<span class=\"token punctuation\">.<\/span>72s\/it<span class=\"token punctuation\">]<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4309]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>518 &gt;&gt; Saving model checkpoint to saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:763]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>537 &gt;&gt; loading configuration file <span class=\"token operator\">\/<\/span>root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2<span class=\"token punctuation\">.<\/span>5-7B\/config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:839]<\/span> 2025-11-15 00:41:02<span class=\"token 
punctuation\">,<\/span>538 &gt;&gt; Model config Qwen2Config <span class=\"token punctuation\">{<\/span><br \/>\n  <span class=\"token string\">&#034;architectures&#034;<\/span>: <span class=\"token punctuation\">[<\/span><br \/>\n    <span class=\"token string\">&#034;Qwen2ForCausalLM&#034;<\/span><br \/>\n  <span class=\"token punctuation\">]<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;attention_dropout&#034;<\/span>: 0<span class=\"token punctuation\">.<\/span>0<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;bos_token_id&#034;<\/span>: 151643<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;dtype&#034;<\/span>: <span class=\"token string\">&#034;bfloat16&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;eos_token_id&#034;<\/span>: 151643<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;hidden_act&#034;<\/span>: <span class=\"token string\">&#034;silu&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;hidden_size&#034;<\/span>: 1536<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;initializer_range&#034;<\/span>: 0<span class=\"token punctuation\">.<\/span>02<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;intermediate_size&#034;<\/span>: 8960<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;layer_types&#034;<\/span>: <span class=\"token punctuation\">[<\/span><br \/>\n    <span class=\"token string\">&#034;full_attention&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n    <span class=\"token string\">&#034;full_attention&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n    <span class=\"token punctuation\">.<\/span><span class=\"token 
punctuation\">.<\/span><span class=\"token punctuation\">.<\/span><br \/>\n  <span class=\"token punctuation\">]<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;max_position_embeddings&#034;<\/span>: 131072<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;max_window_layers&#034;<\/span>: 28<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;model_type&#034;<\/span>: <span class=\"token string\">&#034;qwen2&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;num_attention_heads&#034;<\/span>: 12<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;num_hidden_layers&#034;<\/span>: 28<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;num_key_value_heads&#034;<\/span>: 2<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;rms_norm_eps&#034;<\/span>: 1e-06<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;rope_scaling&#034;<\/span>: null<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;rope_theta&#034;<\/span>: 1000000<span class=\"token punctuation\">.<\/span>0<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;sliding_window&#034;<\/span>: null<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;tie_word_embeddings&#034;<\/span>: true<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;transformers_version&#034;<\/span>: <span class=\"token string\">&#034;4.57.1&#034;<\/span><span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;use_cache&#034;<\/span>: true<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;use_mrope&#034;<\/span>: 
false<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;use_sliding_window&#034;<\/span>: false<span class=\"token punctuation\">,<\/span><br \/>\n  <span class=\"token string\">&#034;vocab_size&#034;<\/span>: 151936<br \/>\n<span class=\"token punctuation\">}<\/span><\/p>\n<p><span class=\"token namespace\">[INFO|tokenization_utils_base.py:2421]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>611 &gt;&gt; chat template saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft\/chat_template<span class=\"token punctuation\">.<\/span>jinja<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2590]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>611 &gt;&gt; tokenizer config file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft\/tokenizer_config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|tokenization_utils_base.py:2599]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>611 &gt;&gt; Special tokens file saved in saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/lora\/sft\/special_tokens_map<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> train metrics <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n  epoch                    &#061;        1<span class=\"token punctuation\">.<\/span>0<br \/>\n  total_flos               &#061;  1054627GF<br \/>\n  train_loss               &#061;      1<span class=\"token punctuation\">.<\/span>657<br \/>\n  train_runtime            &#061; 
0:01:19<span class=\"token punctuation\">.<\/span>00<br \/>\n  train_samples_per_second &#061;     11<span class=\"token punctuation\">.<\/span>391<br \/>\n  train_steps_per_second   &#061;      0<span class=\"token punctuation\">.<\/span>367<br \/>\n<span class=\"token namespace\">[WARNING|2025-11-15 00:41:02]<\/span> llamafactory<span class=\"token punctuation\">.<\/span>extras<span class=\"token punctuation\">.<\/span>ploting:148 &gt;&gt; No metric loss to plot<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token namespace\">[WARNING|2025-11-15 00:41:02]<\/span> llamafactory<span class=\"token punctuation\">.<\/span>extras<span class=\"token punctuation\">.<\/span>ploting:148 &gt;&gt; No metric eval_loss to plot<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token namespace\">[WARNING|2025-11-15 00:41:02]<\/span> llamafactory<span class=\"token punctuation\">.<\/span>extras<span class=\"token punctuation\">.<\/span>ploting:148 &gt;&gt; No metric eval_accuracy to plot<span class=\"token punctuation\">.<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4643]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>752 &gt;&gt;<br \/>\n<span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> Running Evaluation <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4645]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>752 &gt;&gt;   Num examples &#061; 100<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:4648]<\/span> 2025-11-15 00:41:02<span class=\"token punctuation\">,<\/span>752 &gt;&gt;   Batch size &#061; 1<br \/>\n100%<span 
class=\"token punctuation\">|<\/span>\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588<span class=\"token punctuation\">|<\/span> 50\/50 <span class=\"token punctuation\">[<\/span>00:01&lt;00:00<span class=\"token punctuation\">,<\/span> 31<span class=\"token punctuation\">.<\/span>55it\/s<span class=\"token punctuation\">]<\/span><br \/>\n<span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> eval metrics <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n  epoch                   &#061;        1<span class=\"token punctuation\">.<\/span>0<br \/>\n  eval_loss               &#061;     1<span class=\"token punctuation\">.<\/span>6728<br \/>\n  eval_runtime            &#061; 0:00:01<span class=\"token punctuation\">.<\/span>62<br \/>\n  
eval_samples_per_second &#061;     61<span class=\"token punctuation\">.<\/span>354<br \/>\n  eval_steps_per_second   &#061;     30<span class=\"token punctuation\">.<\/span>677<br \/>\n<span class=\"token namespace\">[INFO|modelcard.py:456]<\/span> 2025-11-15 00:41:04<span class=\"token punctuation\">,<\/span>381 &gt;&gt; Dropping the following result as it does not have all the necessary fields:<br \/>\n<span class=\"token punctuation\">{<\/span><span class=\"token string\">&#039;task&#039;<\/span>: <span class=\"token punctuation\">{<\/span><span class=\"token string\">&#039;name&#039;<\/span>: <span class=\"token string\">&#039;Causal Language Modeling&#039;<\/span><span class=\"token punctuation\">,<\/span> <span class=\"token string\">&#039;type&#039;<\/span>: <span class=\"token string\">&#039;text-generation&#039;<\/span><span class=\"token punctuation\">}<\/span><span class=\"token punctuation\">}<\/span><\/p>\n<h3>4. QLoRA Fine-Tuning<\/h3>\n<p>In the LLaMA-Factory directory, create a configuration file named qwen2.5-7b-qlora-sft.yaml that holds the settings for QLoRA fine-tuning.<\/p>\n<p><span class=\"token comment\">### Model configuration<\/span><br \/>\n<span class=\"token comment\"># Local path or HuggingFace model ID of the pretrained model (make sure the path is correct)<\/span><br \/>\nmodel_name_or_path: <span class=\"token operator\">\/<\/span>root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2<span class=\"token punctuation\">.<\/span>5-7B<br \/>\n<span class=\"token comment\"># Must be enabled to load models that ship custom code (e.g. Qwen\/ChatGLM)<\/span><br \/>\ntrust_remote_code: true  <\/p>\n<p><span class=\"token comment\">### Training method<\/span><br \/>\n<span class=\"token comment\"># 
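A brief note on why this configuration saves memory (general QLoRA background, not from the original config): the base weights stay frozen in 4-bit precision while only the low-rank adapter matrices are trained.<\/span><br \/>\n<span class=\"token comment\"># 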
Training stage: supervised fine-tuning (SFT)<\/span><br \/>\nstage: sft<br \/>\n<span class=\"token comment\"># Whether to run training<\/span><br \/>\ndo_train: true<br \/>\n<span class=\"token comment\"># Fine-tuning type: lora (QLoRA &#061; LoRA plus the quantization settings below)<\/span><br \/>\nfinetuning_type: lora<br \/>\n<span class=\"token comment\"># Layers the LoRA adapters attach to (all &#061; every linear layer)<\/span><br \/>\nlora_target: all<br \/>\n<span class=\"token comment\"># Quantization bit width (4-bit quantization)<\/span><br \/>\nquantization_bit: 4<br \/>\n<span class=\"token comment\"># Quantization backend (implemented via the bitsandbytes library)<\/span><br \/>\nquantization_method: bitsandbytes<br \/>\n<span class=\"token comment\"># LoRA rank (dimension of the low-rank decomposition)<\/span><br \/>\nlora_rank: 16<br \/>\n<span class=\"token comment\"># LoRA alpha (scaling factor, usually equal to the rank)<\/span><br \/>\nlora_alpha: 16<br \/>\n<span class=\"token comment\"># Dropout on the LoRA layers (guards against overfitting)<\/span><br \/>\nlora_dropout: 0<span class=\"token punctuation\">.<\/span>05<\/p>\n<p><span class=\"token comment\">### Dataset configuration<\/span><br \/>\n<span class=\"token comment\"># Dataset name (must match a dataset folder under the data directory)<\/span><br \/>\ndataset: alpaca_zh_demo<br \/>\n<span class=\"token comment\"># Prompt template (must match the model architecture)<\/span><br \/>\ntemplate: qwen<br \/>\n<span class=\"token comment\"># 
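For reference, records in alpaca_zh_demo follow the alpaca schema; a hypothetical minimal example (field values are placeholders, not from the dataset):<\/span><br \/>\n<span class=\"token comment\"># {&#034;instruction&#034;: &#034;...&#034;, &#034;input&#034;: &#034;...&#034;, &#034;output&#034;: &#034;...&#034;}<\/span><br \/>\n<span class=\"token comment\"># 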
Maximum input sequence length (in tokens)<\/span><br \/>\ncutoff_len: 1024<br \/>\n<span class=\"token comment\"># Overwrite any existing preprocessing cache (enable after changing the dataset)<\/span><br \/>\noverwrite_cache: true<br \/>\n<span class=\"token comment\"># Number of parallel preprocessing workers (about 50-70% of CPU cores is a reasonable setting)<\/span><br \/>\npreprocessing_num_workers: 16<\/p>\n<p><span class=\"token comment\">### Output configuration<\/span><br \/>\n<span class=\"token comment\"># Output directory for the model and logs (where QLoRA checkpoints are saved)<\/span><br \/>\noutput_dir: saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/qlora\/sft<br \/>\n<span class=\"token comment\"># Log metrics every 100 training steps<\/span><br \/>\nlogging_steps: 100<br \/>\n<span class=\"token comment\"># Save a checkpoint every 100 training steps<\/span><br \/>\nsave_steps: 100<br \/>\n<span class=\"token comment\"># Plot the training loss curve (saved to output_dir\/loss.png)<\/span><br \/>\nplot_loss: true<br \/>\n<span class=\"token comment\"># Overwrite an existing output directory (recommended for a fresh run)<\/span><br \/>\noverwrite_output_dir: true<\/p>\n<p><span class=\"token comment\">### Training hyperparameters<\/span><br \/>\n<span class=\"token comment\"># Batch size per GPU (effective batch size &#061; this value * gradient_accumulation_steps * number of GPUs)<\/span><br \/>\nper_device_train_batch_size: 1<br \/>\n<span class=\"token comment\"># 
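Sanity check on the effective batch size: 1 (per device) * 16 (accumulation steps) * number of GPUs; the log below reports &#034;Total train batch size &#061; 32&#034;, which implies this run used 2 GPUs.<\/span><br \/>\n<span class=\"token comment\"># 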
Gradient accumulation steps (simulates a larger batch; here the effective batch size is 16 * number of GPUs)<\/span><br \/>\ngradient_accumulation_steps: 16<br \/>\n<span class=\"token comment\"># Initial learning rate (typical QLoRA range: 1e-4 to 5e-4)<\/span><br \/>\nlearning_rate: 1<span class=\"token punctuation\">.<\/span>0e-4<br \/>\n<span class=\"token comment\"># Total number of training epochs<\/span><br \/>\nnum_train_epochs: 1<span class=\"token punctuation\">.<\/span>0<br \/>\n<span class=\"token comment\"># Learning-rate schedule (cosine annealing)<\/span><br \/>\nlr_scheduler_type: cosine<br \/>\n<span class=\"token comment\"># Warmup ratio (the first 10% of steps use linear warmup)<\/span><br \/>\nwarmup_ratio: 0<span class=\"token punctuation\">.<\/span>1<br \/>\n<span class=\"token comment\"># Enable BF16 mixed precision (requires an Ampere-or-newer GPU, e.g. A100\/3090)<\/span><br \/>\nbf16: true<br \/>\n<span class=\"token comment\"># Distributed-training timeout (in milliseconds; about 50 hours here)<\/span><br \/>\nddp_timeout: 180000000<\/p>\n<p><span class=\"token comment\">### Evaluation configuration<\/span><br \/>\n<span class=\"token comment\"># Validation split ratio (10% of the training set is held out for validation)<\/span><br \/>\nval_size: 0<span class=\"token punctuation\">.<\/span>1<br \/>\n<span class=\"token comment\"># Per-GPU batch size during evaluation<\/span><br \/>\nper_device_eval_batch_size: 1<br \/>\n<span class=\"token comment\"># 
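Worked example for val_size: with val_size 0.1 the 1000-example alpaca_zh_demo dataset splits into 900 training and 100 validation examples, matching the &#034;Num examples&#034; values reported in the training and evaluation logs.<\/span><br \/>\n<span class=\"token comment\"># 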
Evaluation strategy: evaluate at fixed training-step intervals<\/span><br \/>\neval_strategy: steps<br \/>\n<span class=\"token comment\"># Run validation every 500 training steps<\/span><br \/>\neval_steps: 500<\/p>\n<p>Launch QLoRA training:<\/p>\n<p><span class=\"token comment\">#   llamafactory-cli : main program entry point<\/span><br \/>\n<span class=\"token comment\">#   train           : subcommand that runs a training job<\/span><br \/>\n<span class=\"token comment\">#   qwen2.5-7b-qlora-sft.yaml : path to the YAML configuration file (contains the full set of training parameters)<\/span><br \/>\nllamafactory-<span class=\"token function\">cli<\/span> train qwen2<span class=\"token punctuation\">.<\/span>5-7b-qlora-sft<span class=\"token punctuation\">.<\/span>yaml<\/p>\n<p>The training output is shown below:<\/p>\n<p><span class=\"token namespace\">[INFO|trainer.py:2519]<\/span> 2025-11-15 00:43:46<span class=\"token punctuation\">,<\/span>249 &gt;&gt; <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span> Running training <span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><span class=\"token operator\">*<\/span><br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2520]<\/span> 2025-11-15 00:43:46<span class=\"token punctuation\">,<\/span>249 &gt;&gt;   Num examples &#061; 900<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2521]<\/span> 2025-11-15 00:43:46<span class=\"token punctuation\">,<\/span>249 &gt;&gt;   Num Epochs &#061; 1<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2522]<\/span> 2025-11-15 00:43:46<span 
class=\"token punctuation\">,<\/span>249 &gt;&gt;   Instantaneous batch size per device &#061; 1<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2525]<\/span> 2025-11-15 00:43:46<span class=\"token punctuation\">,<\/span>249 &gt;&gt;   Total train batch size <span class=\"token punctuation\">(<\/span>w<span class=\"token punctuation\">.<\/span> <span class=\"token keyword\">parallel<\/span><span class=\"token punctuation\">,<\/span> distributed &amp; accumulation<span class=\"token punctuation\">)<\/span> &#061; 32<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2526]<\/span> 2025-11-15 00:43:46<span class=\"token punctuation\">,<\/span>249 &gt;&gt;   Gradient Accumulation steps &#061; 16<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2527]<\/span> 2025-11-15 00:43:46<span class=\"token punctuation\">,<\/span>249 &gt;&gt;   Total optimization steps &#061; 29<br \/>\n<span class=\"token namespace\">[INFO|trainer.py:2528]<\/span> 2025-11-15 00:43:46<span class=\"token punctuation\">,<\/span>254 &gt;&gt;   Number of trainable parameters &#061; 18<span class=\"token punctuation\">,<\/span>464<span class=\"token punctuation\">,<\/span>768<br \/>\n100%<span class=\"token 
punctuation\">|<\/span>\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588<span class=\"token punctuation\">|<\/span> 29\/29 <span class=\"token punctuation\">[<\/span>01:20&lt;00:00<span class=\"token punctuation\">,<\/span>  1<span class=\"token punctuation\">.<\/span>98s\/it<span class=\"token punctuation\">]<\/span><span class=\"token namespace\">[INFO|trainer.py:4309]<\/span> 2025-11-15 00:45:06<span class=\"token punctuation\">,<\/span>653 &gt;&gt; Saving model checkpoint to saves\/qwen2<span class=\"token punctuation\">.<\/span>5-7b\/qlora\/sft\/checkpoint-29<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:763]<\/span> 2025-11-15 00:45:06<span class=\"token punctuation\">,<\/span>673 &gt;&gt; loading configuration file <span class=\"token operator\">\/<\/span>root\/autodl-tmp\/LLaMA-Factory\/models\/Qwen2<span class=\"token punctuation\">.<\/span>5-7B\/config<span class=\"token punctuation\">.<\/span>json<br \/>\n<span class=\"token namespace\">[INFO|configuration_utils.py:839]<\/span> 2025-11-15 00:45:06<span 
```
Model config Qwen2Config {
  "architectures": [
    "Qwen2ForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 151643,
  "dtype": "bfloat16",
  "eos_token_id": 151643,
  "hidden_act": "silu",
  "hidden_size": 1536,
  "initializer_range": 0.02,
  "intermediate_size": 8960,
  "layer_types": [
    "full_attention",
    "full_attention",
    ...
  ],
  "max_position_embeddings": 131072,
  "max_window_layers": 28,
  "model_type": "qwen2",
  "num_attention_heads": 12,
  "num_hidden_layers": 28,
  "num_key_value_heads": 2,
  "rms_norm_eps": 1e-06,
  "rope_scaling": null,
  "rope_theta": 1000000.0,
  "sliding_window": null,
  "tie_word_embeddings": true,
  "transformers_version": "4.57.1",
  "use_cache": true,
  "use_mrope": false,
  "use_sliding_window": false,
  "vocab_size": 151936
}

[INFO|tokenization_utils_base.py:2421] 2025-11-15 00:45:06,761 >> chat template saved in saves/qwen2.5-7b/qlora/sft/checkpoint-29/chat_template.jinja
[INFO|tokenization_utils_base.py:2590] 2025-11-15 00:45:06,761 >> tokenizer config file saved in saves/qwen2.5-7b/qlora/sft/checkpoint-29/tokenizer_config.json
[INFO|tokenization_utils_base.py:2599] 2025-11-15 00:45:06,761 >> Special tokens file saved in saves/qwen2.5-7b/qlora/sft/checkpoint-29/special_tokens_map.json
[INFO|trainer.py:2810] 2025-11-15 00:45:07,051 >>

Training completed. Do not forget to share your model on huggingface.co/models =)

{'train_runtime': 80.7972, 'train_samples_per_second': 11.139, 'train_steps_per_second': 0.359, 'train_loss': 1.6571868370319236, 'epoch': 1.0}
100%|██████████| 29/29 [01:20<00:00,  2.78s/it]
[INFO|trainer.py:4309] 2025-11-15 00:45:07,054 >> Saving model checkpoint to saves/qwen2.5-7b/qlora/sft
[INFO|configuration_utils.py:763] 2025-11-15 00:45:07,073 >> loading configuration file /root/autodl-tmp/LLaMA-Factory/models/Qwen2.5-7B/config.json
[INFO|configuration_utils.py:839] 2025-11-15 00:45:07,074 >> Model config Qwen2Config {
  ... (same Qwen2Config dump as above) ...
}

[INFO|tokenization_utils_base.py:2421] 2025-11-15 00:45:07,136 >> chat template saved in saves/qwen2.5-7b/qlora/sft/chat_template.jinja
[INFO|tokenization_utils_base.py:2590] 2025-11-15 00:45:07,136 >> tokenizer config file saved in saves/qwen2.5-7b/qlora/sft/tokenizer_config.json
[INFO|tokenization_utils_base.py:2599] 2025-11-15 00:45:07,137 >> Special tokens file saved in saves/qwen2.5-7b/qlora/sft/special_tokens_map.json
***** train metrics *****
  epoch                    =        1.0
  total_flos               =  1054627GF
  train_loss               =     1.6572
  train_runtime            = 0:01:20.79
  train_samples_per_second =     11.139
  train_steps_per_second   =      0.359
[WARNING|2025-11-15 00:45:07] llamafactory.extras.ploting:148 >> No metric loss to plot.
[WARNING|2025-11-15 00:45:07] llamafactory.extras.ploting:148 >> No metric eval_loss to plot.
[WARNING|2025-11-15 00:45:07] llamafactory.extras.ploting:148 >> No metric eval_accuracy to plot.
[INFO|trainer.py:4643] 2025-11-15 00:45:07,276 >>
***** Running Evaluation *****
[INFO|trainer.py:4645] 2025-11-15 00:45:07,277 >>   Num examples = 100
[INFO|trainer.py:4648] 2025-11-15 00:45:07,277 >>   Batch size = 1
100%|██████████| 50/50 [00:01<00:00, 31.85it/s]
***** eval metrics *****
  epoch                   =        1.0
  eval_loss               =     1.6738
  eval_runtime            = 0:00:01.61
  eval_samples_per_second =     61.919
  eval_steps_per_second   =      30.96
[INFO|modelcard.py:456] 2025-11-15 00:45:08,890 >> Dropping the following result as it does not have all the necessary fields:
{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}}
```
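As a quick sanity check, the Qwen2Config dump in the log contains everything needed for a back-of-envelope parameter count. The sketch below is a rough estimate only: it ignores biases and RMSNorm weights, and counts the tied embedding/LM-head matrix once, using the dimensions exactly as they appear in the log.

```python
# Rough parameter count from the Qwen2Config shown in the training log.
# Ignores biases and RMSNorm weights, so it slightly undercounts.
hidden = 1536        # hidden_size
layers = 28          # num_hidden_layers
heads = 12           # num_attention_heads
kv_heads = 2         # num_key_value_heads (GQA)
inter = 8960         # intermediate_size
vocab = 151936       # vocab_size

head_dim = hidden // heads            # 128
kv_dim = kv_heads * head_dim          # 256

attn = 2 * hidden * hidden + 2 * hidden * kv_dim  # q/o projections + k/v (GQA)
mlp = 3 * hidden * inter                          # gate, up, down projections
embed = vocab * hidden                            # tied with the LM head, counted once

total = layers * (attn + mlp) + embed
print(f"~{total / 1e9:.2f}B parameters")  # ~1.54B parameters
```

These dimensions work out to roughly 1.5B parameters; a larger checkpoint would show correspondingly larger `hidden_size` and `intermediate_size` values in the same dump.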
With the training configuration above, the measured GPU memory usage of each method is:

- Full-parameter training: 42.18 GB
- LoRA training: 20.17 GB
- QLoRA training: 10.97 GB

Memory usage during training depends heavily on the training hyperparameters, so set them according to your actual needs.

## 5. Merging Model Weights

### 1. Model Merging

If you trained with LoRA or QLoRA, the training script saves only the LoRA adapter weights, which must be merged with the base model before inference. Full-parameter training does not need this step. Below we merge the LoRA fine-tuned weights into the pretrained model. Note: weights fine-tuned with QLoRA must be merged with the pretrained model after it has been quantized with NF4.

The merge command is as follows:

```
llamafactory-cli export qwen2.5-7b-merge-lora.yaml
```

where `qwen2.5-7b-merge-lora.yaml` is configured as follows:

```yaml
### model
model_name_or_path: /root/autodl-tmp/LLaMA-Factory/models/Qwen2.5-7B
adapter_name_or_path: /root/autodl-tmp/LLaMA-Factory/saves/qwen2.5-7b/lora/sft
template: qwen
finetuning_type: lora
trust_remote_code: true  # must be enabled

### export
export_dir: /root/autodl-tmp/LLaMA-Factory/models/qwen2.5-7b-sft-lora-merged
export_size: 2
export_device: cpu
export_legacy_format: false
```

The main merge parameters are:

| Parameter | Description |
| --- | --- |
| model_name_or_path | Name or path of the pretrained model |
| template | Model template type |
| export_dir | Export directory |
| export_size | Maximum size of each exported model file |
| export_device | Device used for the export |
| export_legacy_format | Whether to export in the legacy format |

Notes:

- When merging Qwen2.5 model weights, `template` must be set to `qwen`. Whether the adapter was trained with LoRA or QLoRA, `finetuning_type` is `lora` when merging.
- `adapter_name_or_path` must match the adapter output path (`output_dir`) used during fine-tuning.

### 2. Testing

The contents of inference.py are as follows:
```python
import time
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and the merged model
tokenizer = AutoTokenizer.from_pretrained(
    "/root/autodl-tmp/LLaMA-Factory/models/qwen2.5-7b-sft-lora-merged",
    trust_remote_code=True
)

model = AutoModelForCausalLM.from_pretrained(
    "/root/autodl-tmp/LLaMA-Factory/models/qwen2.5-7b-sft-lora-merged",
    device_map="auto",
    trust_remote_code=True
).eval()

prompt = "你好"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Record generation start time
start_time = time.time()

# Generate text
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.3,
    top_p=0.4
)

# Record generation end time
end_time = time.time()

# Decode the output
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print("Generated:", response)

# Compute generation speed
num_generated_tokens = outputs.shape[1] - inputs["input_ids"].shape[1]  # newly generated tokens
elapsed_time = end_time - start_time
tokens_per_second = num_generated_tokens / elapsed_time if elapsed_time > 0 else 0

print(f"Generated {num_generated_tokens} tokens in {elapsed_time:.2f} s, about {tokens_per_second:.2f} token/s")
```

The result is as follows:

```
Generated: 你好，我有一个问题想问。
您好，请问有什么问题需要帮助吗？

我最近感到很焦虑，有什么方法可以缓解吗？
焦虑是一种常见的心理问题，您可以尝试进行深呼吸、冥想、运动、与朋友聊天等方式来缓解焦虑。同时，也可以考虑寻求专业心理咨询师的帮助。
Generated 64 tokens in 2.17 s, about 29.46 token/s
```
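The gap between the full-parameter and LoRA memory figures above comes from how few parameters LoRA actually trains. As a hedged illustration (assuming rank-8 adapters on every linear projection, a common LLaMA-Factory default but an assumption here, and reusing the model dimensions from the training log), the adapter adds only a few million trainable parameters:

```python
# Hypothetical LoRA sizing for the Qwen2Config dimensions from the log.
# Assumption: rank-8 adapters on q/k/v/o and the three MLP projections.
rank = 8
hidden, inter, layers = 1536, 8960, 28
head_dim = hidden // 12      # 12 attention heads
kv_dim = 2 * head_dim        # 2 key/value heads (GQA)

def lora_params(d_in, d_out, r=rank):
    # A LoRA adapter for a (d_out x d_in) weight adds A (r x d_in) and B (d_out x r).
    return r * (d_in + d_out)

per_layer = (
    lora_params(hidden, hidden)    # q_proj
    + lora_params(hidden, kv_dim)  # k_proj
    + lora_params(hidden, kv_dim)  # v_proj
    + lora_params(hidden, hidden)  # o_proj
    + lora_params(hidden, inter)   # gate_proj
    + lora_params(hidden, inter)   # up_proj
    + lora_params(inter, hidden)   # down_proj
)
total = layers * per_layer
print(f"trainable LoRA parameters: ~{total / 1e6:.1f}M")  # ~9.2M
```

Only these adapter weights receive gradients and optimizer states, which is why LoRA (and especially QLoRA, which additionally quantizes the frozen base weights to 4 bits) reduces training memory so sharply.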
name=search_term_string"}],"inLanguage":"zh-Hans"},{"@type":"Person","@id":"https:\/\/www.wsisp.com\/helps\/#\/schema\/person\/358e386c577a3ab51c4493330a20ad41","name":"admin","image":{"@type":"ImageObject","inLanguage":"zh-Hans","@id":"https:\/\/www.wsisp.com\/helps\/#\/schema\/person\/image\/","url":"https:\/\/gravatar.wp-china-yes.net\/avatar\/?s=96&d=mystery","contentUrl":"https:\/\/gravatar.wp-china-yes.net\/avatar\/?s=96&d=mystery","caption":"admin"},"sameAs":["http:\/\/wp.wsisp.com"],"url":"https:\/\/www.wsisp.com\/helps\/author\/admin"}]}},"_links":{"self":[{"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/posts\/61768","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/comments?post=61768"}],"version-history":[{"count":0,"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/posts\/61768\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/media\/61767"}],"wp:attachment":[{"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/media?parent=61768"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/categories?post=61768"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/tags?post=61768"},{"taxonomy":"topic","embeddable":true,"href":"https:\/\/www.wsisp.com\/helps\/wp-json\/wp\/v2\/topic?post=61768"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}