<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Ollama on Weiming's Blog</title><link>https://axerzone.cn/tags/ollama/</link><description>Recent content in Ollama on Weiming's Blog</description><generator>Hugo -- gohugo.io</generator><language>zh-cn</language><copyright>© 2026 Weiming</copyright><lastBuildDate>Fri, 05 Dec 2025 00:00:00 +0000</lastBuildDate><atom:link href="https://axerzone.cn/tags/ollama/index.xml" rel="self" type="application/rss+xml"/><item><title>Deploying LLMs Locally with Ollama</title><link>https://axerzone.cn/posts/ollama-local-llm/</link><pubDate>Fri, 05 Dec 2025 00:00:00 +0000</pubDate><guid>https://axerzone.cn/posts/ollama-local-llm/</guid><description>&lt;h1 class="relative group"&gt;Running LLMs Locally with Ollama, It Actually Works
 &lt;div id="用-ollama-在本地跑大模型真的可以" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#%e7%94%a8-ollama-%e5%9c%a8%e6%9c%ac%e5%9c%b0%e8%b7%91%e5%a4%a7%e6%a8%a1%e5%9e%8b%e7%9c%9f%e7%9a%84%e5%8f%af%e4%bb%a5" aria-label="anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h1&gt;

&lt;h2 class="relative group"&gt;What Is Ollama
 &lt;div id="ollama-是什么" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#ollama-%e6%98%af%e4%bb%80%e4%b9%88" aria-label="anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;I always assumed that running a large language model required a few A100s, until someone told me, &amp;quot;just try Ollama&amp;quot;.&lt;/p&gt;</description></item></channel></rss>