ollama-local

Local LLM inference with Ollama. Use when setting up local models for development, CI pipelines, or cost reduction. Covers model selection, LangChain integration, and performance tuning.

$ Installation

git clone https://github.com/yonatangross/skillforge-claude-plugin /tmp/skillforge-claude-plugin && cp -r /tmp/skillforge-claude-plugin/.claude/skills/ollama-local ~/.claude/skills/ollama-local

// tip: Run this command in your terminal to install the skill
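Once the skill is installed and an Ollama server is running locally, inference goes through Ollama's HTTP API. A minimal sketch, assuming Ollama is listening on its default port (11434) and a model such as `llama3.2` has already been pulled with `ollama pull` (the model name here is illustrative):

```python
import json
import urllib.request

# Default endpoint for a locally running Ollama server (assumption: default port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def generate(prompt: str, model: str = "llama3.2") -> str:
    """Send a non-streaming generate request to a local Ollama server
    and return the model's full text response."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Why run models locally? Answer in one sentence."))
```

Swapping `model` for a smaller tag (e.g. a quantized variant) is the usual first step for CI pipelines, where cold-start time and memory matter more than output quality.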