Obsidian with Ollama


Instead of sending our work to ChatGPT, we can protect our precious notes and ideas with Ollama, an open-source project that lets you run powerful language models locally on your machine for free.

I cover how to install Ollama, set it up with Obsidian's Copilot plugin, and use it for AI-powered tasks like summarization, explanation, translation, and template generation – all while keeping your data private and avoiding subscription fees.
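
For reference, the basic flow looks roughly like this (a sketch assuming a macOS or Linux terminal; "llama3" is just an example model name, pick one that matches your machine's resources):

# Install Ollama on Linux (on macOS you can also download the app from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Download an example model and try it from the terminal
ollama pull llama3
ollama run llama3 "Summarize this note in three bullet points: ..."

# See which models are installed locally
ollama list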

P.S.:
When running Ollama as a local server for the Copilot plugin, make sure to start it with
"OLLAMA_ORIGINS=app://obsidian.md* ollama serve"

Timestamps:
00:00 Intro
0:36 What is a local LLM?
1:32 What is Ollama?
2:04 Install Ollama
2:26 Ollama commands!
3:09 Open up the command palette
4:30 Obsidian setup for using Ollama
5:06 Note about using the right models based on resource
5:34 Use case!
6:04 Outro

- - - - - - - - - - - - - - - - - - -
Connect with me
❤️ Newsletter: https://joonhyeokahn.substack.com/
❤️ LinkedIn: /joonhyeok-ahn
❤️ Instagram: /writer_dev123
❤️ Threads: https://www.threads.net/@writer_dev123-
- - - - - - - - - - - - - - - - -
