Ollama: Run LLMs Locally On Your Computer (Fast and Easy)

With Ollama, you can run open-source LLMs locally on your own computer, easily and for free. This tutorial walks through how to install and use Ollama, how to access it via its local REST API, and how to call it from a Python app (using a client library like LangChain).
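The local REST API mentioned above can be exercised with nothing but Python's standard library. This is a hedged sketch: the endpoint path and port are Ollama's documented defaults, while the model name "llama2" is an assumption — substitute any model you have pulled.

```python
# Minimal sketch of calling Ollama's local REST API using only the standard
# library. Assumes the Ollama server is running on its default port (11434)
# and that the model named here ("llama2" -- an assumption; use any model
# you have pulled) is available locally.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    # "stream": False asks the server for one complete JSON object
    # instead of a stream of partial responses.
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

def generate(prompt: str, model: str = "llama2") -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The completed text is in the "response" field of the reply.
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server):
#   print(generate("Why is the sky blue?"))
```

The same endpoint also streams token-by-token JSON lines when "stream" is left at its default of true, which is what the Ollama CLI uses under the hood.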

👉 Links
🔗 Ollama GitHub: https://github.com/ollama
🔗 LLM Library: https://ollama.com/library
🔗 RAG + Langchain Python Project:    • RAG + Langchain Python Project: Easy ...  

📚 Chapters
00:00 How To Run LLMs Locally
01:07 Install Ollama
02:45 Ollama Server and API
04:15 Using Ollama Via Langchain
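For the LangChain route covered in the final chapter, a minimal sketch might look like the following. The package name (`langchain-community`) and the model name ("llama2") are assumptions based on the library around the time of the video; the import is done lazily so the file loads even without LangChain installed.

```python
# Hedged sketch: querying a local Ollama model through LangChain.
# Assumes `pip install langchain-community` and a running Ollama server;
# the model name "llama2" is an assumption -- use any model you've pulled.
def ask_ollama(question: str, model: str = "llama2") -> str:
    # Imported inside the function so this file loads without LangChain.
    from langchain_community.llms import Ollama

    llm = Ollama(model=model)    # connects to http://localhost:11434 by default
    return llm.invoke(question)  # returns the model's text completion

# Example (requires LangChain and a running Ollama server):
#   print(ask_ollama("Why is the sky blue?"))
```

Wrapping the model behind LangChain's common interface means the rest of a pipeline (prompts, chains, retrieval) does not need to know it is talking to a local server rather than a hosted API.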
