Installing Ollama to Customize My Own LLM

Ollama is the easiest way to get started running LLMs on your own hardware. In my first video, I explore how to use Ollama to download popular models like Phi and Mistral, chat with them directly in the terminal, serve them over an HTTP API, and finally customize our own model based on Phi to be more fun to talk to.
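The workflow described above can be sketched with a few terminal commands. This is a minimal sketch, not a transcript of the video: the install one-liner is Ollama's documented Linux installer (macOS users download the app instead), and the prompt text is illustrative.

```shell
# Install Ollama on Linux (assumption: the documented install script)
curl -fsSL https://ollama.ai/install.sh | sh

# Download the Phi model, then chat with it interactively in the terminal
ollama pull phi
ollama run phi        # type /bye to exit the chat

# The local API listens on port 11434 by default;
# query it with a plain HTTP request
curl http://localhost:11434/api/generate -d '{
  "model": "phi",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

With `"stream": false` the API returns a single JSON object containing the full response; omit it to receive a stream of JSON chunks.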

Watch my other Ollama videos - • Get Started with Ollama

Links:
Code from video - https://decoder.sh/videos/installing-...
Ollama - https://ollama.ai
Phi Model - https://ollama.ai/library/phi
More great LLM content - @matthew_berman

Timestamps:
00:00 - Intro
00:29 - What is Ollama?
00:41 - Installation
00:53 - Using Ollama CLI
02:06 - Chatting with Phi
02:41 - Ollama API
04:36 - Inspecting Phi's Modelfile
06:27 - Creating our own Modelfile
07:34 - Creating the model
08:25 - Running our new model
08:48 - Closing words
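The customization steps in the timestamps above revolve around a Modelfile. A minimal sketch of one based on Phi (the persona, parameter value, and model name `fun-phi` are illustrative assumptions, not taken from the video):

```
# Build on the Phi base model
FROM phi

# Raise temperature for more playful, varied replies (illustrative value)
PARAMETER temperature 1

# System prompt that shapes the model's personality (illustrative)
SYSTEM """
You are a cheerful, playful assistant who loves puns and keeps answers short.
"""
```

Save this as `Modelfile`, then build and run the custom model with `ollama create fun-phi -f Modelfile` followed by `ollama run fun-phi`.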
