Ollama Function Calling: LangChain & Llama 3.1 🦙


In this video, I introduce the tool support feature in Ollama that enables local function calling on your machine. We'll explore the easy setup, deployment options (local and cloud), and how to use different models, notably Llama 3.1 and Groq's tool-use model. I'll demonstrate practical examples, including weather queries for San Francisco, opening applications like the calculator and Chrome, and asking Claude AI questions. I'll walk through the implementation using Bun, LangChain, and TypeScript, and share the code via a GitHub repository. Perfect for enhancing your chat applications and more!
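The core of Ollama's tool support is that you describe your functions as JSON schemas, the model replies with a tool call, and your code executes it locally. A minimal sketch of that loop in TypeScript is below; the tool name `get_current_weather`, its stubbed result, and the dispatcher are illustrative assumptions, not code from the repo, though the `tools` schema shape matches what Ollama's `/api/chat` endpoint expects.

```typescript
// Shape of a tool call as returned in an Ollama chat response.
type ToolCall = { function: { name: string; arguments: Record<string, unknown> } };

// Tool definition in the JSON-schema format Ollama's `tools` field expects.
// The name and parameters here are illustrative.
const weatherTool = {
  type: "function",
  function: {
    name: "get_current_weather",
    description: "Get the current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name, e.g. San Francisco" },
      },
      required: ["city"],
    },
  },
};

// Local implementations keyed by tool name. The weather result is stubbed;
// a real app would call a weather API here.
const implementations: Record<string, (args: Record<string, unknown>) => string> = {
  get_current_weather: (args) => `Weather in ${args.city}: 18°C and foggy`,
};

// Execute whichever tool the model asked for and return its result,
// which you would then send back to the model as a `tool` role message.
function dispatch(call: ToolCall): string {
  const fn = implementations[call.function.name];
  if (!fn) throw new Error(`Unknown tool: ${call.function.name}`);
  return fn(call.function.arguments);
}

console.log(
  dispatch({
    function: { name: "get_current_weather", arguments: { city: "San Francisco" } },
  })
);
```

In the video this dispatch step sits inside a chat loop: the model's response either contains plain text or one or more tool calls, and each call's result is appended to the conversation before asking the model to continue.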

Links:
https://ollama.com/
Repo: https://github.com/developersdigest/o...

00:00 Introduction to Ollama Tool Support
00:38 Setting Up and Running Models
01:59 Example Functions and Use Cases
03:16 Implementing Function Calling
04:48 Handling Different Operating Systems
08:26 Final Thoughts and Conclusion
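The "Handling Different Operating Systems" chapter covers launching applications like the calculator or Chrome from a tool call, which requires a different shell command per platform. A hedged sketch of that branching (the exact commands in the video's repo may differ) looks like this:

```typescript
// Pick a shell command to open an application by name, depending on the OS.
// You would pass the returned string to child_process.exec.
function openCommand(app: string, platform: string): string {
  switch (platform) {
    case "darwin":
      // macOS: `open -a` launches an app by its display name.
      return `open -a "${app}"`;
    case "win32":
      // Windows: `start` with an empty title argument, then the app.
      return `start "" "${app}"`;
    default:
      // Linux and others: there is no universal app launcher; many apps are
      // invoked by binary name, so this lowercased fallback is a rough guess.
      return app.toLowerCase();
  }
}

console.log(openCommand("Calculator", "darwin")); // → open -a "Calculator"
```

A usage note: in a tool-calling setup, `openCommand` would be wired into a tool named something like `open_application` (an illustrative name), with the app name supplied by the model from the user's request.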
