Reliable, fully local RAG agents with LLaMA3


With the release of LLaMA3, we're seeing great interest in agents that can run reliably and locally (e.g., on your laptop). Here, we show how to build reliable local agents from scratch using LangGraph and LLaMA3-8b. We combine ideas from 3 advanced RAG papers (Adaptive RAG, Corrective RAG, and Self-RAG) into a single control flow. We run this locally with a local vectorstore c/o @nomic_ai & @trychroma, @tavilyai for web search, and LLaMA3-8b via @ollama.
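To make the combined control flow concrete, here is a minimal, library-free Python sketch of how the three ideas chain together: Adaptive RAG routes the question, Corrective RAG grades the retrieved documents and falls back to web search, and a Self-RAG-style check gates the final answer. All function names and the keyword-overlap "grader" are hypothetical stand-ins; in the video these steps are LLaMA3-8b prompts wired up as LangGraph nodes, with a Nomic/Chroma vectorstore and Tavily for search.

```python
from typing import Callable, List

def overlap(question: str, doc: str) -> bool:
    """Toy relevance check: any shared word (stand-in for an LLM grader)."""
    return bool(set(question.lower().split()) & set(doc.lower().split()))

def rag_agent(question: str,
              index: List[str],
              index_topics: List[str],
              web_search: Callable[[str], List[str]]) -> dict:
    # Adaptive RAG: route to the vectorstore if on-topic, else to web search.
    if any(topic in question.lower() for topic in index_topics):
        source = "vectorstore"
        docs = [d for d in index if overlap(question, d)]
    else:
        source = "web_search"
        docs = web_search(question)
    # Corrective RAG: grade the documents; if none pass, fall back to search.
    graded = [d for d in docs if overlap(question, d)]
    if not graded and source == "vectorstore":
        source = "web_search"
        graded = web_search(question)
    # Self-RAG-style check: only answer when grounded in at least one document.
    answer = graded[0] if graded else "I don't know."
    return {"source": source, "answer": answer}
```

In the real graph each of these branches is a LangGraph node and the `if`s become conditional edges, but the routing/grading/fallback logic is the same shape:

```python
index = ["agents use planning and memory"]
result = rag_agent("how do agents use memory", index,
                   index_topics=["agents"],
                   web_search=lambda q: ["llama3 release notes"])
# On-topic question → routed to the vectorstore, answer grounded in the index.
```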

Code:
https://github.com/langchain-ai/langg...
