LLMs for Advanced Question-Answering over Tabular/CSV/SQL Data (Building Advanced RAG, Part 2)

In the second video of this series, we show you how to compose a simple-to-advanced query pipeline over tabular data. This includes using LLMs to infer both Pandas operations and SQL queries, and pulling in RAG concepts for advanced capabilities such as few-shot table and row selection over multiple tables.

LlamaIndex Query Pipelines make it possible to express these complex pipeline DAGs in a concise, readable, and visual manner. It's easy to add few-shot examples and to chain together prompts, LLMs, custom functions, retrievers, and more.
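To give a flavor of the API, here is a minimal Query Pipeline sketch that chains a prompt template into an LLM. The prompt text and model name are illustrative placeholders, not the exact ones used in the video:

```python
# Minimal Query Pipeline sketch: a prompt template chained into an LLM.
# The prompt string and model name below are illustrative placeholders.
from llama_index.core import PromptTemplate
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.openai import OpenAI

prompt_tmpl = PromptTemplate(
    "Write a one-line Pandas expression that answers: {query_str}"
)
llm = OpenAI(model="gpt-3.5-turbo")

# chain= wires the modules into a linear DAG; add_modules/add_link support
# arbitrary DAGs with few-shot examples, retrievers, custom functions, etc.
qp = QueryPipeline(chain=[prompt_tmpl, llm], verbose=True)
output = qp.run(query_str="What is the average age of passengers?")
print(str(output))
```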

Colab notebook used in this video: https://colab.research.google.com/dri...

This presentation was taken from our documentation guides - check them out 👇

Text-to-SQL: https://docs.llamaindex.ai/en/stable/...
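As a rough sketch of the basic text-to-SQL setup from that guide (the SQLite URL and table name here are placeholders):

```python
# Basic text-to-SQL sketch: the LLM writes a SQL query over the given tables
# and synthesizes a natural-language answer from the result.
# The database URL and table name are illustrative placeholders.
from sqlalchemy import create_engine
from llama_index.core import SQLDatabase
from llama_index.core.query_engine import NLSQLTableQueryEngine

engine = create_engine("sqlite:///city_stats.db")
sql_database = SQLDatabase(engine, include_tables=["city_stats"])

query_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,
    tables=["city_stats"],
)
response = query_engine.query("Which city has the highest population?")
print(str(response))
```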

Text-to-Pandas: https://docs.llamaindex.ai/en/stable/...
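And a minimal text-to-Pandas sketch, assuming the PandasQueryEngine from the llama-index-experimental package (the CSV path is a placeholder; the video builds an equivalent pipeline out of prompts and an output parser):

```python
# Basic text-to-Pandas sketch: the LLM infers a Pandas expression over the
# DataFrame and executes it. The CSV path is an illustrative placeholder.
import pandas as pd
from llama_index.experimental.query_engine import PandasQueryEngine

df = pd.read_csv("titanic_train.csv")
query_engine = PandasQueryEngine(df=df, verbose=True)
response = query_engine.query("What is the correlation between survival and age?")
print(str(response))
```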

Timeline:
00:00-06:18 - Intro
06:18-12:13 - Text-to-Pandas (Basic)
12:13-27:05 - Query-Time Table Retrieval for Advanced Text-to-SQL
27:05 - Query-Time Row Retrieval for Advanced Text-to-SQL
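For the query-time table retrieval sections above, here is a rough sketch under current LlamaIndex imports: table schemas are indexed as objects, and only the most relevant schema(s) are retrieved into the text-to-SQL prompt at query time. It assumes the `sql_database` object from the text-to-SQL sketch; row retrieval (the last section) adds per-table indexing of row values in the same spirit.

```python
# Query-time table retrieval sketch: index SQLTableSchema objects and retrieve
# only the most relevant table schema(s) into the text-to-SQL prompt.
# Assumes `sql_database` was built as in the text-to-SQL sketch above.
from llama_index.core import VectorStoreIndex
from llama_index.core.objects import ObjectIndex, SQLTableNodeMapping, SQLTableSchema
from llama_index.core.indices.struct_store.sql_query import SQLTableRetrieverQueryEngine

table_node_mapping = SQLTableNodeMapping(sql_database)
table_schema_objs = [
    SQLTableSchema(table_name=name) for name in sql_database.get_usable_table_names()
]
obj_index = ObjectIndex.from_objects(
    table_schema_objs,
    table_node_mapping,
    VectorStoreIndex,
)

query_engine = SQLTableRetrieverQueryEngine(
    sql_database,
    obj_index.as_retriever(similarity_top_k=1),
)
response = query_engine.query("Which city has the highest population?")
print(str(response))
```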
