LocalAI LLM Single vs Multi GPU Testing scaling to 6x 4060TI 16GB GPUS

An edited version of a demo I put together for a conversation among friends about single vs. multiple GPUs when running LLMs locally. We walk through testing from a single GPU up to 6x 4060 Ti 16GB VRAM GPUs.
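The video's exact harness isn't shown here, but a common way to run this kind of single-to-6x scaling test is to restrict which GPUs the backend can see via `CUDA_VISIBLE_DEVICES` and re-run the same benchmark at each GPU count. A minimal sketch (the `visible_devices` helper and the loop are illustrative, not from the repo):

```python
import os

def visible_devices(n: int) -> str:
    # Build a CUDA_VISIBLE_DEVICES string for the first n GPUs,
    # e.g. n=3 -> "0,1,2". Limiting this variable is a simple way
    # to benchmark the same model on 1, 2, ... 6 GPUs in turn.
    return ",".join(str(i) for i in range(n))

for n in range(1, 7):
    os.environ["CUDA_VISIBLE_DEVICES"] = visible_devices(n)
    # ...launch the LLM backend and record tokens/sec here...
    print(f"testing with {n} GPU(s): {os.environ['CUDA_VISIBLE_DEVICES']}")
```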

Github Repo: https://github.com/kkacsh321/st-multi...
See the Streamlit app and results here: https://gputests.robotf.ai/

Recorded and best viewed in 4K