Running Llama 2 13B locally in chat mode with my NVIDIA A6000


Today I installed the oobabooga text-generation-webui software on my PC to test Llama 2 locally in chat mode with my NVIDIA A6000. For the test I used the Llama-2-13B-Chat-fp16 model published by TheBloke on Hugging Face. In a terminal window, nvidia-smi is running to show VRAM usage as well as GPU utilization.
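As a rough sanity check on why this setup fits on a single A6000: an fp16 model stores two bytes per parameter, so the weights of a 13B-parameter model alone need roughly 24 GiB, well under the A6000's 48 GB of VRAM. A minimal sketch of that arithmetic (the 13e9 parameter count is nominal; actual usage will be higher once the KV cache and activations are included):

```python
# Back-of-the-envelope VRAM estimate for loading a model's weights in fp16.
def model_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weights-only memory in GiB (ignores KV cache and activations)."""
    return n_params * bytes_per_param / 1024**3

# Nominal parameter count for Llama 2 13B; fp16 means 2 bytes per parameter.
weights_gib = model_vram_gib(13e9)
print(f"Llama-2-13B fp16 weights: ~{weights_gib:.1f} GiB")

a6000_vram_gb = 48  # NVIDIA RTX A6000 ships with 48 GB of VRAM
print("Fits on an A6000:", weights_gib < a6000_vram_gb)
```

This also explains the VRAM numbers you see in nvidia-smi during the test: most of the footprint is the weights themselves, with the remainder growing as the chat context gets longer.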
This is my blog: https://ai-box.eu
