Local chat and code completion with Cody and Ollama (Experimental)

Try Cody for free today 👉 https://cody.dev

Learn how to enable experimental local inference for Cody in Visual Studio Code, which lets you use local LLMs served by Ollama for both chat and code completion when Internet connectivity is out of reach.
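
As a rough sketch of what the local setup involves (the setting keys below reflect Cody's experimental Ollama support and may change between extension versions, so treat them as assumptions and check the Cody docs for the current names), you start Ollama locally, pull a code model, and point Cody's autocomplete provider at it in VS Code's settings.json:

  // Terminal: start the local Ollama server and pull an example code model
  //   ollama serve
  //   ollama pull codellama:7b-code

  // VS Code settings.json (experimental; key names may differ by Cody version)
  {
    // Route autocomplete through the experimental Ollama provider
    "cody.autocomplete.advanced.provider": "experimental-ollama",
    // Local Ollama endpoint and the model to use for completions
    "cody.autocomplete.experimental.ollamaOptions": {
      "url": "http://localhost:11434",
      "model": "codellama:7b-code"
    }
  }

With the Ollama server running, Cody should also detect locally pulled models in its chat model selector, so both chat and completions can run fully offline.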

This feature is limited to Cody Free and Pro users at this time.
