Hands-on: Spring AI with Ollama and Microsoft Phi-3 🚀 🦙 | Run LLMs locally and connect from Java


This video covers how to run LLMs locally using Ollama and connect to them from Java with Spring AI, with a hands-on example built on Spring Boot and the Ollama libraries.
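For context, a minimal sketch of what the demo wires together: Ollama serves models over a local REST API (by default on port 11434, with a `POST /api/generate` endpoint), and Spring AI's Ollama starter abstracts these calls behind a `ChatClient`. The snippet below skips the Spring plumbing and calls the raw Ollama endpoint directly with the JDK's `HttpClient`, so you can verify your local `phi3` model is reachable before adding the Spring AI dependency. The model name and prompt are illustrative; it assumes you have already run `ollama run phi3` locally.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaDemo {

    // Build the JSON body expected by Ollama's /api/generate endpoint.
    // stream=false asks for a single JSON response instead of chunks.
    static String buildRequestBody(String model, String prompt) {
        return "{\"model\":\"" + model + "\",\"prompt\":\"" + prompt + "\",\"stream\":false}";
    }

    public static void main(String[] args) {
        String body = buildRequestBody("phi3", "Why is the sky blue?");
        System.out.println("Request body: " + body);
        try {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:11434/api/generate")) // Ollama's default port
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
        } catch (Exception e) {
            // Reached when no local Ollama server is running.
            System.out.println("Ollama not reachable on localhost:11434; start it with `ollama run phi3`.");
        }
    }
}
```

In the actual Spring AI setup shown in the video, this manual HTTP call is replaced by the framework's auto-configured client, driven by `application.properties` entries such as the Ollama base URL and model name.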

Join this channel by contributing to the community:
   / @techprimers  

📌 Related Playlist
================
🔗 AI Primer Playlist -    • AI Primer
🔗 Spring Boot Primer -    • Spring Boot Primer
🔗 Spring Cloud Primer -    • Spring Cloud Primer
🔗 Spring Microservices Primer -    • Spring Microservices Primer
🔗 Spring JPA Primer -    • Spring JPA Primer
🔗 Java 8 Streams -    • Java 8 Streams
🔗 Spring Security Primer -    • Spring Security Primer

💪 Join TechPrimers Slack Community: https://bit.ly/JoinTechPrimers
📟 Telegram: https://t.me/TechPrimers
🧮 TechPrimer HindSight (Blog):   / techprimers  
☁️ Website: http://techprimers.com
💪 Slack Community: https://techprimers.slack.com
🐦 Twitter:   / techprimers  
📱 Facebook: http://fb.me/TechPrimers
💻 GitHub: https://github.com/TechPrimers or https://techprimers.github.io/

🎬 Video Editing: FCP

---------------------------------------------------------------
🔥 Disclaimer/Policy:
The content/views/opinions posted here are solely mine, and the code samples created by me are open sourced.
You are free to fork the code samples on GitHub and modify them for your own use.
All videos posted here are copyrighted. You may not re-distribute videos from this channel on other channels or platforms.
#SpringAI #Ollama #TechPrimers
