How to run AI workload on K3s cluster - Emrah Sifoğlu

Running an AI workload on a Single Board Computer might actually be possible if Python and the required libraries can be installed.

However, performance-wise it would not be that good, especially when we want to train models, because there is no GPU to handle the heavy calculations.

Or is there? What if we could use an ARM-based device with dedicated units that handle these calculations at decent performance inside the cluster?

My presentation is shaped around the idea of setting up a high-availability cluster using K3s on Raspberry Pi 4B/5 boards along with a Jetson Nano to run AI workloads.

The first part will be about how to set up a Type 1 hypervisor such as VMware ESXi or Proxmox.

The second part will be about choosing and installing Linux operating systems such as Debian, Ubuntu or even Photon OS, which should be powerful enough to run control planes and agents.

Then I will introduce you to the lightweight version of Kubernetes (K8s) called K3s, talk about how to set up a cluster, and show how to automate repeated tasks using Ansible.
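
To give a taste of the automation part, here is a minimal sketch of an Ansible task that installs K3s in server mode using the official installer script; the host group name control_plane and the idempotency check are assumptions for illustration, not the exact playbook from the talk.

```yaml
# Minimal sketch: install K3s in server mode on the control-plane hosts.
# The host group name is an assumption for illustration.
- hosts: control_plane
  become: true
  tasks:
    - name: Install K3s server via the official installer script
      ansible.builtin.shell: |
        curl -sfL https://get.k3s.io | sh -s - server
      args:
        creates: /usr/local/bin/k3s   # skip the task if K3s is already installed
```

Agent nodes would join with a similar task that passes the server URL and the node token to the same installer script.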

Lastly, I will do a demo and make sure that the AI workload is only scheduled on the Jetson Nano, where its CUDA cores can be used.
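
One way to achieve this, sketched below, is to label the Jetson Nano node and pin the workload to it with a nodeSelector; the label key and value, the container image and the command are hypothetical placeholders, not the exact manifest from the demo.

```yaml
# Minimal sketch: pin a Pod to the Jetson Nano node via a node label.
# Assumes the node was labelled beforehand, e.g.:
#   kubectl label node jetson-nano accelerator=jetson-nano
apiVersion: v1
kind: Pod
metadata:
  name: ai-workload
spec:
  nodeSelector:
    accelerator: jetson-nano                     # only nodes carrying this label are eligible
  containers:
    - name: trainer
      image: nvcr.io/nvidia/l4t-ml:r32.7.1-py3   # example L4T image with CUDA/ML libraries
      command: ["python3", "train.py"]           # hypothetical training script
```

A taint on the Jetson node with a matching toleration on the Pod could additionally keep non-AI workloads off the device, but the nodeSelector alone is enough to keep the AI Pod on the right node.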
