Powering Your Generative AI Workloads with AMD and Open-Source ROCm

Farshad Ghodsian, Sourced Group

In today's generative AI ecosystem, there is a strong emphasis on expensive AI hardware and proprietary CUDA implementations. While CUDA has undeniably played a crucial role in the success of generative AI, I'd like to share my experience running generative AI workloads and applications on cost-effective AMD hardware and the open-source ROCm software stack. This alternative approach gives users greater flexibility, allowing them to deploy their generative AI solutions across a wider range of hardware and software choices than ever before. Learn how to run your favourite open-source large language and image generation models using ROCm; how far ROCm has come from previous versions and which features are currently supported, including PyTorch support, Optimum-AMD, Flash Attention 2, GPTQ, and vLLM; and how more affordable workstation-class AMD GPUs compare to their Nvidia counterparts in performance and inference speed. You will also see several demos of ROCm in action, along with tips and pitfalls to watch out for when working with AMD GPUs.
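As a minimal illustration of the PyTorch support mentioned above: on ROCm builds of PyTorch, AMD GPUs are exposed through the familiar `torch.cuda` API, and `torch.version.hip` carries the HIP version string (it is `None` on CUDA builds). The sketch below, with an illustrative helper name of my own choosing, reports which backend an installation was built for and degrades gracefully when PyTorch is not installed.

```python
# Sketch: detecting whether the installed PyTorch is a ROCm (HIP) build.
# ROCm wheels drive AMD GPUs through the usual torch.cuda API, and set
# torch.version.hip to the HIP version string (None on CUDA builds).
# describe_backend is an illustrative name, not something from the talk.
import importlib.util


def describe_backend() -> str:
    """Return a short description of the PyTorch build's GPU backend."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch

    hip = getattr(torch.version, "hip", None)
    if hip:
        return f"ROCm/HIP build {hip}"
    if torch.version.cuda:
        return f"CUDA build {torch.version.cuda}"
    return "CPU-only build"


if __name__ == "__main__":
    print(describe_backend())
```

Note that ROCm wheels are published on a separate PyTorch package index (selectable on the pytorch.org install page via `pip install torch --index-url ...`), so the same `torch.cuda`-based code can run unchanged on AMD hardware.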
