Grzegorz Jacenków – Privacy Distillation: Reducing Re-identification Risk of Multimodal Diffusion Models

  • ML in PL
  • 2024-05-04

Video description: Grzegorz Jacenków – Privacy Distillation: Reducing Re-identification Risk of Multimodal Diffusion Models

Knowledge distillation in neural networks refers to compressing a large model or dataset into a smaller version of itself. We introduce Privacy Distillation, a framework that allows a text-to-image generative model to teach another model without exposing it to identifiable data. Here, we are interested in the privacy issue faced by a data provider who wishes to share their data via a multimodal generative model. A question that immediately arises is: "How can a data provider ensure that the generative model is not leaking identifiable information about a patient?" Our solution consists of (1) training a first diffusion model on real data, (2) generating a synthetic dataset using this model and filtering it to exclude images with a re-identifiability risk, and (3) training a second diffusion model on the filtered synthetic data only. We showcase that datasets sampled from models trained with privacy distillation can effectively reduce re-identification risk whilst maintaining downstream performance.
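The filtering step (2) can be illustrated with a minimal sketch. The talk does not specify the exact re-identification criterion, so this example assumes a common proxy: embed real and synthetic images (with any feature extractor) and reject synthetic samples whose nearest real neighbor exceeds a cosine-similarity threshold. The function name `filter_synthetic`, the threshold value, and the toy embeddings are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def filter_synthetic(real_emb, synth_emb, threshold=0.95):
    """Keep synthetic samples whose maximum cosine similarity to any
    real sample stays below `threshold` (a re-identification proxy)."""
    real = real_emb / np.linalg.norm(real_emb, axis=1, keepdims=True)
    synth = synth_emb / np.linalg.norm(synth_emb, axis=1, keepdims=True)
    sims = synth @ real.T                 # (n_synth, n_real) cosine similarities
    keep = sims.max(axis=1) < threshold   # True = safe to release
    return synth_emb[keep], keep

# Toy data: 50 independent synthetic embeddings plus 5 near-copies of real ones.
rng = np.random.default_rng(0)
real = rng.normal(size=(100, 32))
synth = np.vstack([rng.normal(size=(50, 32)), real[:5] + 0.01])
filtered, keep = filter_synthetic(real, synth)
# The 5 near-copies are rejected; the second diffusion model would then be
# trained only on `filtered`.
```

In practice the embeddings would come from a learned identity encoder rather than raw pixels, so that the filter targets patient identity specifically instead of generic visual similarity.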

Grzegorz Jacenków is currently a Data Scientist at Amazon, specialising in multimodal learning research and large language models (LLMs). Before joining Amazon, he was a PhD student in Healthcare AI at The University of Edinburgh, where he also earned an MSc in Artificial Intelligence, having completed a BSc in Computer Science with Business and Management at The University of Manchester. He also contributed to CERN as a technical student, working on author disambiguation for Inspire-HEP. His research interests encompass multimodal alignment, low-resource learning, and leveraging knowledge graphs.

The talk was delivered during ML in PL Conference 2023 as a part of Contributed Talks. The conference was organized by a non-profit NGO called ML in PL Association.

ML in PL Association website: https://mlinpl.org/
ML In PL Conference 2023 website: https://conference2023.mlinpl.org/
ML In PL Conference 2024 website: https://conference.mlinpl.org/
---

The ML in PL Association was founded based on the experience of organizing the ML in PL Conference (formerly PL in ML). It is a non-profit organization devoted to fostering the machine learning community in Poland and Europe and to promoting a deep understanding of ML methods. Although ML in PL is based in Poland, it seeks to provide opportunities for international cooperation.
