Risk Verification of AI-enabled Autonomous Systems

  • Lars Lindemann
  • 2022-07-16

Description

A talk on "Risk Verification of AI-enabled Autonomous Systems" that I gave at the RSS workshop on Risk Aware Decision Making (https://sites.google.com/nyu.edu/risk...).

Abstract: AI-enabled autonomous systems promise to enable many future technologies such as autonomous driving, intelligent transportation, and robotics. Accelerated by computational advances in machine learning and AI, the development of autonomous systems has seen tremendous success in recent years. At the same time, however, fundamental new questions have been raised about the safety of these increasingly complex systems, which often operate in uncertain environments. In fact, such systems have been observed to take excessive risks in certain situations, often due to the use of neural networks, which are known for their fragility. In this seminar, I will provide new insights into how to conceptualize risk for AI-enabled autonomous systems, and how to verify these systems in terms of their risk.
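
The abstract does not commit to a particular risk measure at this point, but to make "risk" concrete it helps to recall the two standard risk measures from the finance literature (recalled here as background, not taken from the talk): the Value-at-Risk and the Conditional Value-at-Risk of a loss random variable $X$ at level $\beta \in (0, 1)$:

$$\mathrm{VaR}_\beta(X) = \inf\{\, z \in \mathbb{R} : P(X \le z) \ge \beta \,\}$$

$$\mathrm{CVaR}_\beta(X) = \inf_{z \in \mathbb{R}} \left\{ z + \frac{1}{1-\beta}\, \mathbb{E}\big[(X - z)^+\big] \right\}, \quad \text{where } (x)^+ = \max(x, 0).$$

Intuitively, $\mathrm{CVaR}_\beta(X)$ is the expected loss in the worst $(1-\beta)$-fraction of outcomes, so it captures exactly the kind of tail behavior, rare but excessive risk-taking, that the abstract attributes to fragile neural network components.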

The main idea that I would like to convey in this talk is to use notions of spatial and temporal robustness to systematically define risk for autonomous systems. We are particularly motivated by the fact that the safe deployment of autonomous systems critically relies on their robustness, e.g., against modeling or perception errors. In the first part of the talk, we will consider spatial robustness, which can be understood in terms of safe tubes around nominal system trajectories. I will then show how risk measures, classically used in finance, can be used to quantify the risk of lacking robustness against failure, and how we can reliably estimate this robustness risk from finite data with high confidence. We will compare and verify four different neural network controllers in terms of their risk for a self-driving car in the autonomous driving simulator CARLA.

In the second part of the talk, we will take a closer look at temporal robustness, which has been studied much less than spatial robustness despite its importance, e.g., for timing uncertainties in autonomous driving. I will introduce the notions of synchronous and asynchronous temporal robustness to quantify the robustness of system trajectories against various forms of timing uncertainties, and then use risk measures to quantify the risk of lacking temporal robustness against failure. Finally, I will show that both notions of spatial and temporal robustness risk can be used for general forms of safety specifications, including temporal logic specifications.
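
To connect this to the verification procedure described above, here is a minimal sketch of how the robustness risk could be estimated from finite data, assuming the risk measure is instantiated as the CVaR defined earlier and that a scalar spatial-robustness value (distance to leaving the safe tube, positive meaning safe) has been computed for each sampled closed-loop trajectory. All names and the stand-in data are hypothetical; this is not the talk's implementation.

```python
import numpy as np

def empirical_cvar(losses: np.ndarray, beta: float = 0.95) -> float:
    """Empirical Conditional Value-at-Risk at level beta:
    the average of the worst (1 - beta) fraction of losses."""
    losses = np.sort(np.asarray(losses, dtype=float))
    # Position of the empirical beta-quantile (Value-at-Risk).
    var_index = int(np.ceil(beta * len(losses))) - 1
    return float(losses[var_index:].mean())

# Hypothetical data: one spatial-robustness value per simulated trajectory
# (e.g., per CARLA rollout of a neural network controller). Positive values
# mean the trajectory stayed inside the safe tube; we treat the negated
# robustness as a loss, so high loss = lacking robustness.
rng = np.random.default_rng(seed=0)
robustness = rng.normal(loc=1.0, scale=0.5, size=10_000)  # stand-in samples
loss = -robustness

print(f"Empirical CVaR_0.95 of the robustness loss: {empirical_cvar(loss):.3f}")
```

The abstract's point about reliable estimation "from finite data with high confidence" suggests that such a point estimate would additionally be wrapped in a concentration bound on the empirical distribution; that construction is not reproduced here.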
