When 'Seeing is Believing' Falters: AI, Visual Truth and Epistemic Fracture | Sam Gregory

  • All Tech Is Human
  • 2025-11-03
  • 391 views

Tags: sam gregory, deepfakes, all tech is human, responsible tech, responsible tech summit, responsible ai


Video description: When 'Seeing is Believing' Falters: AI, Visual Truth and Epistemic Fracture | Sam Gregory

Sam Gregory delivered his talk, When 'Seeing is Believing' Falters: AI, Visual Truth and Epistemic Fracture, at All Tech Is Human's Responsible Tech Summit, held in NYC on October 27, 2025 and sponsored by Mastercard and Pinterest. Sam Gregory is the Executive Director of WITNESS.

All Tech Is Human is an organization at the forefront of the Responsible Tech movement, shaping its direction through multistakeholder community-building, education, and career-related activities. Our non-profit organization is committed to tackling thorny tech & society issues and aligning our tech future with the public interest. We do this by leveraging collective intelligence, involvement, and action.

Our range of activities includes regular in-person gatherings (NYC, London, DC, SF), a bi-monthly livestream series, a large Slack community (13k members, 114 countries), our annual Responsible Tech Guide, our popular Responsible Tech Job Board, and regular workshops and reports. We also recently released a series of five short Responsible AI courses. Our work is made possible through the philanthropic support of the Patrick J. McGovern Foundation, Siegel Family Endowment, and others. Visit AllTechIsHuman.org to learn more.

Sam Gregory is a human rights advocate and technologist fighting deepfakes and deceptive AI and deploying the power of video and technology to defend human rights and protect truth and evidence. As Executive Director of WITNESS, which received the inaugural Peabody Global Impact Award for "media or organizations that have profoundly changed the world for the better" and for "tirelessly championing the power of emergent media technologies in defense of human rights around the world," he leads a global team that empowers millions to use video and technology for human rights. He launched the groundbreaking "Prepare, Don't Panic" initiative, tackling deepfakes and generative AI through policy and standards influence, public debate, and the first-ever Deepfakes Rapid Response Force. Sam has testified before the US Congress and Senate on AI and media transparency, delivered a TED Talk on combating malicious AI, and has over 25 years of global experience innovating at the intersection of video, technology, and human rights.

Sam's expertise is regularly featured in major media outlets, and he taught the first course at Harvard on participatory media and human rights. Sam has served on the ICC Technology Advisory Board, co-chaired key initiatives and working groups for the Partnership on AI and the Coalition for Content Provenance and Authenticity, and completed his PhD at the University of Westminster focused on participatory media, human rights activism, AI, and trust.

===
Sam Gregory, Executive Director of WITNESS, opened by illustrating how generative AI has destabilized public trust in visual and auditory media. He presented striking examples of AI-manipulated or falsely attributed media: still photos animated into fake videos, fabricated protest footage, and genuine content wrongly labeled as fake. Gregory explained that misinformation now operates on two fronts—deception (deepfakes that fabricate events) and denial (using “the AI alibi” to discredit real evidence). About one-third of WITNESS’s current cases involve such denials. This pervasive uncertainty, he argued, is fueling an “epistemic crisis,” where the shared basis of verifiable reality—crucial for journalism, human-rights documentation, and democratic debate—is rapidly eroding.

Rather than relying on “individual vigilance,” Sam urged systemic solutions to strengthen public confidence in what we see and hear. He called for four urgent actions: (1) build resilient infrastructure with provenance systems such as watermarking and embedded metadata that disclose AI’s role in content creation (e.g., the C2PA standard); (2) ensure privacy-protecting, enforceable regulation to make authenticity disclosures mandatory; (3) improve detection systems—while acknowledging they rarely exceed 85–90% accuracy—and make them globally inclusive; and (4) invest in tools and training for journalists, election monitors, and human-rights defenders to authenticate evidence and protect individuals’ likenesses. He also stressed the need for new legal safeguards and alerts against unauthorized digital cloning.
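The provenance idea behind point (1) can be sketched as a toy example: bind a signed manifest recording who made a piece of content and whether AI was involved to a cryptographic hash of the content, so that any edit to the media invalidates the claim. This is only an illustration of the general mechanism, not the C2PA standard itself, which uses certificate-based signatures and embeds manifests inside the media file; every name below (`SIGNING_KEY`, `make_manifest`, `verify_manifest`) is invented for this sketch.

```python
import hashlib
import hmac
import json

# Stands in for a real signing credential (C2PA uses X.509 certificates).
SIGNING_KEY = b"demo-secret-key"

def make_manifest(content: bytes, creator: str, ai_used: bool) -> dict:
    """Build a provenance claim bound to the content's SHA-256 hash, then sign it."""
    claim = {
        "creator": creator,
        "ai_used": ai_used,
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return claim

def verify_manifest(content: bytes, manifest: dict) -> bool:
    """Check both that the claim is authentically signed and that it matches the bytes."""
    claim = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, manifest["signature"])
            and claim["content_sha256"] == hashlib.sha256(content).hexdigest())

video = b"...raw media bytes..."
manifest = make_manifest(video, creator="newsroom", ai_used=False)
assert verify_manifest(video, manifest)           # untouched content verifies
assert not verify_manifest(video + b"x", manifest)  # any edit breaks the binding
```

The design point this illustrates is the one Gregory makes: disclosure travels with the content as verifiable metadata rather than depending on each viewer's individual vigilance.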

Sam concluded that the loss of visual truth is not just a technical failure but a crisis of power. When truth becomes unverifiable, he warned, the experiences of marginalized communities are the first to be dismissed, echoing Hannah Arendt’s warning that a society unable to believe anything becomes manipulable. Preserving “what is human, real, and true” in an AI-saturated world, he argued, demands urgent, coordinated action across technology, law, and civil society—before our collective capacity to discern reality is irreparably fractured.

