Guide to Tuning the Many Hyperparameters of a Genetic Algorithm (GA)
  • Ted Pavlic
  • 2026-02-02

Video description

In a Genetic Algorithm (GA), there are five key hyperparameters – population size, number of parents, number of elites, crossover rate, and mutation rate – along with the hyperparameters of a selection operator that adjust so-called selection pressure. In this video, I describe the collective effect of these six hyperparameters on the performance of a Genetic Algorithm. I describe how the population size (M) represents a computational cost paid to increase the general accuracy of the algorithm, allowing it to innovate through increased capacity. Within a given population size, however, the other parameters adjust the dynamics of the search.

The number of parents (R) sets the amount of background information retention in the system, so that the difference M-R (which I call reproductive skew) sets the potential for exploration of new solutions. That novelty is only possible through mutation, set by the mutation rate (Pm), with the shape of trajectories to new candidate solutions significantly modulated by the crossover rate (Pc), which happens to have little effect once no diversity is left in the system. On top of all of these parameters is the selection pressure (tuned in different ways for different selection operators), which represents how much greedy pressure there is for satisficing (i.e., converging on a good-enough local solution as opposed to searching for a better global solution).

I try to capture all of this in different graphical frameworks to help remember how these parameters relate to exploration and exploitation/fine-tuning, and I close with a characterization of evolutionary systems (in general) as drift fields that inevitably switch from exploration to exploitation to random steady-state movement. It is the goal of the operations researcher employing this optimization metaheuristic to tune the hyperparameters to best navigate this "drift field" space.
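The interaction of these hyperparameters can be made concrete with a minimal GA loop. This is a sketch, not code from the video: the operator choices (truncation of the top R individuals as the parent pool, tournament selection with size k as the selection-pressure knob, single-point crossover, and Gaussian per-gene mutation) and all numeric settings are illustrative assumptions.

```python
import random

def tournament_select(parents, fitnesses, k):
    """Pick the best of k random contestants; larger k = higher selection pressure."""
    contestants = random.sample(range(len(parents)), k)
    return parents[min(contestants, key=lambda i: fitnesses[i])]  # minimization

def ga_step(pop, fitness_fn, R, E, Pc, Pm, k):
    """One generation: M individuals in, M individuals out.

    R = number of parents (information retention; M - R is reproductive skew)
    E = number of elites copied through unchanged
    Pc = crossover rate, Pm = per-gene mutation rate, k = tournament size
    """
    ranked = sorted(pop, key=fitness_fn)
    elites = [x[:] for x in ranked[:E]]            # E elites survive unchanged
    parents = ranked[:R]                           # only the top R reproduce
    pfit = [fitness_fn(p) for p in parents]
    children = []
    while len(children) < len(pop) - E:            # fill the remaining M - E slots
        a = tournament_select(parents, pfit, k)
        b = tournament_select(parents, pfit, k)
        child = a[:]
        if random.random() < Pc:                   # crossover reshapes trajectories...
            cut = random.randrange(1, len(a))      # ...but does nothing without diversity
            child = a[:cut] + b[cut:]
        child = [g + random.gauss(0, 0.1) if random.random() < Pm else g
                 for g in child]                   # mutation is the sole source of novelty
        children.append(child)
    return elites + children

# Usage: minimize the sphere function with M=30, R=10, E=2, Pc=0.9, Pm=0.1, k=3
random.seed(0)
M = 30
pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(M)]
sphere = lambda x: sum(g * g for g in x)
for _ in range(50):
    pop = ga_step(pop, sphere, R=10, E=2, Pc=0.9, Pm=0.1, k=3)
best = min(sphere(x) for x in pop)
```

Raising k or shrinking R pushes the run toward exploitation (faster convergence, higher risk of satisficing on a local optimum); raising Pm or M-R pushes it back toward exploration, at the cost of slower fine-tuning.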

This video was recorded by Theodore P. Pavlic to support IEE/CSE 598 (Bio-Inspired AI and Optimization) at Arizona State University.
