Enriching Datasets for Better Model Accuracy | DataRobot Generative AI Use Case

  • DataRobot
  • 2023-08-28
  • 878

Video description: Enriching Datasets for Better Model Accuracy | DataRobot Generative AI Use Case

LLMs are ideally suited to enriching datasets. Data augmentation is very time-consuming for humans and difficult to replicate when models are used for inference, so it is often omitted. Using well-designed prompts, LLMs can augment datasets before model training and greatly improve the accuracy of your predictive models.

Learn more at http://datarobot.com/platform/generat...

Content
Today's highlighted solution is Enriching Datasets for Better Model Accuracy. This is widely applicable across many data sets and use cases.

Sometimes the data you have just isn't sufficient to build a world-class predictive model, but you have specific ideas about what additional information you'd like to have. Large language models can help you enrich your data very, very quickly compared to other methods of augmentation.

This diagram shows tabular data being compiled into custom prompts and then using those prompts to ask an LLM for more contextual information. That contextual information is added to the original data set to create an augmented data set. When the augmented data set is used for a Predictive AI model the accuracy improves compared to the original data set without augmentation.

The demonstration that follows shows the process of augmenting training data in a DataRobot hosted Notebook before modeling begins.

We're inside a DataRobot hosted Notebook and this section of the notebook deals with adding additional rich details to the training data. Let me scroll down and show you what we'll be producing and then we'll go over the code that created it.

Here's our data frame and we've added this enhanced details column. I'll mouse over some of the enhanced details. This is all text which is outputted from the large language model which is enhancing the rich description of each piece of equipment, and should be very helpful in predicting its price. So where did all of that text come from?

We start by defining two functions. The first, compile_details_prompt(), creates the prompt from the tabular data in our data set. The second, generate_pipeline(), transmits the prompt to the API endpoint of a large language model and captures the response. In our case we're using GPT-3.5 Turbo, but you could substitute any LLM that's accessible by API.
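As a rough illustration, the two helpers might look like the sketch below. The notebook's actual code is not shown in the video, so the column names, the prompt wording, and the OpenAI-style endpoint and payload shape here are all assumptions.

```python
# Hypothetical sketch of the two helpers; column names, endpoint URL,
# and payload shape are assumptions, not the notebook's actual code.
import requests


def compile_details_prompt(row: dict) -> str:
    """Turn one tabular row into a natural-language prompt for the LLM."""
    return (
        "You are an expert in used heavy-equipment sales.\n"
        f"A {row['year']} {row['make']} {row['model']} is listed for sale.\n"
        "What else do you know about this equipment? "
        "Answer in one short paragraph."
    )


def generate_pipeline(prompt: str, api_url: str, api_key: str) -> str:
    """Send the prompt to an OpenAI-style chat endpoint and return the reply."""
    resp = requests.post(
        api_url,
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```

Keeping prompt compilation separate from the API call makes the prompt easy to inspect and unit-test without spending tokens.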

To use those two functions together, we define a third function called get_details() which we'll apply to every row in the data set.

This cell shows the prompt that is compiled for row 15 in the data set. It primes the large language model with its role, asks what else it knows about this equipment, and provides instructions about how the LLM should respond.

This is the response we receive and store as an additional cell in our data set. In this use case we'll be predicting the price of some used equipment and so these types of rich details will hopefully be very helpful for predicting the price.

Now we're back where we started. We define a new column called enhancedDetails, iterate through all rows in the data set, and add the results to our data frame. That's how I was able to show you this preview at the beginning.
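Putting the pieces together, the augmentation loop might look like the sketch below. A stub stands in for the real generate_pipeline() call so the flow runs without an API key, and the column names and sample rows are invented for illustration.

```python
# Hedged end-to-end sketch of the augmentation loop described above;
# the stub LLM below replaces the real API call so this runs offline.
import pandas as pd


def compile_details_prompt(row: pd.Series) -> str:
    # Build a prompt from the tabular fields of one row (columns assumed).
    return f"What else do you know about a {row['year']} {row['make']} {row['model']}?"


def fake_llm(prompt: str) -> str:
    # Stand-in for generate_pipeline(); a real call would hit an LLM API.
    return f"(LLM response to: {prompt})"


def get_details(row: pd.Series) -> str:
    # Compile the prompt for this row and capture the model's response.
    return fake_llm(compile_details_prompt(row))


df = pd.DataFrame({
    "year": [2014, 2016],
    "make": ["Caterpillar", "Komatsu"],
    "model": ["D6", "PC200"],
    "price": [145000, 98000],
})

# Apply get_details to every row, storing the output in a new column.
df["enhancedDetails"] = df.apply(get_details, axis=1)
```

After this step the augmented frame can be registered and used for modeling, with enhancedDetails available as an extra feature.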

If you now register this data set and create a DataRobot project from this data set, you'll be able to use that enhancedDetails column as part of your model building. This is how data augmentation is able to improve your overall model accuracy.

With a platform approach you can confidently build valuable and safe generative AI applications at enterprise scale. At DataRobot our open extensible platform helps you build quickly and operate securely to bring your Generative AI solutions to life.

Learn more at https://datarobot.com/platform/genera... where you can also request a personalized demonstration or sign up for a free trial.


Stay connected with DataRobot!
► Blog: https://blog.datarobot.com/
► Community: https://community.datarobot.com/
► Twitter:   / datarobot  
► LinkedIn:   / datarobot  
► Facebook:   / datarobotinc  
► Instagram:   / datarobot  
