Complexity of European R&D: Asymmetric Queries w/ SBERT Sentence Transformers (SBERT 12)

Apply asymmetric vs. symmetric queries to pre-trained SBERT models with SentenceTransformers (in this video without fine-tuning!).
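
A minimal sketch of the model-choice idea, assuming the sentence-transformers package; the model names ("all-MiniLM-L6-v2" for symmetric search, "msmarco-distilbert-base-v4" for asymmetric search) and the example texts are illustrative, not prescribed by the video:

from sentence_transformers import SentenceTransformer, util

# Symmetric search: query and documents have similar length and phrasing.
symmetric_model = SentenceTransformer("all-MiniLM-L6-v2")

# Asymmetric search: short query against longer documents (MS MARCO-trained model).
asymmetric_model = SentenceTransformer("msmarco-distilbert-base-v4")

query = "hydrogen storage for aviation"
doc = ("The project develops lightweight cryogenic tanks for liquid hydrogen "
       "storage in short-haul aircraft, funded under a European R&D programme.")

for name, model in [("symmetric", symmetric_model), ("asymmetric", asymmetric_model)]:
    q_emb = model.encode(query, convert_to_tensor=True)
    d_emb = model.encode(doc, convert_to_tensor=True)
    print(name, float(util.cos_sim(q_emb, d_emb)))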


Data: 330 industrial R&D projects starting in 2021 across Europe. Investment: 1 bn EUR.
Task: Discover complex patterns in multidisciplinary R&D, across all of Europe, with sentence embeddings (BERT-base).
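
A sketch of the embedding step, assuming the 330 descriptions sit in a CSV file with a "description" column (the file name rd_projects_2021.csv and the column name are assumptions) and using "all-mpnet-base-v2" as a BERT-base-sized SBERT model:

import pandas as pd
import umap
from sentence_transformers import SentenceTransformer

# Load the project descriptions (file and column names are assumed).
df = pd.read_csv("rd_projects_2021.csv")
texts = df["description"].tolist()

# Encode with a pre-trained SBERT model, no fine-tuning.
model = SentenceTransformer("all-mpnet-base-v2")
embeddings = model.encode(texts, show_progress_bar=True)   # shape: (330, 768)

# 2D projection with UMAP for visual inspection of clusters.
proj = umap.UMAP(n_neighbors=15, min_dist=0.1, metric="cosine").fit_transform(embeddings)
print(proj.shape)   # (330, 2)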


Meta: Extract deep knowledge from 330 project descriptions.
Challenge: Define intelligent human queries.
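
One way to run such a human query asymmetrically against the corpus is util.semantic_search; the query text and the three placeholder descriptions below are made up for illustration:

from sentence_transformers import SentenceTransformer, util

# Placeholder corpus; in the video this would be the 330 project descriptions.
texts = [
    "Development of recyclable cathode materials for EV batteries.",
    "Pilot plant for green hydrogen production from offshore wind.",
    "AI-driven predictive maintenance for industrial robots.",
]

model = SentenceTransformer("msmarco-distilbert-base-v4")   # asymmetric: short query vs. long document
corpus_emb = model.encode(texts, convert_to_tensor=True)

query = "Which projects work on circular economy for battery materials?"
query_emb = model.encode(query, convert_to_tensor=True)

hits = util.semantic_search(query_emb, corpus_emb, top_k=3)[0]
for hit in hits:
    print(round(hit["score"], 3), texts[hit["corpus_id"]])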

#sbert
#tsdae
#nlproc
#datascience
#nlptechniques
#clustering
#semantic
#bert
#3danimation
#3dvisualization
#topologicalspace
#deeplearning
#machinelearningwithpython
#pytorch
#sentence
#embedding
#complex
#umap
#insight
#algebraic_topology
#code_your_own_AI
#SentenceTransformers

Parallel semantic search algorithm
Pre-trained BERT models for asymmetric search (a short sketch follows below)
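
A hedged sketch of the parallel aspect: util.semantic_search accepts a whole batch of query embeddings and scores them against the corpus in chunked matrix multiplications (query_chunk_size / corpus_chunk_size); the corpus and query strings below are illustrative stand-ins:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("msmarco-distilbert-base-v4")

corpus = [  # stand-in for the 330 project descriptions
    "Pilot line for solid-state battery cell manufacturing.",
    "Green hydrogen production coupled with offshore wind farms.",
    "Recycling process for rare-earth magnets from wind turbines.",
]
queries = ["hydrogen for aviation", "recycling of rare-earth magnets"]

corpus_emb = model.encode(corpus, convert_to_tensor=True)
query_emb = model.encode(queries, convert_to_tensor=True)   # all queries encoded as one batch

# One call scores every query against every document, chunked for memory efficiency.
results = util.semantic_search(query_emb, corpus_emb, top_k=2,
                               query_chunk_size=100, corpus_chunk_size=10000)

for q, hits in zip(queries, results):
    print(q, "->", [(hit["corpus_id"], round(hit["score"], 3)) for hit in hits])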
