ViewTube

1,490 results

BrainDrain: How positional encoding works in transformers? (5:36, 36,494 views, 2 years ago)

Today we will discuss positional encoding in Transformers. People quite often mention this picture having trouble understanding ...

AI Coffee Break with Letitia: Positional embeddings in transformers EXPLAINED | Demystifying positional encodings. (9:40, 87,536 views, 4 years ago)

What are positional embeddings and why do transformers need positional encodings? In this video, we explain why Attention is ...

Jia-Bin Huang: How Rotary Position Embedding Supercharges Modern LLMs [RoPE] (13:39, 21,327 views, 1 year ago)

What makes Rotary Positional Encodings useful? https://arxiv.org/abs/2410.06205 - [Controlled study] A Controlled Study on Long ...

Stanford Online: Stanford XCS224U: NLU | Contextual Word Representations, Part 3: Positional Encoding | Spring 2023 (13:02, 14,034 views, 2 years ago)

For more information about Stanford's Artificial Intelligence programs visit: https://stanford.io/ai This lecture is from the Stanford ...

CodeEmporium: Positional Encoding in Transformer Neural Networks Explained (11:54, 54,851 views, 2 years ago)

Positional Encoding! Let's dig into it ...

Serrano.Academy: How do Transformer Models keep track of the order of words? Positional Encoding (9:50, 13,071 views, 1 year ago)

Transformer models can generate language really well, but how do they do it? A very important step of the pipeline is the ...

DeepLearning Hero: RoPE (Rotary positional embeddings) explained: The positional workhorse of modern LLMs (14:06, 49,366 views, 2 years ago)

Unlike sinusoidal embeddings, RoPE is well behaved and more resilient to predictions exceeding the training sequence length.

Efficient NLP: Rotary Positional Embeddings: Combining Absolute and Relative (11:17, 67,873 views, 2 years ago)

... absolute and relative positional encodings. 0:00 - Introduction 1:22 - Absolute positional embeddings 3:19 - Relative positional ...
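The entry above describes rotary embeddings as combining absolute and relative positioning: each query and key vector is rotated by an angle proportional to its absolute position, so their dot product ends up depending only on the relative offset between the two positions. A minimal NumPy sketch of that property, assuming a toy 2-dimensional head and a single arbitrary frequency `theta` (variable names are illustrative, not taken from any of the videos):

```python
import numpy as np

def rotate(vec, pos, theta=0.1):
    """Rotate a 2-D vector by pos * theta radians (one RoPE frequency pair)."""
    angle = pos * theta
    rot = np.array([[np.cos(angle), -np.sin(angle)],
                    [np.sin(angle),  np.cos(angle)]])
    return rot @ vec

q = np.array([1.0, 0.0])  # toy query
k = np.array([0.0, 1.0])  # toy key

# The attention score depends only on the offset between positions,
# not on the absolute positions themselves:
s1 = rotate(q, 5) @ rotate(k, 3)    # positions 5 and 3 (offset 2)
s2 = rotate(q, 12) @ rotate(k, 10)  # positions 12 and 10 (offset 2)
assert np.isclose(s1, s2)
```

Because R(m·θ)ᵀ R(n·θ) = R((n−m)·θ), the score is a function of n−m alone, which is the "absolute mechanism, relative effect" combination the title refers to.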

Pramod Goyal: Positional Encoding | How LLMs understand structure (9:10, 2,339 views, 11 months ago)

In this video, I have tried to take a comprehensive look at Positional Encoding, one of the fundamental requirements of ...

Herman Kamper: Positional encodings in transformers (NLP817 11.5) (19:29, 8,004 views, 2 years ago)

Lecture notes: https://www.kamperh.com/nlp817/notes/11_transformers_notes.pdf Full playlist: ...

TechieTalksAI: Easy LLM Part-3: Secrets of Transformer Embeddings & Positional Encoding! (11:39, 4,064 views, 6 months ago)

StatQuest with Josh Starmer: Encoder-Only Transformers (like BERT) for RAG, Clearly Explained!!! (18:52, 77,933 views, 1 year ago)

... 11:15 Positional Encoding 12:39 Attention 15:17 Applications of Encoder-Only Transformers 16:19 RAG (Retrieval-Augmented ...

TechieTalksAI: The Secret Behind LLMs: Positional Encoding & RoPE Finally EXPLAINED (Mind-Blowing Visual Demo!) (13:30, 1,499 views, 3 weeks ago)

Ever wondered how LLMs magically understand the order of words, even though Transformers do not read left-to-right?

Mr. Gyula Rabai: Large Language Models (LLM) - Part 5/16 - RoPE (Positional Encoding) in AI (4:17, 243 views, 11 months ago)

In this video, Gyula Rabai Jr. explains Rotary Positional Embedding (RoPE), a technique used by large language models (LLMs) ...

Machine Learning with PyTorch: Transformer Positional Embeddings With A Numerical Example (6:21, 26,337 views, 4 years ago)

Unlike in RNNs, inputs into a transformer need to be encoded with positions. In this video, I showed how positional encodings are ...
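Several of the videos above work through the original sinusoidal encoding from "Attention Is All You Need", where even dimensions get sin(pos / 10000^(2i/d)) and odd dimensions get the matching cosine. A small NumPy version of that computation with toy dimensions (the function name and sizes are illustrative only):

```python
import numpy as np

def sinusoidal_encoding(num_positions, d_model):
    """Sinusoidal positional encoding table, shape (num_positions, d_model)."""
    pos = np.arange(num_positions)[:, None]   # column of positions
    i = np.arange(0, d_model, 2)[None, :]     # even dimension indices
    angles = pos / (10000 ** (i / d_model))   # one frequency per index pair
    pe = np.zeros((num_positions, d_model))
    pe[:, 0::2] = np.sin(angles)              # even dims: sine
    pe[:, 1::2] = np.cos(angles)              # odd dims: cosine
    return pe

pe = sinusoidal_encoding(4, 6)
# Position 0 encodes as [0, 1, 0, 1, 0, 1], since sin(0)=0 and cos(0)=1.
```

Each dimension pair oscillates at a different wavelength, so nearby positions get similar vectors while distant ones diverge, which is the "numerical example" these videos walk through by hand.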

Skill Advancement: Positional Encoding in Transformers (7:15, 70 views, 11 days ago)

This video offers a comprehensive deep dive into the concept of Positional Encoding (PE) within the Transformer Architecture.

Alkademy Learning: Positional Encoding in Transformers Simplified (15:48, 772 views, 8 months ago)

In this tutorial, you will learn about the concept of positional encoding and how to compute it for the attention mechanism in the ...

Under The Hood: What Are Word Embeddings? (19:33, 55,046 views, 10 months ago)

... Architecture 17:16 - Positional Encoding 18:46 - Outro Efficient Estimation of Word Representations in Vector Space: Word2Vec ...

AI Coffee Break with Letitia: Adding vs. concatenating positional embeddings & Learned positional encodings (9:21, 24,980 views, 4 years ago)

When to add and when to concatenate positional embeddings? What are arguments for learning positional encodings? When to ...
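The add-vs-concatenate distinction in that title comes down to shape arithmetic: adding keeps the model width fixed and mixes position into the same channels as token content, while concatenating reserves separate channels for position at the cost of a wider model. A shape-only sketch, assuming toy sizes chosen purely for illustration:

```python
import numpy as np

seq_len, d_model, d_pos = 8, 16, 4
tokens = np.random.randn(seq_len, d_model)
pos_add = np.random.randn(seq_len, d_model)  # same width as tokens (learned or sinusoidal)
pos_cat = np.random.randn(seq_len, d_pos)    # separate positional channels

added = tokens + pos_add                                    # shape stays (8, 16)
concatenated = np.concatenate([tokens, pos_cat], axis=-1)   # grows to (8, 20)
```

With addition the downstream layers are unchanged; with concatenation every layer that consumes the embeddings must accept the larger width, which is one of the trade-offs the video discusses.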

Rui Xu: Positional Encoding as Spatial Inductive Bias in GANs (CVPR'2021 presentation video) (4:58, 286 views, 4 years ago)

In this paper, we reveal how SinGAN learns the global structure with a limited receptive field. It is zero padding that serves as an ...