ViewTube

783 results

INI Seminar Room 1
STSW02 | Johannes Schmidt-hieber | Statistical theory for deep neural networks with ReLU activation
STSW02 | Prof. Johannes Schmidt-hieber | Statistical theory for deep neural networks with ReLU activation function Speaker: ...
1:00:34 · 7 views · 3 weeks ago

Sumantra Codes
Activation Functions Explained: Sigmoid, ReLU, GELU & The Vanishing Gradient | Deep Learning
If we stack thousands of layers of neurons without activation functions, what do we get? We get a single linear regression model.
11:47 · 1,138 views · 3 weeks ago
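
The snippet above claims that stacking layers without activation functions collapses to a single linear model. That can be checked directly; a minimal NumPy sketch with arbitrary illustrative weights (not from any of the videos listed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation between them: y = W2 @ (W1 @ x)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))

# Their composition is exactly one linear map, W2 @ W1.
W_combined = W2 @ W1

x = rng.normal(size=3)
assert np.allclose(W2 @ (W1 @ x), W_combined @ x)

# Inserting ReLU between the layers breaks this collapse:
relu = lambda z: np.maximum(z, 0.0)
y_nonlinear = W2 @ relu(W1 @ x)
```

However many linear layers are stacked, the same collapse applies, which is why a nonlinearity between layers is what gives depth its expressive power.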

ByteQuest
AlexNet Explained from Scratch | Architecture, Layers, ReLU, Dropout
In this video, we break down the AlexNet convolutional neural network architecture layer by layer. We cover convolutions, pooling, ...
4:23 · 276 views · 11 days ago

AI Academy
Activation Functions Explained | ReLU, Sigmoid, Softmax (Simple & Visual for Beginners)
Activation Functions are the heart of Neural Networks. Without them, AI models would be dumb, linear, and unable to learn ...
2:13 · 7 views · 1 month ago

Machine Learning with PyTorch
ReLU Bends the Decision Boundary (Visual Explanation)
ReLU looks simple, but it changes geometry. In this video, I show how a tiny 2D ReLU network turns a straight decision line into a ...
8:52 · 42 views · 3 weeks ago
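
The "bending" described in the snippet above comes down to ReLU being piecewise linear: each unit contributes a kink. A minimal 1D sketch with made-up weights (not the network from the video):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# A tiny "network" of two ReLU units combined linearly:
#   f(x) = relu(x) - 2 * relu(x - 1)
# Each unit adds one kink, so f is piecewise linear:
# slope 0 for x < 0, slope 1 on [0, 1], slope -1 for x > 1.
def f(x):
    return relu(x) - 2.0 * relu(x - 1.0)

print(f(-1.0), f(0.5), f(2.0))  # f(-1) = 0, f(0.5) = 0.5, f(2) = 0
```

In 2D the same mechanism folds a straight decision line into a polyline; with more units the folds approximate smooth curves.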

amos
Biological intuition behind ReLU activation function.
Hello everyone, this is my first video about artificial intelligence. I'm going to show you the biological support behind the ReLU ...
1:31 · 7 views · 3 weeks ago

SystemDR - Scalable System Design
Unlock CNN Power: The CRUCIAL Role of ReLU in Deep Learning EXPLAINED #aiml #cnn #deeplearning
ReLU (Rectified Linear Unit) is fundamental in Convolutional Neural Networks (CNNs) because it introduces crucial non-linearity, ...
7:57 · 24 views · 4 weeks ago

Relu
Relu AI dental design service Bleaching Trays
In this video, we showcase how bleaching trays can be designed using Relu's AI-powered dental design service Bleaching Trays.
0:44 · 42 views · 3 weeks ago

IgnoVex
Activation Functions in Neural Networks | Artificial Intelligence | Machine Learning | IgnoVex
Activation functions are what give neural networks the power to learn complex, real-world patterns. In this video, we break down ...
8:17 · 53 views · 13 days ago

AI and Machine Learning Explained
What Is The Role Of ReLU In CNN Architectures?
Unravel the critical function of ReLU (Rectified Linear Unit) in Convolutional Neural Networks. This video breaks down how this ...
3:20 · 0 views · 3 weeks ago

AI and Machine Learning Explained
What Is The Difference Between Sigmoid And ReLU In CNNs?
Ever wondered about the secret sauce behind powerful neural networks? This video breaks down the crucial differences between ...
4:30 · 0 views · 3 weeks ago
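
The contrast the snippet above alludes to shows up most clearly in the gradients: sigmoid saturates for large inputs, while ReLU passes gradient through unchanged on its active side. A small NumPy comparison with illustrative values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Sigmoid's derivative s(z) * (1 - s(z)) peaks at 0.25 (at z = 0)
# and decays toward 0 for large |z| -- the vanishing-gradient regime.
# ReLU's derivative is exactly 1 wherever z > 0.
z = np.array([0.0, 5.0, 10.0])
sig_grad = sigmoid(z) * (1.0 - sigmoid(z))
relu_grad = (z > 0).astype(float)

print(sig_grad)   # shrinks rapidly as z grows
print(relu_grad)  # stays at 1 on the active side
```

This is the usual argument for preferring ReLU in deep CNNs: gradients can propagate through many layers without being multiplied by factors far below 1.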

The Clue Matrix
How AI Handles Interference: The Power of ReLU in Neural Networks
Original Video: https://www.youtube.com/watch?v=pzOEx4x1EYw&list=PL0Vz403YCedh7XaXqJSS90ezoHAtRe_HU&index=3.
3:22 · 7 views · 4 weeks ago

INI Seminar Room 1
Prof. Rebecca Willett | The Role of Linear Layers in Nonlinear Interpolating Networks
Title: The Role of Linear Layers in Nonlinear Interpolating Networks Speaker: Professor Rebecca Willett (University of Chicago) ...
55:57 · 0 views · 3 weeks ago

INI Seminar Room 1
MDLW01 | Prof. Johannes Schmidt-hieber | Convergence rates of deep ReLU networks
MDLW01 | Prof. Johannes Schmidt-hieber | Convergence rates of deep ReLU networks for multiclass classification Speaker: ...
28:01 · 8 views · 3 weeks ago

InsidiousScare
Deep Learning Intro: The Perceptron, ReLU Activation & The XOR Problem
Welcome to Phase 2: Neural Networks. In this video, we move from Standard Statistics to Deep Learning. I explain how a ...
4:03 · 8 views · 7 days ago

Ashutosh Maheshwari
Rectified Linear Unit (ReLU) activation function with CUDA | GPU Programming | LeetGPU
Writing a CUDA kernel requires a shift in mental model. Instead of one fast processor, you manage thousands of tiny threads.
6:11 · 240 views · 4 weeks ago
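
The mental model in the snippet above, one thread per array element, maps naturally onto elementwise ReLU. A rough plain-Python sketch of that thread-per-element pattern (not actual CUDA, and not the kernel from the video; the loop stands in for the GPU's thread grid):

```python
import numpy as np

def relu_kernel(inp, out, i):
    # Body of a hypothetical per-thread kernel: thread i reads
    # one element and writes one element, with a bounds guard
    # as in a real CUDA kernel (grids often overshoot the array).
    if i < inp.size:
        out[i] = max(inp[i], 0.0)

x = np.array([-2.0, -0.5, 0.0, 3.0])
y = np.empty_like(x)
for i in range(x.size):  # on a GPU, these iterations run in parallel
    relu_kernel(x, y, i)
print(y)  # [0. 0. 0. 3.]
```

Because each thread touches a distinct index, there are no data races, which is what makes elementwise activations a standard first CUDA exercise.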

Machine & Deep Learning: From Learning to Hiring
Lecture 10: Learn Deep Learning: ReLU and Leaky ReLU Functions
This lecture introduces the ReLU (Rectified Linear Unit) and Leaky ReLU activation functions, which are among the most widely ...
6:20 · 7 views · 3 days ago
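
The two functions named in the lecture above differ only in how they treat negative inputs. A minimal NumPy sketch (the slope alpha=0.01 is a common convention, assumed here, not taken from the lecture):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def leaky_relu(z, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for negative inputs,
    # so units there still receive gradient ("dying ReLU" fix).
    return np.where(z > 0, z, alpha * z)

z = np.array([-3.0, -1.0, 0.0, 2.0])
print(relu(z))        # [0. 0. 0. 2.]
print(leaky_relu(z))  # roughly [-0.03 -0.01 0. 2.]
```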

Perceptron
Activation Functions In Neural Network | Types Of Activation Functions #learning #ai #education
What is an activation function and why is it so important in neural networks? In this video, you'll understand the concept of ...
4:19 · 68 views · 7 days ago

Big Data Landscape
Neural Networks for Beginners - MLP -Activation Functions
... might use a variant like leaky relu that preserves some negative values but standard relu is the workhorse The sigmoid function ...
25:38 · 27 views · 4 weeks ago

AI Polygot
A 5-minute Keras Recipe
This is a comprehensive overview of neural network history and modern implementation using the Keras library. One section traces ...
7:15 · 5 views · 3 weeks ago