ViewTube

396 results

CodeEmporium · Activation Functions - EXPLAINED! (10:05)
... new activation function (swish): https://arxiv.org/pdf/1710.05941v1.pdf [6] Used an Image of activation functions from this Pawan ...
151,230 views · 5 years ago

CampusX · Activation Functions in Deep Learning | Sigmoid, Tanh and Relu Activation Function (44:52)
In artificial neural networks, each neuron forms a weighted sum of its inputs and passes the resulting scalar value through a ...
131,339 views · 3 years ago
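For quick reference alongside this video, here is a minimal pure-Python sketch of the three activations it covers. This is an illustrative implementation, not code from the video:

```python
import math

def sigmoid(x):
    # Squashes any real input into (0, 1); not zero-centered.
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Zero-centered squashing into (-1, 1).
    return math.tanh(x)

def relu(x):
    # Identity for positive inputs, zero otherwise.
    return max(0.0, x)

for x in (-2.0, 0.0, 2.0):
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.3f}  tanh={tanh(x):+.3f}  relu={relu(x):.1f}")
```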

CampusX · Relu Variants Explained | Leaky Relu | Parametric Relu | Elu | Selu | Activation Functions Part 2 (33:25)
This is part 2 of the Activation Function Series. In this video, we will discuss the dying relu problem and then learn about the ...
68,494 views · 3 years ago
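The variants this part discusses can be sketched in a few lines. The SELU constants below are the standard published values; parametric ReLU is simply `leaky_relu` with `alpha` treated as a learnable parameter rather than a fixed constant:

```python
import math

def leaky_relu(x, alpha=0.01):
    # Small fixed negative slope keeps gradients alive for x < 0
    # (the usual fix for the "dying ReLU" problem).
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # Smooth exponential saturation toward -alpha for x < 0.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU; these constants give the self-normalizing property.
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

print(leaky_relu(-5.0))  # -0.05
print(elu(-5.0))         # ~ -0.993
print(selu(5.0))         # ~ 5.254
```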

CIS 522 - Deep Learning · Implementing ReLU (0:54)
This video was recorded as part of CIS 522 - Deep Learning at the University of Pennsylvania. The course material, including the ...
454 views · 4 years ago

Brandon Rohrer · What do neural networks learn? (27:24)
Part of the End-to-End Machine Learning Course 193, How Neural Networks Work at http://e2eml.school/193 Blog post: ...
32,497 views · 6 years ago

Hugo Mougard · Visualization of the universal approximation theorem (0:31)
Illustration of how a neural net with one hidden layer can approximate a function. Wikipedia: ...
28,279 views · 4 years ago

Developers Hutt · Softmax Activation Function || Softmax Function || Quick Explained || Developers Hutt (2:18)
Here is another one in the Quick Explained series. The softmax function is widely used to make multi-class classifiers. In this video ...
110,335 views · 5 years ago
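The softmax function from this video maps a vector of logits to a probability distribution. A minimal sketch, including the usual max-subtraction trick (which leaves the result unchanged but prevents `exp` overflow):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability, exponentiate,
    # then normalize so the outputs sum to 1.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)       # ~[0.659, 0.242, 0.099]
print(sum(probs))  # 1.0
```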

The Debug Zone · Approximating Sine Function with Neural Networks and ReLU Activation (2:22)
In this video, we delve into the fascinating intersection of mathematics and artificial intelligence by exploring how neural networks ...
29 views · 7 months ago
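The idea in this video, a one-hidden-layer ReLU network approximating sine, can be demonstrated without any training by building the weights directly from a piecewise-linear interpolant: each hidden unit contributes one change of slope at a knot. The knot count `n` below is an arbitrary choice for illustration:

```python
import math

def relu(x):
    return max(0.0, x)

# Knots where the piecewise-linear approximation bends.
n = 20
knots = [2 * math.pi * i / n for i in range(n + 1)]
targets = [math.sin(x) for x in knots]

# One hidden ReLU unit per knot: its output weight is the change in
# slope there, so the sum reproduces the piecewise-linear interpolant.
slopes = [(targets[i + 1] - targets[i]) / (knots[i + 1] - knots[i]) for i in range(n)]
weights = [slopes[0]] + [slopes[i] - slopes[i - 1] for i in range(1, n)]

def net(x):
    # Hand-constructed single-hidden-layer ReLU network.
    return targets[0] + sum(w * relu(x - b) for w, b in zip(weights, knots))

max_err = max(abs(net(t / 100) - math.sin(t / 100)) for t in range(0, 629))
print(f"max |net - sin| on [0, 2*pi]: {max_err:.4f}")
```

More knots shrink the error roughly quadratically, which is the intuition behind the universal approximation result in the Hugo Mougard entry above.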

Vuk Rosić · Neural Networks From Scratch - Step by Step - AI Research (18:01)
... a Layer 12:30 - Layer Forward Pass 14:13 - More Activation Functions 16:11 - Activation Function Examples 17:28 - Next Steps.
365 views · 1 month ago

Townview Machine Learning and AI · Training a Neural Network (Meeting 18) (54:40)
Broadcasted live on Twitch -- Watch live at https://www.twitch.tv/townviewml​ This was our eighteenth meeting. This week, we ...
13 views · 4 years ago

Arthur Kho · Neural Network Backpropagation Step 1: Forward Pass (0:53)
ReLU activation function https://compendium.xyz/2021/05/16/neural-network-backpropagation.html.
221 views · 4 years ago

Data Science Diaries · Activation Functions | Deep Learning Tutorial 3 | 6 Most Commonly Used Activation Functions (10:52)
Linear Activation Function 3. Sigmoid Activation Function 4. Tanh Activation Function 5. ReLU Activation Function 6. Leaky ReLU ...
173 views · 3 years ago

BohrAI · AI Research: Why He Initialization Works Best for ReLU – Explained (11:04)
In this video, I break down why He initialization is the best choice when you're using ReLU activation. I explain what makes He ...
66 views · 1 month ago
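The claim in this video can be checked empirically: He initialization draws weights with standard deviation sqrt(2 / n_in), where the factor of 2 compensates for ReLU zeroing out roughly half of each layer's pre-activations, so activation magnitudes stay roughly constant through a deep stack. A rough pure-Python sketch; the layer width and depth are arbitrary choices:

```python
import math
import random

random.seed(0)

def he_layer(n_in, n_out):
    # He init: weights ~ N(0, 2 / n_in).
    std = math.sqrt(2.0 / n_in)
    return [[random.gauss(0.0, std) for _ in range(n_in)] for _ in range(n_out)]

def forward(x, layers):
    # Plain ReLU MLP forward pass (no biases, for simplicity).
    for w in layers:
        x = [max(0.0, sum(wi * xi for wi, xi in zip(row, x))) for row in w]
    return x

n = 256
layers = [he_layer(n, n) for _ in range(10)]
x = [random.gauss(0.0, 1.0) for _ in range(n)]
out = forward(x, layers)
var = sum(v * v for v in out) / n
print(f"activation second moment after 10 ReLU layers: {var:.3f}")
```

With sigmoid-era initializations (e.g. std 1/sqrt(n_in)) the same experiment shows the second moment shrinking by roughly half per layer.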

MLQs Café · Today’s Question: What is Leaky ReLU, how it differ from traditional ReLU activation function? (0:44)
Learn the concept clearly in under 1 minutes, explained step-by-step with examples! About MLQ Café Welcome to MLQ Café ...
162 views · 8 months ago

NFDI4Earth · Neural Networks Explained: Theory, Activation Functions, and Learning Process (25:45)
... linear regression, activation functions, and the importance of biases. The video also covers backpropagation, gradient descent ...
55 views · 10 months ago

CIS 522 - Deep Learning · How ReLU Networks Work: Locally Linear Functions (1:16)
This video was recorded as part of CIS 522 - Deep Learning at the University of Pennsylvania. The course material, including the ...
421 views · 4 years ago
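The "locally linear" property this video describes is easy to verify numerically: as long as a perturbation stays inside one linear region (no hidden unit flips sign), finite-difference slopes of a ReLU network do not depend on the step size. A small sketch with random weights:

```python
import random

random.seed(1)

def relu_net(x, layers):
    # Plain ReLU MLP forward pass (no biases).
    for w in layers:
        x = [max(0.0, sum(wi * xi for wi, xi in zip(row, x))) for row in w]
    return x

n = 8
layers = [[[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)] for _ in range(3)]
x = [random.uniform(-1, 1) for _ in range(n)]
d = [random.uniform(-1, 1) for _ in range(n)]

# Inside one linear region, f(x + t*d) is exactly linear in t, so the
# finite-difference slope is the same for different small step sizes.
def slope(t):
    fx = relu_net(x, layers)
    fxt = relu_net([xi + t * di for xi, di in zip(x, d)], layers)
    return [(a - b) / t for a, b in zip(fxt, fx)]

s1 = slope(1e-6)
s2 = slope(1e-7)
print(max(abs(a - b) for a, b in zip(s1, s2)))  # ~0: slopes agree
```

A smooth activation like tanh would show slopes that drift with the step size, since the function is only approximately linear near any point.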

IBM Research · Recent Progress in Adversarial Robustness of AI Models: Attacks, Defenses, and Certification (59:43)
By: Pin-Yu Chen, IBM Research April 22, 2019 NeurIPS Paper : NeurIPS 2018 ...
5,928 views · 6 years ago

Roel Van de Paar · Computer Science: How is ReLU used in machine learning functions? (1:30)
Computer Science: How is ReLU used in machine learning functions? Helpful? Please support me on Patreon: ...
1 view · 4 years ago

ICTP Quantitative Life Sciences · Implicit Regularization of Gradient Descent for Wide Two-layer Relu Neural Networks (18:10)
Lénaïc CHIZAT (University of Paris-Saclay, France) Youth in High-Dimensions | (smr 3602) 2021_06_15-18_20-smr3602.
118 views · 4 years ago

Roel Van de Paar · Should activation function be monotonic in neural networks? (1:22)
Should activation function be monotonic in neural networks? Helpful? Please support me on Patreon: ...
20 views · 4 years ago
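One concrete answer to the question in this video: activation functions need not be monotonic. Swish (x · sigmoid(x)), linked from the first result above, dips below zero before rising. A quick check:

```python
import math

def swish(x):
    # swish(x) = x * sigmoid(x); it decreases then increases on the
    # negative axis, so it is not monotonic, unlike ReLU or sigmoid.
    return x / (1.0 + math.exp(-x))

xs = [i / 100 for i in range(-500, 1)]
ys = [swish(x) for x in xs]
print(min(ys))           # minimum near x ~ -1.28
print(ys == sorted(ys))  # False: not monotonically increasing on [-5, 0]
```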