ViewTube

626 results

CodeEmporium
Activation Functions - EXPLAINED!

We start with the whats/whys/hows. Then delve into details (math) with examples. Follow me on M E D I U M: ...

10:05

151,087 views

5 years ago

CampusX
Relu Variants Explained | Leaky Relu | Parametric Relu | Elu | Selu | Activation Functions Part 2

This is part 2 of the Activation Function Series. In this video, we will discuss the dying relu problem and then learn about the ...

33:25

68,211 views

3 years ago

Michael Nielsen
The Universal Approximation Theorem for neural networks

For an introduction to artificial neural networks, see Chapter 1 of my free online book: ...

6:25

83,398 views

8 years ago

Developers Hutt
Softmax Activation Function || Softmax Function || Quick Explained || Developers Hutt

Here is another one in the Quick Explained series. The softmax function is widely used to make multi-class classifiers. In this video ...

2:18

110,248 views

5 years ago
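
The softmax function this entry describes can be sketched in a few lines (an illustrative plain-Python sketch, not code from the video):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    # Each output is a probability; the list sums to 1.
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
```

Because the outputs form a probability distribution over classes, softmax is the usual final layer of a multi-class classifier, as the snippet above notes.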

CIS 522 - Deep Learning
Implementing ReLU

This video was recorded as part of CIS 522 - Deep Learning at the University of Pennsylvania. The course material, including the ...

0:54

454 views

4 years ago
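
A minimal sketch of the ReLU implementation this short clip covers (assumed plain Python, not the course's actual code):

```python
def relu(x):
    # ReLU: identity for positive inputs, zero for negatives.
    return max(0.0, x)

values = [relu(x) for x in [-2.0, -0.5, 0.0, 1.5]]
```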

CIS 522 - Deep Learning
How ReLU Networks Work: Locally Linear Functions

This video was recorded as part of CIS 522 - Deep Learning at the University of Pennsylvania. The course material, including the ...

1:16

421 views

4 years ago
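
The "locally linear" idea in this entry can be illustrated with a tiny fixed-weight ReLU network (illustrative weights, not taken from the lecture):

```python
def relu(x):
    return max(0.0, x)

def tiny_net(x):
    # Two hidden ReLU units with fixed illustrative weights.
    # On any input region where neither unit switches on or off,
    # the whole network reduces to a single affine function of x.
    h1 = relu(1.0 * x - 1.0)   # active for x > 1
    h2 = relu(-1.0 * x + 2.0)  # active for x < 2
    return 2.0 * h1 + 0.5 * h2

# On (1, 2) both units are active, so f(x) = 2(x-1) + 0.5(2-x) = 1.5x - 1.
```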

Vuk Rosić
Neural Networks From Scratch - Step by Step - AI Research

github - https://github.com/vukrosic/become-elite-ai-researcher Become AI Researcher & Train LLM From Scratch ...

18:01

365 views

1 month ago

Vuk Rosić
ReLU vs GELU vs SiLU for LLM Training (In FeedForward Layer)

GitHub - https://github.com/vukrosic/LLM-activation-fn-comparison/ Code ...

1:50

410 views

5 months ago

BohrAI
AI Research: Why He Initialization Works Best for ReLU – Explained

In this video, I break down why He initialization is the best choice when you're using ReLU activation. I explain what makes He ...

11:04

64 views

1 month ago
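
He initialization as described in this entry can be sketched as follows (a plain-Python illustration; the video's own derivation is not reproduced here):

```python
import math
import random

def he_init(fan_in, fan_out, rng=None):
    # He initialization: zero-mean Gaussian with std = sqrt(2 / fan_in).
    # The factor of 2 compensates for ReLU zeroing roughly half the
    # activations, keeping output variance roughly constant with depth.
    rng = rng or random.Random(0)
    std = math.sqrt(2.0 / fan_in)
    return [[rng.gauss(0.0, std) for _ in range(fan_out)]
            for _ in range(fan_in)]

w = he_init(512, 256)
```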

Roel Van de Paar
Computer Science: How is ReLU used in machine learning functions?

Computer Science: How is ReLU used in machine learning functions? Helpful? Please support me on Patreon: ...

1:30

1 view

4 years ago

CampusX
Activation Functions in Deep Learning | Sigmoid, Tanh and Relu Activation Function

In artificial neural networks, each neuron forms a weighted sum of its inputs and passes the resulting scalar value through a ...

44:52

130,864 views

3 years ago
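
The weighted-sum-through-activation picture in this description can be sketched as (illustrative values, not from the video):

```python
import math

def neuron(inputs, weights, bias, activation):
    # Weighted sum of inputs plus a bias, passed through a scalar nonlinearity.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    return max(0.0, z)

y_sig = neuron([1.0, 2.0], [0.5, -0.25], 0.1, sigmoid)  # z = 0.1
y_relu = neuron([1.0, 2.0], [0.5, -0.25], 0.1, relu)
```

Swapping the `activation` argument is all it takes to compare sigmoid, tanh, and ReLU on the same pre-activation value.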

The Debug Zone
Approximating Sine Function with Neural Networks and ReLU Activation

In this video, we delve into the fascinating intersection of mathematics and artificial intelligence by exploring how neural networks ...

2:22

29 views

7 months ago

DSGT Bootcamp
[5.3] Introduction to Deep Learning Alternatives to ReLU

Data Science at Georgia Tech Udemy Bootcamp Introduction to Deep Learning (5th module) By Faris Durrani Copyright © Data ...

1:23

13 views

3 years ago

Jeremy Howard
Lesson 17: Deep Learning Foundations to Stable Diffusion

All lesson resources are available at http://course.fast.ai. In this lesson, we discuss the importance of weight initialization in neural ...

1:56:33

11,834 views

2 years ago

Carsten Wulff
Lecture 4 - Analog Neural Networks and Translinear Circuits

Lecture Notes: https://analogicus.com/aic2025/2025/02/06/Lecture-4-Analog-Neural-Networks.html Demo: ...

34:55

3,118 views

11 months ago

Roel Van de Paar
Should activation function be monotonic in neural networks?

Should activation function be monotonic in neural networks? Helpful? Please support me on Patreon: ...

1:22

20 views

4 years ago

Arthur Kho
Neural Network Backpropagation Step 1:  Forward Pass

ReLU activation function https://compendium.xyz/2021/05/16/neural-network-backpropagation.html.

0:53

220 views

4 years ago

Vuk Rosić
Functions In AI Research - Become AI Researcher Step by Step

... Exponential Functions 11:52 - Trigonometric Functions 12:51 - Sigmoid Function 15:04 - ReLU Function 16:56 - Tanh Function ...

17:38

340 views

3 months ago

Rolando Coto
Basics of Neural Networks (Accelerated Computational Linguistics 2020.W06.04)

Accelerated Computational Linguistics Dartmouth College LING48/COSC72 Spring 2020. Week 06, Video 04: Basics of Neural ...

19:54

257 views

5 years ago

MLQs Café
Today’s Question: What is Leaky ReLU, and how does it differ from the traditional ReLU activation function?

Learn the concept clearly in under 1 minute, explained step-by-step with examples! About MLQ Café Welcome to MLQ Café ...

0:44

162 views

8 months ago
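
Leaky ReLU, as contrasted with plain ReLU in this entry, in a minimal sketch (alpha=0.01 is a common default, assumed here):

```python
def leaky_relu(x, alpha=0.01):
    # Unlike ReLU, negative inputs keep a small slope alpha instead of
    # being zeroed, so units cannot get stuck with a zero gradient
    # (the "dying ReLU" problem mentioned in the entries above).
    return x if x > 0 else alpha * x

out = [leaky_relu(x) for x in [-3.0, 0.0, 2.0]]
```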