ViewTube

7,630,711 results

Related queries: claude shannon · information theory entropy · information theory shannon · information theory digital communication · source coding theorem

Computerphile
Why Information Theory is Important - Computerphile (12:33)
Zip files & error correction depend on information theory; Tim Muller takes us through how Claude Shannon's early Computer ...
180,023 views · 3 years ago

Khan Academy Labs
What is information theory? | Journey into information theory | Computer Science | Khan Academy (3:26)
A broad introduction to this field of study. Watch the next lesson: ...
308,734 views · 11 years ago

Oxford Mathematics
Information Theory, Lecture 1: Defining Entropy and Information - Oxford Mathematics 3rd Yr Lecture (53:46)
In this lecture from Sam Cohen's 3rd year 'Information Theory' course, one of eight we are showing, Sam asks: how do we ...
146,277 views · 10 months ago

Intelligent Systems Lab
Information Theory Basics (16:22)
The basics of information theory: information, entropy, KL divergence, mutual information. Princeton 302, Lecture 20.
97,701 views · 5 years ago

3Blue1Brown
Solving Wordle using information theory (30:38)
An excuse to teach a lesson on information theory and entropy. These lessons are funded by viewers: ...
11,483,574 views · 3 years ago

Khan Academy Labs
Information entropy | Journey into information theory | Computer Science | Khan Academy (7:05)
Finally we arrive at our quantitative measure of entropy. Watch the next lesson: ...
352,318 views · 11 years ago

People also watched

StatQuest with Josh Starmer
Entropy (for data science) Clearly Explained!!! (16:35)
Entropy is a fundamental concept in Data Science because it shows up all over the place - from Decision Trees, to similarity ...
814,978 views · 4 years ago

Lex Clips
Claude Shannon at MIT: The best master's thesis in history | Neil Gershenfeld and Lex Fridman (7:39)
Lex Fridman Podcast full episode: https://www.youtube.com/watch?v=YDjOS0VHEr4. Please support this podcast by checking out ...
232,815 views · 2 years ago

Cracking The Nutshell
WII? (2a) Information Theory, Claude Shannon, Entropy, Redundancy, Data Compression & Bits (24:17)
What is Information? - Part 2a - Introduction to Information Theory: Script: ...
67,655 views · 12 years ago

Abhay Dang
Information Theory and Entropy - Intuitive introduction to these concepts (35:25)
With this video, I hope to give an easy introduction to the concepts of the information function and entropy. These concepts are often ...
8,092 views · 9 years ago

James V Stone
Information Theory Tutorial Part 1: What is Information? (7:19)
Part 2 can be viewed here: https://www.youtube.com/watch?v=7OQ7BFuINOU. The book chapter from which this example is taken ...
5,629 views · 2 years ago

Serrano.Academy
Shannon Entropy and Information Gain (21:16)
CORRECTION: at 13:41, the probability is 6.1e-5 and not 4.8e-4 (however, the entropy is 1.75, which is correct). Thank you ...
220,491 views · 8 years ago

JentGent
these compression algorithms could halve our image file sizes (but we don't use them) #SoMEpi (18:23)
An explanation of the source coding theorem, arithmetic coding, and asymmetric numeral systems. This was my entry into #SoMEpi.
345,332 views · 1 year ago

Technovedanta
From Information Theory to a Theory of Everything: The Ouroboros Code (57:45)
Philosophical treatise on how Digital Physics and Consciousness can be captured in one framework.
2,969 views · 7 years ago

Aurélien Géron
A Short Introduction to Entropy, Cross-Entropy and KL-Divergence (10:41)
Entropy, Cross-Entropy and KL-Divergence are often used in Machine Learning, in particular for training classifiers. In this short ...
382,191 views · 7 years ago

Institute for Quantum Computing
John Preskill - Introduction to Quantum Information (Part 1) - CSSQI 2012 (1:00:27)
John Preskill, Richard P. Feynman Professor of Theoretical Physics at the California Institute of Technology, gave a lecture about ...
83,868 views · 13 years ago

Reducible
Huffman Codes: An Information Theory Perspective (29:11)
Huffman Codes are one of the most important discoveries in the field of data compression. When you first see them, they almost ...
272,864 views · 4 years ago

Visual Electric
The Story of Information Theory: from Morse to Shannon to ENTROPY (41:15)
Course: https://www.udemy.com/course/introduction-to-power-system-analysis/?couponCode=KELVIN ✓ This is the story of how ...
390,929 views · 7 months ago

Jakob Foerster
Lecture 1: Introduction to Information Theory (1:01:51)
Lecture 1 of the Course on Information Theory, Pattern Recognition, and Neural Networks. Produced by: David MacKay ...
385,382 views · 11 years ago