Information Theory, Lecture 2: Basic Properties of Information - 3rd Year Student Lecture
Given the definition of entropy and Shannon information, how can we do algebra involving these quantities? In the second lecture from Sam Cohen's 3rd year course on Information Theory, we study how conditioning on other random variables affects entropy, and how we can use this to understand the relationships between random outcomes.
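As a small illustration of the kind of algebra the lecture covers, here is a sketch (not taken from the lecture itself) that numerically checks one key property of conditioning: H(X|Y) ≤ H(X), i.e. conditioning on another random variable cannot increase entropy. The joint distribution used is a made-up toy example.

```python
from math import log2

# Hypothetical toy joint distribution p(x, y) over X in {0,1}, Y in {0,1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y
pX, pY = {}, {}
for (x, y), p in joint.items():
    pX[x] = pX.get(x, 0.0) + p
    pY[y] = pY.get(y, 0.0) + p

# Conditional entropy via the chain rule: H(X|Y) = H(X,Y) - H(Y)
H_X_given_Y = H(joint) - H(pY)

print(f"H(X)   = {H(pX):.4f} bits")
print(f"H(X|Y) = {H_X_given_Y:.4f} bits")
assert H_X_given_Y <= H(pX) + 1e-12  # conditioning cannot increase entropy
```

Here X and Y are correlated (they agree with probability 0.8), so learning Y genuinely reduces our uncertainty about X and the inequality is strict.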
You can watch eight lectures from the course as they appear via the playlist: https://www.youtube.com/playlist?list=PL4d5ZtfQonW3iAhXvTYCnoGEeRhxhKHMc
You can also watch many other student lectures via our main Student Lectures playlist (also check out the playlists for specific courses): https://www.youtube.com/playlist?list=PL4d5ZtfQonW0A4VHeiY0gSkX1QEraaacE
All first and second year lectures are followed by tutorials, where students meet their tutor in pairs to go through the lecture and associated problem sheet and to talk and think more about the maths. Third and fourth year lectures are followed by classes.