Information theory comes into play wherever encoding and decoding are present: for example, in multimedia compression and cryptography.
In information theory we encounter terms like "entropy", "self-information", and "mutual information", and the entire subject is built on them. They sound purely abstract; frankly, they don't really make intuitive sense to me.
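To show what I mean, the definitions themselves are easy to compute; it's the intuition I'm after. This is just my own illustration (not taken from any particular book): the self-information of an outcome with probability p is -log2(p), and entropy is the average self-information over a source's symbols:

```python
import math
from collections import Counter

def entropy(text):
    """Shannon entropy of the symbol frequencies in `text`, in bits/symbol.

    H = -sum(p * log2(p)) over the empirical symbol probabilities p.
    """
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in Counter(text).values())

# A uniform 4-symbol source needs 2 bits/symbol:
print(entropy("abcd"))  # 2.0
# A constant source carries no information:
print(entropy("aaaa"))  # 0.0 (well, -0.0)
```

The numbers come out fine; what I'm missing is material that explains *why* these quantities are the right ones to care about.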
Is there a book, article, or explanation (your own, if you have one) that presents these concepts in a practical way?
EDIT:
An Introduction to Information Theory: Symbols, Signals & Noise by John Robinson Pierce is the book that explains it the way I want (practically). It's excellent. I have started reading it.
Shannon's original paper, "A Mathematical Theory of Communication", is a very important resource for studying this theory; nobody should miss it.
Reading it shows how Shannon arrived at the theory, which should clear up most of the doubts.
Also, studying how the Huffman compression algorithm works is very helpful.
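To see why Huffman coding makes entropy feel concrete, here is a minimal sketch (my own, using a heap of partial code tables rather than an explicit tree): repeatedly merge the two least-frequent subtrees, prefixing '0' to one side's codes and '1' to the other's, so frequent symbols end up with short codes.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Return a prefix-free {symbol: bitstring} code for the symbols in `text`."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate case: a single symbol still needs one bit
        return {sym: "0" for sym in freq}
    # Heap entries: (frequency, tiebreaker, {symbol: code-so-far})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, codes1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, codes2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        tiebreak += 1
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
    return heap[0][2]

codes = huffman_codes("abracadabra")
# 'a' occurs 5 times out of 11, so it gets the shortest code.
```

Comparing the average code length this produces with the source's entropy is, for me, the quickest way to see what entropy actually measures.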
EDIT:
An Introduction to Information Theory
John R. Pierce
seems good according to the Amazon reviews (I haven't tried it yet).
[found by Googling "information theory layman"]