Information and Coding

Question 1
Marks : +2 | -2
Pass Ratio : 100%
Entropy of a random variable is
0
1
Infinite
Cannot be determined
Explanation:
For a continuous-amplitude random variable, the entropy is infinite: the variable can take any of an uncountable set of values, so on average it carries an infinite amount of information.
Question 2
Marks : +2 | -2
Pass Ratio : 100%
Self information should be
Positive
Negative
Positive & Negative
None of the mentioned
Explanation:
Self-information I(x) = -log2 p(x) is always non-negative, since 0 < p(x) <= 1.
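As a small sketch in Python (the helper name `self_information` is my own), the formula above never produces a negative value, and rarer events carry more information:

```python
import math

def self_information(p):
    """Self-information I(x) = -log2 p(x), in bits.
    Since 0 < p <= 1, -log2 p >= 0, so it is never negative."""
    return -math.log2(p)

print(self_information(0.5))  # 1.0 bit
print(self_information(1.0))  # 0.0 bits: a certain event carries no information
```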
Question 3
Marks : +2 | -2
Pass Ratio : 100%
Binary Huffman coding is a
Prefix condition code
Suffix condition code
Prefix & Suffix condition code
None of the mentioned
Explanation:
Binary Huffman coding is a prefix condition code: no codeword is a prefix of any other codeword, so the code can be decoded instantaneously.
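A minimal Huffman construction in Python (the function `huffman_code` and the example probabilities are my own illustration) lets us check the prefix condition directly:

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code for {symbol: probability}.
    Returns {symbol: codeword string of '0'/'1'}."""
    # Each heap entry: (probability, tiebreaker, partial codebook).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        counter += 1
        heapq.heappush(heap, (p1 + p2, counter, merged))
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
# Prefix condition: no codeword is a prefix of another.
words = list(code.values())
assert all(not w2.startswith(w1) for w1 in words for w2 in words if w1 != w2)
print(code)
```

Note that the most probable symbol ends up with the shortest codeword.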
Question 4
Marks : +2 | -2
Pass Ratio : 100%
The self information of a random variable is
0
1
Infinite
Cannot be determined
Explanation:
For a continuous-amplitude random variable, the self-information is infinite: specifying its exact value would require an infinite number of bits.
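A loose way to see this in Python (the helper `entropy` is my own): a uniform variable over n outcomes has entropy log2 n, which grows without bound as the number of distinguishable levels increases:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p log2 p, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Finer and finer quantization of a continuous amplitude -> unbounded entropy.
for n in (2, 1024, 2**20):
    print(n, entropy([1.0 / n] * n))
```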
Question 5
Marks : +2 | -2
Pass Ratio : 100%
The unit of average mutual information is
Bits
Bytes
Bits per symbol
Bytes per symbol
Explanation:
The unit of average mutual information is bits.
Question 6
Marks : +2 | -2
Pass Ratio : 100%
When the base of the logarithm is 2, then the unit of measure of information is
Bits
Bytes
Nats
None of the mentioned
Explanation:
When the base of the logarithm is 2, the unit of measure of information is bits; with the natural logarithm it is nats.
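A quick Python check of the base dependence (the variable names are my own); the same probability gives bits under log base 2 and nats under the natural log, related by log2(e):

```python
import math

p = 0.25
bits = -math.log2(p)  # base-2 logarithm -> bits
nats = -math.log(p)   # natural logarithm -> nats

# 1 nat = log2(e) ~ 1.4427 bits
print(bits, nats)
assert abs(bits - nats * math.log2(math.e)) < 1e-12
```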
Question 7
Marks : +2 | -2
Pass Ratio : 100%
When the probability of error during transmission is 0.5, it indicates that
Channel is very noisy
No information is received
Channel is very noisy & No information is received
None of the mentioned
Explanation:
When the probability of error is 0.5, the output is statistically independent of the input: the channel is very noisy and no information is received.
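For a binary symmetric channel this can be checked numerically; the capacity is C = 1 - H(p), which drops to zero at p = 0.5 (a sketch in Python, helper names my own):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - h2(p)

print(bsc_capacity(0.0))  # 1.0 -> noiseless channel
print(bsc_capacity(0.5))  # 0.0 -> no information gets through
```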
Question 8
Marks : +2 | -2
Pass Ratio : 100%
The method of converting a word into a stream of bits is called
Binary coding
Source coding
Bit coding
Cipher coding
Explanation:
Source coding is the method of converting a word into a stream of bits, i.e., 0s and 1s.
Question 9
Marks : +2 | -2
Pass Ratio : 100%
The event with minimum probability has the least number of bits.
True
False
Explanation:
The statement is false: in binary Huffman coding, the event with maximum probability is assigned the least number of bits, and the least probable event gets the most.
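The trend can be illustrated via the ideal codeword length, about -log2 p bits for an event of probability p (a small sketch, the loop values are my own illustration):

```python
import math

# The MOST probable event needs the fewest bits; rare events need more.
for p in (0.5, 0.25, 0.05):
    print(f"p = {p}: ~{-math.log2(p):.2f} bits")
```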
Question 10
Marks : +2 | -2
Pass Ratio : 100%
When X and Y are statistically independent, then I(x,y) is
1
0
Ln 2
Cannot be determined
Explanation:
When X and Y are statistically independent, the mutual information I(x,y) is 0, since p(x,y) = p(x)p(y) makes every log term vanish.
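This can be verified directly from the definition I(X;Y) = sum p(x,y) log2(p(x,y) / (p(x)p(y))) (a sketch in Python; the function name and example distributions are my own):

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) ), in bits."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent X and Y: p(x,y) = p(x)p(y) -> every log term is 0.
indep = {(x, y): 0.5 * 0.5 for x in (0, 1) for y in (0, 1)}
print(mutual_information(indep))  # 0.0
```

By contrast, a fully dependent pair (X always equal to Y, each value with probability 0.5) gives I(X;Y) = 1 bit.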