Pattern Classification

Question 1
Marks : +2 | -2
Pass Ratio : 100%
When are two classes said to be inseparable?
there may exist straight lines that don't touch each other
there may exist straight lines that can touch each other
there is only one straight line that separates them
all of the mentioned
Explanation:
Linearly separable classes or functions are those that can be separated by a single straight line; two classes are inseparable when no such line exists, as with the XOR function.
Question 2
Marks : +2 | -2
Pass Ratio : 100%
In perceptron learning, what happens when an input vector is correctly classified?
small adjustments in the weights are made
large adjustments in the weights are made
no adjustment in the weights is made
weight adjustment doesn't depend on the classification of the input vector
Explanation:
No adjustment is made to the weights, since the input has already been correctly classified, which is the objective of the system.
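A minimal sketch of a single perceptron update step (plain Python; the variable names and the bipolar step function are illustrative assumptions, not part of the question) showing why a correctly classified input leaves the weights untouched: the correction term (b – s) is zero.

def step(x):
    return 1 if x >= 0 else -1          # bipolar threshold output

def update(w, a, b, n=0.1):
    # w: weights, a: input vector, b: desired output, n: learning rate
    s = step(sum(wi * ai for wi, ai in zip(w, a)))   # actual output s(m)
    return [wi + n * (b - s) * ai for wi, ai in zip(w, a)]

w = [0.5, -0.3]
print(update(w, [1.0, 1.0], 1))   # input already classified correctly -> [0.5, -0.3]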
Question 3
Marks : +2 | -2
Pass Ratio : 100%
The perceptron convergence theorem is applicable for what kind of data?
binary
bipolar
both binary and bipolar
none of the mentioned
Explanation:
The perceptron convergence theorem is applicable to both binary and bipolar input and output data.
Question 4
Marks : +2 | -2
Pass Ratio : 100%
Is it necessary to set the initial weights to zero in the perceptron convergence theorem?
yes
no
Explanation:
The initial setting of the weights does not affect the perceptron convergence theorem; convergence is guaranteed from any starting weights, provided the classes are linearly separable.
Question 5
Marks : +2 | -2
Pass Ratio : 100%
If e(m) denotes the error used to correct the weights in the perceptron learning rule w(m + 1) = w(m) + n(b(m) – s(m)) a(m), where b(m) is the desired output, s(m) is the actual output, a(m) is the input vector and w denotes the weight, what is the formula for e(m)?
e(m) = n(b(m) – s(m)) a(m)
e(m) = n(b(m) – s(m))
e(m) = (b(m) – s(m))
none of the mentioned
Explanation:
The error is the difference between the desired and the actual output, e(m) = b(m) – s(m); the learning rate n and the input a(m) only scale the correction.
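An illustrative computation with assumed bipolar values (the numbers are hypothetical, chosen only to show the term): with b(m) = 1 and s(m) = –1, the error is e(m) = 2, and the weight change n·e(m)·a(m) is scaled by the learning rate and the input.

b, s, n = 1, -1, 0.1              # desired output, actual output, learning rate (assumed values)
a = [1.0, -1.0]                   # input vector a(m)
e = b - s                         # e(m) = b(m) - s(m) = 2
delta_w = [n * e * ai for ai in a]
print(e, delta_w)                 # -> 2 [0.2, -0.2]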
Question 6
Marks : +2 | -2
Pass Ratio : 100%
If two classes are linearly inseparable, can the perceptron convergence theorem be applied?
yes
no
Explanation:
The perceptron convergence theorem can be applied if and only if the two classes are linearly separable.
Question 7
Marks : +2 | -2
Pass Ratio : 100%
What is the objective of perceptron learning?
class identification
weight adjustment
adjust weight along with class identification
none of the mentioned
Explanation:
The objective of perceptron learning is to adjust the weights along with identifying the classes.
Question 8
Marks : +2 | -2
Pass Ratio : 100%
When two classes can be separated by a straight line, what are they known as?
linearly separable
linearly inseparable classes
may be separable or inseparable, it depends on system
none of the mentioned
Explanation:
Linearly separable classes or functions are those that can be separated by a single straight line.
Question 9
Marks : +2 | -2
Pass Ratio : 100%
On what factor does the number of outputs depend?
distinct inputs
distinct classes
both on distinct classes & inputs
none of the mentioned
Explanation:
The number of outputs depends on the number of distinct classes.
Question 10
Marks : +2 | -2
Pass Ratio : 100%
Can the model w(m + 1) = w(m) + n(b(m) – s(m)) a(m), where b(m) is the desired output, s(m) is the actual output, a(m) is the input vector and w denotes the weight, be used for perceptron learning?
yes
no
Explanation:
Yes. This error-correction update is the standard perceptron learning rule and can be viewed as a gradient-descent-style weight adjustment.
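A minimal training-loop sketch of this rule (plain Python; the bipolar AND-gate data, the prepended bias input, the learning rate and the epoch count are all illustrative assumptions) showing the update w(m + 1) = w(m) + n(b(m) – s(m)) a(m) converging on a linearly separable problem:

# Bipolar AND-gate data; a bias input of 1 is prepended to each input vector.
data = [([1, -1, -1], -1), ([1, -1, 1], -1), ([1, 1, -1], -1), ([1, 1, 1], 1)]
w = [0.0, 0.0, 0.0]
n = 0.5

for epoch in range(10):
    for a, b in data:
        s = 1 if sum(wi * ai for wi, ai in zip(w, a)) >= 0 else -1   # actual output s(m)
        w = [wi + n * (b - s) * ai for wi, ai in zip(w, a)]          # perceptron update

print(w)   # weights separating the two classes, here [-1.0, 1.0, 1.0]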