1. Heteroassociative memory can be an example of which type of network?
a) group of instars
b) group of outstars
c) either group of instars or outstars
d) both group of instars and outstars
Explanation: Depending upon the direction of information flow, the memory can be of either type.
2. What is STM in neural networks?
a) short topology memory
b) stimulated topology memory
c) short term memory
d) none of the mentioned
Explanation: STM stands for short term memory.
3. What does STM correspond to?
a) activation state of network
b) encoded pattern information in synaptic weights
c) either way
d) both way
Explanation: Short-term memory (STM) refers to the capacity-limited retention of information over a brief period of time; in a neural network this corresponds to the activation state of the network.
4. What does LTM correspond to?
a) activation state of network
b) encoded pattern information in synaptic weights
c) either way
d) both way
Explanation: Long-term memory (LTM) is the encoding and retention of an effectively unlimited amount of information over a much longer period of time; in a neural network this corresponds to the pattern information encoded in the synaptic weights.
5. On what parameters can the change in the weight vector depend?
a) learning parameters
b) input vector
c) learning signal
d) all of the mentioned
Explanation: The change in the weight vector corresponding to the jth input at time (t+1) depends on all of these parameters.
6. If the change in the weight vector is represented by ∆wij, what does it mean?
a) describes the change in the weight for the ith processing unit, taking the jth input into account
b) describes the change in the weight for the jth processing unit, taking the ith input into account
c) describes the change in the weight for both the ith & jth processing units
d) none of the mentioned
Explanation: ∆wij = µ f(wi a) aj, where a is the input vector, i indexes the processing unit & j the component of the input.
7. What is the learning signal in the equation ∆wij = µ f(wi a) aj?
a) µ
b) wi a
c) aj
d) f(wi a)
Explanation: f(wi a) is the learning signal: the nonlinear representation of the output of the ith unit (see the sketch below).
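As a companion to questions 5–7, here is a minimal Python sketch of the update rule ∆wij = µ f(wi a) aj applied to all processing units at once. The learning rate, the choice of tanh for f, and the array sizes are illustrative assumptions, not part of the questions.

```python
import numpy as np

# A minimal sketch of the general update rule ∆wij = µ f(wi a) aj.
# The learning rate µ, tanh as f, and the array sizes are illustrative
# assumptions, not taken from the questions themselves.

def weight_update(W, a, mu=0.1, f=np.tanh):
    """Return ∆W with ∆w[i, j] = mu * f(w_i · a) * a[j]."""
    s = f(W @ a)                # learning signal f(wi a) for each unit i
    return mu * np.outer(s, a)  # outer product gives ∆w[i, j] = mu * s[i] * a[j]

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))     # 3 processing units, 4-component input (assumed)
a = np.array([1.0, 0.5, -0.5, 0.2])
W += weight_update(W, a)        # one update step, taking W from time t to t+1
```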
8. State whether Hebb’s law is of supervised or unsupervised learning type.
a) supervised
b) unsupervised
c) either supervised or unsupervised
d) can be both supervised & unsupervised
Explanation: No desired output is required for its implementation, so it is unsupervised.
9. Hebb’s law can be represented by which equation?
a) ∆wij = µ f(wi a) aj
b) ∆wij = µ si aj, where si is the output signal of the ith unit
c) both way
d) none of the mentioned
Explanation: si = f(wi a) in Hebb’s law, so both equations describe the same update (see the check below).
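As a quick check of the explanation above, this small Python snippet (under the same illustrative assumptions as the earlier sketch) verifies numerically that the two forms of Hebb’s law coincide when si = f(wi a).

```python
import numpy as np

# Numerical check, under the same illustrative assumptions as the sketch
# above, that the two candidate forms of Hebb's law coincide when
# si = f(wi a): µ f(wi a) aj and µ si aj give identical updates.
mu, f = 0.1, np.tanh
W = np.random.default_rng(0).normal(size=(3, 4))
a = np.array([1.0, 0.5, -0.5, 0.2])

s = f(W @ a)                            # output signal si = f(wi a)
dW_form_a = mu * np.outer(f(W @ a), a)  # ∆wij = µ f(wi a) aj
dW_form_b = mu * np.outer(s, a)         # ∆wij = µ si aj
assert np.allclose(dW_form_a, dW_form_b)
```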
10. Which of the following statements hold for the perceptron learning law?
a) it is supervised type of learning law
b) it requires desired output for each input
c) ∆wij = µ (bi – si) aj
d) all of the mentioned
Explanation: All statements follow from ∆wij = µ (bi – si) aj, where bi is the target output; since a desired output is required for each input, the law is supervised (see the sketch below).
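To make the supervised nature of the rule concrete, here is a minimal Python sketch of the perceptron learning law ∆wij = µ (bi – si) aj on a toy problem. The step activation, learning rate, and AND-gate data are illustrative assumptions; the update rule itself is the one quoted in the question.

```python
import numpy as np

# A minimal sketch of the perceptron learning law ∆wij = µ (bi - si) aj.
# The step activation, learning rate, and toy AND-gate data are illustrative
# assumptions; the update rule itself is the one quoted in the question.

def perceptron_step(w, a, b, mu=0.1):
    """One supervised update: requires the desired output b for input a."""
    s = 1.0 if w @ a > 0 else 0.0   # actual output si from a step activation
    return w + mu * (b - s) * a     # ∆wj = µ (b - s) aj

# Toy AND gate; the last input component is a bias fixed at 1.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
B = np.array([0.0, 0.0, 0.0, 1.0])  # desired outputs bi for each input
w = np.zeros(3)
for _ in range(20):                 # a few passes suffice on this toy problem
    for a, b in zip(X, B):
        w = perceptron_step(w, a, b)
assert all((1.0 if w @ a > 0 else 0.0) == b for a, b in zip(X, B))
```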