ARTIFICIAL NEURAL NETWORKS

PART–A(14 Marks)

  1. a) Write some of the applications of artificial neural networks. [2]
     b) With an example, write about systems of linear equations and substitution. [2]
  2. Define perceptron and its structure. [2]
  3. Write about various notations used in back propagation algorithm derivation. [3]
  4. Compare multilayer perceptron and Radial Basis Function networks. [3]
  5. Write the Lagrange multiplier function and two conditions of optimality. [2]                                
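For quick reference against the Lagrange-multiplier question above, a minimal sketch of the Lagrangian for minimizing f(w) subject to an equality constraint g(w) = 0, with its two conditions of optimality (generic notation, not taken from the paper):

```latex
% Lagrangian for minimizing f(w) subject to the equality constraint g(w) = 0
\[
  L(\mathbf{w}, \lambda) = f(\mathbf{w}) + \lambda\, g(\mathbf{w})
\]
% Two conditions of optimality: stationarity with respect to w and to lambda
\[
  \frac{\partial L(\mathbf{w}, \lambda)}{\partial \mathbf{w}} = \mathbf{0},
  \qquad
  \frac{\partial L(\mathbf{w}, \lambda)}{\partial \lambda} = 0
\]
```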

                                                        PART–B(4x14 = 56 Marks)              

  1. a) "Neuron inhibition depends on the activation function." Justify this statement with different types of activation functions. [7]
     b) Explain the taxonomy of artificial neural network architectures. [7]
  2. a) What is the state-space model of artificial neural networks? How can it be used for the optimization of various applications? [7]
     b) Discuss the role of the mean square error in the delta learning rule. Explain the impact of a continuous activation function on it. [7]
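As a pointer for the delta-rule question above, a minimal sketch in the usual notation (desired response d, actual output y, continuous activation function φ, learning rate η; the symbols are standard, not quoted from the paper):

```latex
% Instantaneous error energy for one training example
\[
  E(n) = \tfrac{1}{2}\, e^{2}(n), \qquad
  e(n) = d(n) - y(n), \qquad
  y(n) = \varphi\big(\mathbf{w}^{T}(n)\,\mathbf{x}(n)\big)
\]
% Delta-rule weight update from gradient descent on E(n); the derivative
% \varphi' exists because the activation function is continuous (differentiable)
\[
  \Delta \mathbf{w}(n)
    = -\eta\, \frac{\partial E(n)}{\partial \mathbf{w}(n)}
    = \eta\, e(n)\, \varphi'\big(\mathbf{w}^{T}(n)\,\mathbf{x}(n)\big)\, \mathbf{x}(n)
\]
```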

                                     

  1. a) Write and explain the initialization, activation, computation of actual response, adaptation of the weight vector, and continuation steps of the perceptron convergence algorithm. [7]
     b) What kind of operations can be implemented with a perceptron? Show that it cannot implement the Exclusive-OR function. [7]
  2. a) How can the performance of the back propagation learning algorithm be improved through its free parameters? Write about its convergence. [7]
     b) List and explain various practical and design issues of back propagation learning. [7]
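To accompany the perceptron questions above, a minimal sketch of the perceptron learning rule that also shows empirically why Exclusive-OR cannot be implemented (plain NumPy; the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def train_perceptron(X, d, eta=0.1, epochs=100):
    """Rosenblatt perceptron learning rule with a bias term.
    Returns the weight vector (bias first) and the final error count."""
    Xb = np.hstack([np.ones((len(X), 1)), X])      # prepend bias input x0 = 1
    w = np.zeros(Xb.shape[1])
    errors = 0
    for _ in range(epochs):
        errors = 0
        for x, target in zip(Xb, d):
            y = 1 if np.dot(w, x) >= 0 else 0      # hard-limit activation
            w += eta * (target - y) * x            # error-correction update
            errors += int(target != y)
        if errors == 0:                            # converged: all patterns correct
            break
    return w, errors

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Linearly separable: AND converges to zero errors.
print(train_perceptron(X, np.array([0, 0, 0, 1])))

# Not linearly separable: XOR never reaches zero errors, because no single
# hyperplane separates {(0,1), (1,0)} from {(0,0), (1,1)}.
print(train_perceptron(X, np.array([0, 1, 1, 0])))
```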

                                     

  1. Write about the following with respect to Radial Basis Function (RBF) networks: [14]
     1. RBF network design
     2. RBF network training
     3. RBF networks with regularization theory
  2. a) What is a Support Vector Machine? Explain how it separates non-separable patterns. [7]
     b) How is a Support Vector Machine built for a pattern recognition problem? Explain in detail. [7]
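As a companion to the RBF questions above, a minimal sketch of strict RBF interpolation with Gaussian kernels, i.e. solving the linear system formed by the interpolation matrix for the output weights (plain NumPy; the toy data and the width parameter are made up for illustration):

```python
import numpy as np

def gaussian_design_matrix(X, centers, sigma):
    """Interpolation matrix Phi with Phi[i, j] = exp(-||x_i - c_j||^2 / (2*sigma^2))."""
    d2 = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Strict interpolation: one Gaussian centred on every training point,
# so Phi is square and the weights solve Phi @ w = d exactly.
X = np.array([[0.0], [0.25], [0.5], [0.75]])     # training inputs (toy data)
d = np.sin(2 * np.pi * X[:, 0])                  # desired responses
sigma = 0.5                                      # assumed kernel width

Phi = gaussian_design_matrix(X, X, sigma)
w = np.linalg.solve(Phi, d)                      # output-layer weights

def rbf_predict(X_new):
    return gaussian_design_matrix(X_new, X, sigma) @ w

print(np.allclose(rbf_predict(X), d))            # True: fits the training points exactly
```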

 

PART–A(14 Marks)                                             

  1. a) What is the role of the synapse in a biological neuron? Discuss. [2]
     b) Differentiate neural networks from state-space neural networks. [3]
  2. Write about linear adaptive filtering. [2]
  3. What is backward propagation of error signals? [2]
  4. What is interpolation? [2]
  5. Give the architecture of Support Vector Machine. [3]
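For the question on the Support Vector Machine architecture above, a sketch of the decision surface computed by an SVM's output layer (standard kernel-machine notation, illustrative only):

```latex
% Decision surface of an SVM: a kernel layer over the N_s support vectors
% feeding one linear output neuron with bias b and a sign non-linearity
\[
  f(\mathbf{x}) = \operatorname{sgn}\!\left(
      \sum_{i=1}^{N_s} \alpha_i\, d_i\, K(\mathbf{x}, \mathbf{x}_i) + b
  \right)
\]
% x_i : support vectors;  d_i \in \{-1, +1\} : their class labels;
% \alpha_i > 0 : Lagrange multipliers from the dual optimization;
% K(.,.) : inner-product kernel, e.g. K(x, x_i) = x^T x_i in the linear case.
```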

                                     

PART–B(4x14 = 56 Marks)

  2. a) Explain the working principles of a single-input neuron, a multiple-input neuron, and a neuron with R inputs. [7]
     b) Why is an activation function used in an artificial neuron? Explain different activation functions. [7]
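To go with the activation-function question above, a minimal sketch of a generic artificial neuron with a few common activation functions (plain NumPy; the names and toy values are my own, not from the paper):

```python
import numpy as np

# A few common activation functions phi(v) applied to the induced local field v.
def hard_limit(v):            # threshold / Heaviside
    return np.where(v >= 0, 1.0, 0.0)

def sigmoid(v):               # logistic, output in (0, 1)
    return 1.0 / (1.0 + np.exp(-v))

def tanh_act(v):              # hyperbolic tangent, output in (-1, 1)
    return np.tanh(v)

def relu(v):                  # rectified linear unit
    return np.maximum(0.0, v)

def neuron(x, w, b, phi):
    """Single neuron: weighted sum of the R inputs plus bias, passed through phi."""
    v = np.dot(w, x) + b      # induced local field
    return phi(v)

x = np.array([0.5, -1.0, 2.0])        # R = 3 inputs (toy values)
w = np.array([0.4, 0.3, -0.6])        # synaptic weights
b = 0.1                               # bias

for phi in (hard_limit, sigmoid, tanh_act, relu):
    print(phi.__name__, neuron(x, w, b, phi))
```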

                                     

  1. Justify the statement "An artificial neuron can learn its environment" through different learning strategies. [14]
  2. a) Illustrate the working principle of the perceptron with a pair of linearly separable patterns and a pair of non-linearly separable patterns. [7]
     b) Explain the relation between the perceptron and the classical Bayes classifier for a Gaussian environment. [7]
  3. a) How can multilayer feed forward networks be used to solve linearly inseparable functions? Explain. [7]
     b) Discuss the training algorithm and its derivation for the weight updates in back propagation networks. [7]
  4. a) What is a radial basis function network (RBFN)? Explain the training algorithm used for an RBFN with fixed centers. [7]
     b) How does regularization theory help in solving ill-posed problems? Explain in detail. [7]
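For the back propagation derivation referenced above, a compact sketch of the weight-update equations in the usual notation (local gradient δ, learning rate η; symbols are the standard textbook ones, not copied from the paper):

```latex
% Weight correction for the synapse from neuron i to neuron j at iteration n
\[
  \Delta w_{ji}(n) = \eta\, \delta_j(n)\, y_i(n)
\]
% Local gradient of an output neuron j, with error signal e_j = d_j - y_j
\[
  \delta_j(n) = e_j(n)\, \varphi_j'\big(v_j(n)\big)
\]
% Local gradient of a hidden neuron j, back-propagated from the layer above
\[
  \delta_j(n) = \varphi_j'\big(v_j(n)\big) \sum_{k} \delta_k(n)\, w_{kj}(n)
\]
```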

                                     

  1. Explain how to find maximal-margin hyperplanes to solve a two-class classification problem with a Support Vector Machine when the data is: [14]
     1. linearly separable
     2. linearly inseparable
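A brief sketch of the optimal-hyperplane conditions behind the question above (standard SVM notation; the slack variables appear only in the linearly inseparable case):

```latex
% Separating hyperplane and margin constraints for class labels d_i \in \{-1, +1\}
\[
  \mathbf{w}^{T}\mathbf{x} + b = 0, \qquad
  d_i\big(\mathbf{w}^{T}\mathbf{x}_i + b\big) \ge 1 - \xi_i, \qquad \xi_i \ge 0
\]
% Linearly separable case: all \xi_i = 0 and the margin 2/\lVert\mathbf{w}\rVert is
% maximized by minimizing \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2}.
% Linearly inseparable (soft-margin) case: minimize
\[
  \Phi(\mathbf{w}, \boldsymbol{\xi})
    = \tfrac{1}{2}\,\mathbf{w}^{T}\mathbf{w} + C \sum_{i=1}^{N} \xi_i
\]
```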

 

PART–A(14 Marks)

  1. a) Discuss neuron cell inhibition. [2]
     b) Write a short note on invertible and singular matrices in matrix algebra. [2]
  2. What is a Jacobian matrix? Give its applications in the single-layer perceptron. [2]
  3. Write a short note on the learning rate parameter and the local gradient in back propagation. [3]
  4. Differentiate regularization networks and Radial Basis Function networks. [3]
  5. What is a support vector? Give an example. [2]

                                     

PART–B(4x14 = 56 Marks)

  2. a) Explain various functional aspects of the artificial neuron model with respect to the bias, weighted inputs, and activation functions. [7]
     b) With a neat sketch, differentiate multilayer feed forward networks from recurrent neural networks. [7]
  3. a) What is the role of vector algebra in multivariate analysis? Explain various operations that can be performed in vector algebra. [7]
     b) Differentiate the working principles of supervised and unsupervised learning with an example learning algorithm for each type of learning. [7]
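As a pointer for the supervised/unsupervised question above: an example supervised rule (the delta rule) is sketched earlier in this set; two textbook unsupervised update rules are sketched below (standard notation, not taken from the paper):

```latex
% Hebbian learning: strengthen weight w_i when presynaptic activity x_i and
% postsynaptic activity y occur together
\[
  \Delta w_i(n) = \eta\, y(n)\, x_i(n)
\]
% Competitive learning: only the winning neuron k moves its weight vector
% toward the current input pattern
\[
  \Delta \mathbf{w}_k(n) =
  \begin{cases}
    \eta\,\big(\mathbf{x}(n) - \mathbf{w}_k(n)\big) & \text{if neuron $k$ wins the competition} \\
    \mathbf{0} & \text{otherwise}
  \end{cases}
\]
```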

                                     

  1. Write the following with respect to the Perceptron algorithm: [14]
     1. Training sample with input signal vector x(n) and desired response d(n)
     2. Signal-flow graph representation
     3. Convergence considerations
     4. Virtues and limitations
  2. a) What is the use of Back Propagation networks? Explain the training steps for back propagation networks. [7]
     b) Discuss various steps involved in solving function approximation with back propagation networks. [7]
  3. a) Write about the usage of Radial Basis Function networks to perform complex pattern classification tasks. [7]
     b) What is the universal approximation theorem? Explain the approximation properties of Radial Basis Function networks. [7]
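Relating to the universal-approximation question above, the family of RBF network mappings usually quoted in that context, in sketch form (Gaussian basis with m1 centers; standard notation):

```latex
% Input-output mapping realized by a Gaussian RBF network with m_1 centers t_i
\[
  F(\mathbf{x}) = \sum_{i=1}^{m_1} w_i
    \exp\!\left( -\,\frac{\lVert \mathbf{x} - \mathbf{t}_i \rVert^{2}}{2\sigma^{2}} \right)
\]
% Universal approximation property: for any continuous function f on a compact
% set and any \varepsilon > 0, some choice of m_1, \{w_i\}, \{\mathbf{t}_i\}, \sigma
% gives \lvert F(\mathbf{x}) - f(\mathbf{x}) \rvert < \varepsilon on that set.
```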

                                     

  1. a) Illustrate the idea of an optimal hyperplane for linearly separable patterns. [7]
     b) What are inner-product kernels? Explain the inner-product kernels for various types of Support Vector Machines. [7]
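For the kernel question above, the three inner-product kernels usually listed for the common SVM types, in sketch form (p, σ, β0, β1 are user-chosen parameters):

```latex
% Polynomial learning machine of degree p
\[
  K(\mathbf{x}, \mathbf{x}_i) = \big(\mathbf{x}^{T}\mathbf{x}_i + 1\big)^{p}
\]
% Radial-basis-function network with Gaussian kernel of width \sigma
\[
  K(\mathbf{x}, \mathbf{x}_i)
    = \exp\!\left( -\,\frac{\lVert \mathbf{x} - \mathbf{x}_i \rVert^{2}}{2\sigma^{2}} \right)
\]
% Two-layer perceptron with sigmoid kernel (a valid Mercer kernel only for
% some values of \beta_0 and \beta_1)
\[
  K(\mathbf{x}, \mathbf{x}_i) = \tanh\!\big(\beta_0\,\mathbf{x}^{T}\mathbf{x}_i + \beta_1\big)
\]
```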
PART–A(14 Marks)

  1. a) Discuss the role of the activation function in an artificial neuron. [2]
     b) How is multiplication by an inverse performed in vector algebra? Give an example. [3]
  2. What is learning rate annealing in perceptron? [2]
  3. Explain forward propagation of function signals. [2]
  4. Write the role of three layers involved in Radial Basis Function networks. [3]
  5. What is dual problem? [2]

                                     

                                                         PART–B(4x14 = 56 Marks)             

  1. a) "An artificial neuron resembles the functionality of a biological neuron." Justify this statement in all functional aspects. [7]
     b) Explain the concept of a single layer of S neurons and the multi-layer neuron model. [7]
  2. a) Discuss the concept of optimization with a suitable example related to artificial neural networks. [7]
     b) What is unsupervised learning? Explain the competitive and Hebbian learning algorithms. [7]
  3. a) Write about the two-class pattern classification problem. How can it be solved by a perceptron? Explain. [7]
     b) Explain how the synaptic weights are adapted iteration by iteration using the error-correction rule in the perceptron convergence algorithm. [7]
  4. a) What are multi-layer feed forward networks? What is the importance of the hidden and output layers in them? [7]
     b) Write and explain the derivation of the back propagation training algorithm. Explain the role of the learning rate coefficient in its convergence. [7]
  5. a) What is the interpolation problem? Explain how it is solved with Radial Basis Function networks. [7]
     b) Explain the weighted norm and receptive fields of generalized radial basis function networks. [7]
  6. a) Derive and explain the various constraints involved in the quadratic optimization for finding the optimal hyperplane. [7]
     b) Design a Support Vector Machine for a classification problem. Explain the various mathematical functions used behind it. [7]
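To close, a sketch of the quadratic optimization behind the SVM questions above, in its usual dual form (labels d_i in {-1, +1}, regularization constant C; standard notation, not quoted from any specific paper):

```latex
% Dual problem: maximize Q over the Lagrange multipliers \alpha_i
\[
  Q(\boldsymbol{\alpha}) = \sum_{i=1}^{N} \alpha_i
    - \tfrac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N}
        \alpha_i\, \alpha_j\, d_i\, d_j\, K(\mathbf{x}_i, \mathbf{x}_j)
\]
% subject to the constraints
\[
  \sum_{i=1}^{N} \alpha_i\, d_i = 0, \qquad 0 \le \alpha_i \le C, \quad i = 1, \dots, N
\]
% Resulting decision rule (only the support vectors have \alpha_i > 0)
\[
  f(\mathbf{x}) = \operatorname{sgn}\!\left(
      \sum_{i=1}^{N} \alpha_i\, d_i\, K(\mathbf{x}, \mathbf{x}_i) + b \right)
\]
```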