ARTIFICIAL NEURAL NETWORKS
PART–A(14 Marks)
- a) Write some applications of artificial neural networks. [2]
- b) With an example, write about systems of linear equations and substitution. [2]
- Define perceptron and its structure. [2]
- Write about various notations used in back propagation algorithm derivation. [3]
- Compare multilayer perceptron and Radial Basis Function networks. [3]
- Write the Lagrange multiplier function and two conditions of optimality. [2]
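For quick reference, a minimal sketch of the Lagrangian used in SVM-style constrained optimization, written in the standard w, b, α notation (the notation itself is an assumption, not fixed by the paper):

```latex
% Primal Lagrangian for the constraints d_i (w^T x_i + b) >= 1
J(\mathbf{w}, b, \boldsymbol{\alpha}) = \tfrac{1}{2}\,\mathbf{w}^{\mathsf T}\mathbf{w}
  - \sum_{i=1}^{N} \alpha_i \bigl[\, d_i(\mathbf{w}^{\mathsf T}\mathbf{x}_i + b) - 1 \,\bigr]

% Two conditions of optimality (stationarity with respect to w and b):
\frac{\partial J}{\partial \mathbf{w}} = 0 \;\Rightarrow\; \mathbf{w} = \sum_{i=1}^{N} \alpha_i d_i \mathbf{x}_i,
\qquad
\frac{\partial J}{\partial b} = 0 \;\Rightarrow\; \sum_{i=1}^{N} \alpha_i d_i = 0
```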
PART–B(4x14 = 56 Marks)
- a) "Neuron inhibition depends on the activation function." Justify this statement with different types of activation functions. [7]
- b) Explain the taxonomy of artificial neural network architectures. [7]
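For revision, a minimal numpy sketch (not part of the original paper; the sample inputs are assumed) of the common activation functions referred to above:

```python
import numpy as np

def step(v):        # threshold (McCulloch-Pitts) activation
    return np.where(v >= 0.0, 1.0, 0.0)

def sigmoid(v):     # logistic activation, output in (0, 1)
    return 1.0 / (1.0 + np.exp(-v))

def tanh(v):        # hyperbolic tangent, output in (-1, 1)
    return np.tanh(v)

def relu(v):        # rectified linear unit
    return np.maximum(0.0, v)

v = np.linspace(-3, 3, 7)
for f in (step, sigmoid, tanh, relu):
    print(f.__name__, np.round(f(v), 3))
```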
- a) What is the state space model of artificial neural networks? How can it be used for the optimization of various applications? [7]
- b) Discuss the role of mean square error in the delta learning rule. Explain the impact of a continuous activation function on it. [7]
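An illustrative sketch of the delta rule with a continuous sigmoid activation, minimising mean square error; the toy data, learning rate, and epoch count are assumptions for illustration only:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                  # assumed toy inputs
d = sigmoid(X @ np.array([1.5, -2.0]) + 0.3)  # assumed toy targets

w, b, eta = np.zeros(2), 0.0, 0.5
for epoch in range(200):
    y = sigmoid(X @ w + b)
    err = d - y                               # error signal e(n) = d(n) - y(n)
    grad = err * y * (1.0 - y)                # delta rule needs f'(net) of the
                                              # continuous activation function
    w += eta * X.T @ grad / len(X)
    b += eta * grad.mean()

print("final MSE:", np.mean((d - sigmoid(X @ w + b)) ** 2))
```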
- a) Write and explain the initialization, activation, computation of actual response, adaptation of weight vector, and continuation operations of the perceptron convergence theorem. [7]
- b) What kinds of operations can be implemented with a perceptron? Show that it cannot implement the Exclusive OR function. [7]
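A small sketch (toy truth tables, assumed learning rate) showing the perceptron learning rule converging on AND but never converging on the linearly inseparable XOR:

```python
import numpy as np

def train_perceptron(X, d, eta=0.1, epochs=100):
    w = np.zeros(X.shape[1] + 1)                 # weights plus bias w[0]
    for _ in range(epochs):
        errors = 0
        for x, t in zip(X, d):
            y = 1 if w[0] + w[1:] @ x >= 0 else 0
            w[0]  += eta * (t - y)               # error-correction update
            w[1:] += eta * (t - y) * x
            errors += int(t != y)
        if errors == 0:
            return w, True                       # converged: data separated
    return w, False                              # never separated the data

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print("AND converges:", train_perceptron(X, np.array([0, 0, 0, 1]))[1])  # expected True
print("XOR converges:", train_perceptron(X, np.array([0, 1, 1, 0]))[1])  # expected False
```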
- a) How can the performance of the back propagation learning algorithm be improved through its free parameters? Write about its convergence. [7]
- b) List and explain various practical and design issues of back propagation learning. [7]
- Write about the following with respect to Radial Basis Function (RBF) networks: [14]
- RBF networks design
- RBF networks training
- RBF networks with regularization theory
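A compact sketch touching all three points above: design (one Gaussian centre per training point), training (solving the linear system for the output weights), and regularization (the λI term); the 1-D toy data and spread parameter are assumptions:

```python
import numpy as np

def gaussian(r, sigma=0.2):
    return np.exp(-(r ** 2) / (2 * sigma ** 2))

# assumed toy training set: N samples of a 1-D function
x = np.linspace(0, 1, 8)
d = np.sin(2 * np.pi * x)

# interpolation matrix Phi_ij = phi(||x_i - x_j||), one centre per sample
Phi = gaussian(np.abs(x[:, None] - x[None, :]))
lam = 1e-3                                    # regularization: solve (Phi + lam*I) w = d
w = np.linalg.solve(Phi + lam * np.eye(len(x)), d)

def rbf_net(xq):
    """Network output: weighted sum of Gaussian basis functions."""
    return gaussian(np.abs(xq[:, None] - x[None, :])) @ w

xq = np.linspace(0, 1, 5)
print(np.round(rbf_net(xq), 3))               # predictions should roughly track
print(np.round(np.sin(2 * np.pi * xq), 3))    # the true target values
```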
- a) What is a Support Vector Machine? Explain how it separates non-separable patterns. [7]
- b) How to build a Support Vector Machine for a pattern recognition problem? Explain in detail. [7]
PART–A(14 Marks)
- a) What is the role of a synapse in a biological neuron? Discuss. [2]
- b) Differentiate neural networks from state space neural networks. [3]
- Write about linear adaptive filtering. [2]
- What is backward propagation of error signals? [2]
- What is interpolation? [2]
- Give the architecture of Support Vector Machine. [3]
PART–B(4x14 = 56 Marks)
- a) Explain the working principles of a single-input neuron, a multiple-input neuron, and neurons with 'R' number of inputs. [7]
- b) Why is an activation function used in an artificial neuron? Explain different activation functions. [7]
- Justify the statement "An artificial neuron can learn the environment" through different learning strategies. [14]
- a) Illustrate the working principle of perceptron with a pair of linearly separable and a pair of non-linearly separable patterns. [7]
- b) Explain the relation between the perceptron and the classical Bayes classifier for pattern classification in a Gaussian environment. [7]
- a) How can multilayer feed forward networks be used to solve linearly inseparable functions? Explain. [7]
- b) Discuss the training algorithm and its derivation for weight updates in back propagation networks. [7]
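An illustrative back propagation sketch for a single hidden layer network trained on XOR; the architecture size, learning rate, and epoch count are assumptions, not taken from the paper:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(scale=1.0, size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(scale=1.0, size=(4, 1)), np.zeros(1)   # hidden -> output
eta = 0.5

for epoch in range(10000):
    h = sigmoid(X @ W1 + b1)                  # forward pass of function signals
    y = sigmoid(h @ W2 + b2)
    delta_out = (y - d) * y * (1 - y)         # local gradient at the output layer
    delta_hid = (delta_out @ W2.T) * h * (1 - h)   # back-propagated error signal
    W2 -= eta * h.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid
    b1 -= eta * delta_hid.sum(axis=0)

print(np.round(y.ravel(), 2))                 # typically approaches [0, 1, 1, 0]
```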
- a) What is a radial basis function network (RBFN)? Explain the training algorithm used for an RBFN with fixed centers. [7]
- b) How does regularization theory help in solving ill-posed problems? Explain in detail. [7]
- Explain how to find maximal hyperplanes to solve a two-class classification problem with a Support Vector Machine when the data is: [14]
- Linearly separable
- Linearly inseparable
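A rough sketch of a linear maximum-margin classifier trained by sub-gradient descent on the regularized hinge loss; the soft-margin constant C covers the linearly inseparable case, and the toy data, C, and learning rate are assumptions:

```python
import numpy as np

def train_linear_svm(X, d, C=1.0, eta=0.01, epochs=500):
    """d must be in {-1, +1}; returns (w, b) of the separating hyperplane."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for x, t in zip(X, d):
            margin = t * (w @ x + b)
            # per-sample sub-gradient of 0.5*||w||^2 + C * max(0, 1 - margin)
            if margin < 1:
                w -= eta * (w - C * t * x)
                b += eta * C * t
            else:
                w -= eta * w
    return w, b

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
d = np.array([-1] * 20 + [1] * 20)
w, b = train_linear_svm(X, d)
print("w:", np.round(w, 2), "b:", round(b, 2),
      "training accuracy:", np.mean(np.sign(X @ w + b) == d))
```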
PART–A(14 Marks)
- a) Discuss neuron cell inhibition. [2]
- b) Write a short note on invertible and singular matrices in matrix algebra. [2]
- What is a Jacobian matrix? Give its applications in a single layer perceptron. [2]
- Write a short note on the learning rate parameter and local gradient in back propagation. [3]
- Differentiate regularization networks and Radial Basis Function networks. [3]
- What is a support vector? Give an example. [2]
PART–B(4x14 = 56 Marks)
- a) Explain various functional aspects of the artificial neuron model with respect to bias, weighted inputs, and activation functions. [7]
- b) With a neat sketch, differentiate multilayer feed forward networks and recurrent neural networks. [7]
- a) What is the role of vector algebra in multivariate analysis? Explain various operations that can be performed in vector algebra. [7]
- b) Differentiate the working principles of supervised and unsupervised learning with an example learning algorithm for each type of learning. [7]
- Write about the following with respect to the Perceptron algorithm: [14]
- Training Sample with input signal vector x(n) and Desired response d(n)
- Signal Flow graph representations
- Convergence Considerations
- Virtues and limitations
- a) What is the use of Back Propagation networks? Explain the training steps for back propagation networks. [7]
- b) Discuss various steps involved in solving function approximation with back propagation networks. [7]
- a) Write about the usage of Radial Basis Function networks to perform a complex pattern classification task. [7]
- b) What is the universal approximation theorem? Explain the approximation properties of Radial Basis Function networks. [7]
- a) Illustrate the idea of an optimal hyperplane for linearly separable patterns. [7]
- b) What are inner product kernels? Explain inner product kernels for various types of Support Vector Machines. [7]
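To make the idea of an inner-product kernel concrete, a small sketch comparing a degree-2 polynomial kernel against the explicit feature-space inner product, plus a Gaussian (RBF) kernel; the example vectors are assumed:

```python
import numpy as np

def poly_kernel(x, y, p=2):
    return (1.0 + x @ y) ** p               # K(x, y) = (1 + x^T y)^p

def phi2(x):
    """Explicit feature map whose inner product equals the degree-2 kernel."""
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2) * x1, np.sqrt(2) * x2,
                     x1 ** 2, x2 ** 2,
                     np.sqrt(2) * x1 * x2])

def rbf_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

x, y = np.array([1.0, 2.0]), np.array([0.5, -1.0])
print(poly_kernel(x, y), phi2(x) @ phi2(y))  # the two values agree
print(rbf_kernel(x, y))
```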
PART–A(14 Marks)
- a) Discuss the role of the activation function in an artificial neuron. [2]
- b) How to find multiplication by an inverse in vector algebra? Give an example. [3]
- What is learning rate annealing in perceptron? [2]
- Explain forward propagation of function signals. [2]
- Write the role of three layers involved in Radial Basis Function networks. [3]
- What is dual problem? [2]
PART–B(4x14 = 56 Marks)
- a) "The artificial neuron resembles the functionality of the biological neuron." Justify this statement in all functional aspects. [7]
- b) Explain the concept of a single layer of 'S' number of neurons and the multi-layer neuron model. [7]
- a) Discuss the concept of optimization with a suitable example related to artificial neural networks. [7]
- b) What is unsupervised learning? Explain competitive and Hebbian learning algorithms. [7]
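A brief unsupervised-learning sketch contrasting a Hebbian update (with Oja's normalisation to keep the weights bounded) and a winner-take-all competitive update; the toy data, learning rate, and number of prototypes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])  # toy inputs
eta = 0.01

# Hebbian learning (Oja's rule): the weight vector tends toward the
# first principal direction of the input distribution.
w = rng.normal(size=2)
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)        # Hebb term eta*y*x plus decay eta*y^2*w
print("Hebbian weight direction:", np.round(w / np.linalg.norm(w), 2))

# Competitive (winner-take-all) learning: only the closest prototype moves.
protos = rng.normal(size=(3, 2))
for x in X:
    k = np.argmin(np.linalg.norm(protos - x, axis=1))   # winning neuron
    protos[k] += eta * (x - protos[k])
print("prototypes:\n", np.round(protos, 2))
```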
- a) Write about the two-class pattern classification problem. How can it be solved by a perceptron? Explain. [7]
- b) Explain how synaptic weights are adapted iteration by iteration using the error-correction rule in the perceptron convergence algorithm. [7]
- a) What are multi-layer feed forward networks? What is the importance of the hidden and output layers in them? [7]
- b) Write and explain the derivation of the back propagation training algorithm. Explain the role of the learning rate coefficient in its convergence. [7]
- a) What is the interpolation problem? Explain how it is solved with Radial Basis Function networks. [7]
- b) Explain weighted norm and receptive fields of generalized radial basis function networks. [7]
- a) Derive and explain various constraints involved in quadratic optimization for finding the optimal hyperplanes. [7]
- b) Design a Support Vector Machine for a classification problem. Explain the various mathematical functions used behind it. [7]