Gradient Descent Multiple Choice Question In Machine Learning
Hello dear students, in this post we will discuss multiple choice questions on Gradient Descent. Gradient Descent is an algorithm used in Machine Learning that minimizes some function by iteratively moving in the direction of steepest descent, as defined by the negative of the gradient.
First of all, let us discuss some definitions related to the Gradient Descent topic.
What is meant by Gradient Descent?
Gradient Descent is a Machine Learning algorithm that minimizes some function by iteratively moving in the direction of steepest descent, as defined by the negative of the gradient.
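The definition above can be sketched in a few lines of Python. This is a minimal, illustrative example (the function f(x) = (x - 3)^2 and all names are chosen just for this sketch): at every step we move against the gradient, scaled by a learning rate.

```python
# Minimal sketch of gradient descent on f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3). Illustrative names only.

def gradient(x):
    """Gradient of f(x) = (x - 3)^2."""
    return 2 * (x - 3)

def gradient_descent(start, learning_rate=0.1, iterations=100):
    x = start
    for _ in range(iterations):
        # Move in the direction of steepest descent: the negative gradient.
        x = x - learning_rate * gradient(x)
    return x

minimum = gradient_descent(start=0.0)
print(round(minimum, 4))  # converges close to x = 3, the minimizer of f
```

The learning rate controls the step size: too large and the iterates can overshoot and diverge, too small and convergence is slow.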
71. Gradient Descent is an optimization algorithm used for:
ANSWER= B) minimizing the cost function in various machine learning algorithms
Explain:- Gradient Descent is an optimization algorithm used for minimizing the cost function in various machine learning algorithms.
72. _____ processes all the training examples for each iteration of gradient descent.
ANSWER= B) Batch Gradient Descent
Explain:- Batch Gradient Descent processes all the training examples for each iteration of gradient descent.
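A hypothetical sketch of batch gradient descent on a simple linear regression problem: note that every iteration computes the gradient over all training examples. The data, learning rate, and iteration count below are illustrative assumptions, not a prescribed recipe.

```python
import numpy as np

# Batch gradient descent for y = w*x + b: each update uses the FULL dataset.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=100)
y = 2.0 * X + 1.0  # synthetic data with true w = 2, b = 1

w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    error = (w * X + b) - y
    # Gradients of mean squared error averaged over the entire batch.
    grad_w = 2 * np.mean(error * X)
    grad_b = 2 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # w and b approach 2 and 1
```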
73. How many types of Gradient Descent are there?
ANSWER= B) 3
Explain:- There are 3 types of Gradient Descent: Batch Gradient Descent, Stochastic Gradient Descent, and Mini-Batch Gradient Descent.
75. _____ is a type of gradient descent which processes 1 training example per iteration.
ANSWER= B) Stochastic Gradient Descent
Explain:- Stochastic Gradient Descent is a type of gradient descent that processes one training example per iteration.
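To contrast with the batch variant, here is a hypothetical sketch of stochastic gradient descent on the same kind of linear regression problem: each update uses a single randomly chosen example. All data and hyperparameters below are illustrative assumptions.

```python
import numpy as np

# Stochastic gradient descent: ONE randomly picked example per update.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=200)
y = 2.0 * X + 1.0  # synthetic data with true w = 2, b = 1

w, b = 0.0, 0.0
lr = 0.1
for _ in range(5000):
    i = rng.integers(len(X))           # pick one training example at random
    error = (w * X[i] + b) - y[i]
    w -= lr * 2 * error * X[i]         # gradient from this single example
    b -= lr * 2 * error

print(round(w, 1), round(b, 1))
```

Because only one example is touched per update, each iteration is very cheap, but the path toward the minimum is noisy rather than straight.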
76. Which is the fastest gradient descent?
ANSWER= C) Mini Batch gradient descent
Explain:- Mini-Batch Gradient Descent is faster than both batch gradient descent and stochastic gradient descent.
77. Which is considerably faster than batch gradient descent?
ANSWER= B) Stochastic Gradient Descent
Explain:- In Stochastic Gradient Descent, the parameters are updated after every iteration, in which only a single example has been processed. Hence it is considerably faster than batch gradient descent.
78. Which gradient descent works for larger training sets, and with a smaller number of iterations?
ANSWER= C) Mini Batch gradient descent
Explain:- Mini-Batch Gradient Descent works well for larger training sets and converges in fewer iterations than stochastic gradient descent.
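The mini-batch variant can be sketched as follows: each iteration samples a small batch of examples (16 here, an illustrative choice) and averages the gradient over it, sitting between the batch and stochastic extremes. The dataset and hyperparameters are assumptions for the sake of the example.

```python
import numpy as np

# Mini-batch gradient descent: each update averages over a SMALL batch.
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=1000)
y = 2.0 * X + 1.0  # synthetic data with true w = 2, b = 1

w, b = 0.0, 0.0
lr, batch_size = 0.2, 16
for _ in range(1000):
    idx = rng.choice(len(X), size=batch_size, replace=False)
    error = (w * X[idx] + b) - y[idx]
    w -= lr * 2 * np.mean(error * X[idx])  # gradient over the mini-batch
    b -= lr * 2 * np.mean(error)

print(round(w, 2), round(b, 2))  # w and b approach 2 and 1
```

Averaging over a batch reduces the noise of single-example updates while keeping each iteration much cheaper than a full pass over the data.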
79. In ______, the algorithm follows a straight path towards the minimum.
ANSWER= A) Batch Gradient Descent
Explain:- In Batch Gradient Descent, the gradient is computed over the entire training set, so the algorithm follows a smooth, direct path towards the minimum rather than the noisy path of stochastic updates.
80. If the cost function is convex, then it converges to a _____
ANSWER= B) global minimum
Explain:- If the cost function is convex, gradient descent converges to a global minimum.
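This convexity property can be demonstrated with a short sketch: for the convex function f(x) = x^2 + 4x (an illustrative choice, with global minimum at x = -2), gradient descent reaches the same minimum from very different starting points.

```python
# For a convex cost f(x) = x**2 + 4*x (gradient 2*x + 4, minimum at x = -2),
# gradient descent converges to the single global minimum from any start.

def descend(start, lr=0.1, steps=200):
    x = start
    for _ in range(steps):
        x -= lr * (2 * x + 4)  # step against the gradient of x^2 + 4x
    return x

print(round(descend(10.0), 4), round(descend(-10.0), 4))  # both near -2
```

With a non-convex cost, by contrast, different starting points can land in different local minima.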