Title of test:
backpropagation - the basics

Description:
the basics of backpropagation

Author:
pooh bear
Other tests from this author

Creation Date: 30/08/2024

Category: Computers

Number of questions: 10
Content:
Backpropagation propagates the error from the output layer to the intermediate layers of a neural network so that they can modify their weights in order to minimize the average error. true false.
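For intuition, here is a minimal NumPy sketch of that idea for a tiny one-hidden-layer network; the layer sizes, sigmoid activations, squared-error loss, and learning rate are illustrative assumptions, not part of the question:

import numpy as np

# Minimal sketch of backpropagation for a tiny network (sizes and data are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                 # 4 samples, 3 features
y = np.array([[0.0], [1.0], [1.0], [0.0]])  # target outputs

W1 = rng.normal(scale=0.5, size=(3, 5))     # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(5, 1))     # hidden -> output weights
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(1000):
    # Forward pass
    h = sigmoid(X @ W1)        # intermediate (hidden) layer activations
    out = sigmoid(h @ W2)      # output layer

    # Backward pass: the error signal at the output layer...
    d_out = (out - y) * out * (1 - out)
    # ...is propagated back to the intermediate layer through W2
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Each layer adjusts its own weights to reduce the average error
    W2 -= lr * (h.T @ d_out) / len(X)
    W1 -= lr * (X.T @ d_h) / len(X)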
One of the main differences between backpropagation and SGD (stochastic gradient descent) is the way the weights are updated, since SGD uses the gradient calculated for all training data, while backpropagation uses the gradient calculated only for a mini-batch of training data. true false.
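The contrast the statement draws is about how much training data a single weight update uses. A small sketch of both update styles for a linear model, with assumed toy data and a mean-squared-error loss:

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                                   # toy training data (assumed)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

def grad(Xb, yb, w):
    # Gradient of the mean squared error of a linear model on the batch (Xb, yb)
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)

w = np.zeros(3)
lr = 0.05

# Full-batch gradient descent: one update uses ALL training examples
w_batch = w - lr * grad(X, y, w)

# Stochastic / mini-batch gradient descent: one update uses a small random subset
idx = rng.choice(len(X), size=10, replace=False)
w_minibatch = w - lr * grad(X[idx], y[idx], w)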
In which artificial neural network architecture is the backpropagation algorithm used for training? (A) Kohonen. (B) Hopfield. (C) Perceptron. (D) Multilayer perceptron (MLP). (E) Radial Basis Function (RBF).
Considering Data Mining methods, analyze the following description: “builds the so-called linear classifiers, which separate the data set by means of a hyperplane, and are considered one of the most effective for the classification task.” These are: A- Wang-Mendel. B- Backpropagation. C- SVM (Support Vector Machines). D- Naive Bayesian Classifier.
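As a quick illustration of a linear classifier that separates data with a hyperplane, here is a sketch using scikit-learn's LinearSVC on assumed toy data (the library choice and the data are not part of the question):

import numpy as np
from sklearn.svm import LinearSVC

# Two linearly separable clusters (purely illustrative)
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(loc=-2.0, size=(20, 2)),
               rng.normal(loc=2.0, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

clf = LinearSVC()   # learns a separating hyperplane w . x + b = 0
clf.fit(X, y)

print("w =", clf.coef_, "b =", clf.intercept_)
print("predictions:", clf.predict(X[:5]))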
Artificial neural networks (ANNs) are computational techniques that, based on a mathematical model inspired by the neural structure of intelligent beings, acquire knowledge through experience. Regarding ANNs, select the correct option. A- In the ANN learning process, the reinforcement learning paradigm can be used. B- The backpropagation algorithm is used in ANNs in the process of reducing the space of output variables. C- The model proposed by McCulloch and Pitts in the first half of the 20th century does not use an activation function. D- ANNs do not have an output layer. E- Minsky and Papert mathematically analyzed the perceptron and demonstrated that single-layer networks are capable of solving problems that are not linearly separable.
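For context on the McCulloch-Pitts model mentioned in option C, here is a sketch of a threshold unit of that style; the weights and threshold realising an AND gate are assumed for illustration:

import numpy as np

def threshold_unit(inputs, weights, threshold):
    # McCulloch-Pitts-style neuron: weighted sum followed by a step
    # (threshold) activation that outputs 0 or 1
    return int(np.dot(inputs, weights) >= threshold)

# Illustrative example: an AND gate realised with a threshold unit
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", threshold_unit([a, b], weights=[1, 1], threshold=2))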
A group of programmers working with data mining for a large company built an algorithm based on one of the existing mining algorithm models. In this case, the algorithm built is one that performs multiple passes over the transaction database, is able to work with a large number of attributes, and obtains, as a result, several combinatorial alternatives between them, using successive searches over the entire database while achieving satisfactory execution performance. The data mining algorithm model described in this situation is based on: A- Apriori. B- Bayesian. C- Backpropagation. D- C4.5. E- CN2.
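A simplified sketch of the multiple-pass, frequent-itemset idea behind Apriori (the transaction data and support threshold are assumed, and the candidate-pruning step of the full algorithm is omitted):

# Toy transaction database and support threshold (illustrative assumptions)
transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
min_support = 3  # minimum number of transactions containing an itemset

def support(itemset):
    # One pass over the whole database per call
    return sum(1 for t in transactions if itemset <= t)

# Pass 1: frequent single items
items = {i for t in transactions for i in t}
frequent = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
all_frequent = list(frequent)
k = 1

# Successive passes: combine frequent k-itemsets into candidate (k+1)-itemsets
while frequent:
    candidates = {a | b for a in frequent for b in frequent if len(a | b) == k + 1}
    frequent = [c for c in candidates if support(c) >= min_support]
    all_frequent.extend(frequent)
    k += 1

print(all_frequent)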
Considering the study in Artificial Intelligence, select the alternative that correctly presents the classification algorithms in supervised learning. A) Naive Bayes, Artificial Neural Networks and K-means. B) Decision Trees, Simulated Annealing and Backpropagation. C) k-means, Naive Bayes and Genetic Algorithms. D) Decision Tree, Artificial Neural Networks and KNN. E) Logistic Regression, K-means and Fuzzy Logic.
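One of the supervised classifiers listed, KNN, can be sketched in a few lines; the toy data and the value of k below are assumptions for illustration:

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Classify x by majority vote among its k nearest training points (Euclidean distance)
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy labelled data (illustrative)
X_train = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.5, 0.5])))  # expected class 0
print(knn_predict(X_train, y_train, np.array([5.5, 5.5])))  # expected class 1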
LOBs, or Large Objects, can be used to store binary data, character data, and file references, and can hold up to 128 terabytes of data, depending on the database configuration. There are four types of LOBs; one of them is used to store a pointer to a file. Select it. BLOB. CLOB. BFILE. NCLOB.
A linearly separable set is composed of examples that can be separated by at least one hyperplane. Linear SVMs seek the optimal hyperplane according to statistical learning theory, defined as the one in which the separation margin between the classes present in the data is minimized. true false.
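For reference, the standard hard-margin formulation from statistical learning theory chooses the hyperplane w·x + b = 0 by solving

minimize (1/2)·‖w‖²  subject to  y_i (w·x_i + b) ≥ 1 for all training points i,

where the separation margin between the classes equals 2/‖w‖, so minimizing ‖w‖ corresponds to maximizing that margin.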