Random Forests use bagging. A random forest is a collection of decision trees, each trained on data sampled from the original dataset, with the final prediction obtained by majority vote (classification) or by averaging (regression) across all trees.
The ensemble technique is known as Bagging (Bootstrap Aggregating). Bagging trains multiple independent models on different subsets of the training data, then aggregates their predictions into a final prediction. In a Random Forest, the base models are decision trees: each tree is trained on a bootstrap sample (drawn with replacement from the training data) and, additionally, considers only a random subset of features at each split. These two sources of randomness decorrelate the trees, which reduces overfitting and improves the overall performance of the ensemble.
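The mechanics can be sketched in plain Python. This is a toy illustration of bagging, not a full Random Forest: the base learner is a hypothetical depth-1 "stump" that picks a single random feature, standing in for a real decision tree with per-split feature subsampling. All names (`Stump`, `bootstrap_sample`, `bagged_predict`) are invented for this sketch.

```python
import random
from collections import Counter

def bootstrap_sample(X, y, rng):
    """Draw len(X) examples with replacement: a bootstrap sample."""
    idx = [rng.randrange(len(X)) for _ in range(len(X))]
    return [X[i] for i in idx], [y[i] for i in idx]

class Stump:
    """A depth-1 'decision tree': one feature, one learned threshold."""
    def fit(self, X, y, rng):
        # Random-feature step: each stump considers only one randomly chosen feature,
        # mimicking (crudely) the feature subsampling in a real Random Forest.
        self.feature = rng.randrange(len(X[0]))
        majority = lambda labels: Counter(labels).most_common(1)[0][0]
        best_err = None
        for thr in sorted({x[self.feature] for x in X}):
            left = [yi for xi, yi in zip(X, y) if xi[self.feature] <= thr]
            right = [yi for xi, yi in zip(X, y) if xi[self.feature] > thr]
            ll = majority(left) if left else majority(y)
            rl = majority(right) if right else majority(y)
            err = sum(yi != ll for yi in left) + sum(yi != rl for yi in right)
            if best_err is None or err < best_err:
                best_err = err
                self.threshold, self.left_label, self.right_label = thr, ll, rl
        return self

    def predict(self, x):
        return self.left_label if x[self.feature] <= self.threshold else self.right_label

def bagged_predict(models, x):
    """Aggregate the ensemble by majority vote."""
    return Counter(m.predict(x) for m in models).most_common(1)[0][0]

rng = random.Random(0)
# Toy data: class 0 has a small first feature, class 1 a large one.
X = [[0.0, 1.0], [0.2, 0.9], [0.1, 0.8], [0.9, 0.1], [1.0, 0.0], [0.8, 0.2]]
y = [0, 0, 0, 1, 1, 1]

# Bagging: each stump sees its own bootstrap sample of the training data.
models = [Stump().fit(*bootstrap_sample(X, y, rng), rng) for _ in range(25)]
print(bagged_predict(models, [0.05, 0.95]))  # expect class 0
print(bagged_predict(models, [0.95, 0.05]))  # expect class 1
```

Because each stump sees a different resample (and a different random feature), individual stumps disagree, but the majority vote across 25 of them is far more stable than any single one; this variance reduction is exactly what bagging buys a Random Forest.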