eBay Auctions—Boosting and Bagging. Using the eBay auction data (file eBayAuctions.csv), with Competitive as the outcome variable, partition the data into training (60%) and validation (40%) sets.
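One possible data-preparation step is sketched below. It assumes the outcome column is named Competitive and is coded 0/1 (adjust the name if the file labels it differently), dummy-codes all remaining columns as predictors, and fixes random_state purely for reproducibility.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Load the auction data (assumes eBayAuctions.csv is in the working directory)
ebay_df = pd.read_csv('eBayAuctions.csv')

# Outcome variable (assumed coded 0/1); all other columns serve as predictors
y = ebay_df['Competitive']
X = pd.get_dummies(ebay_df.drop(columns=['Competitive']), drop_first=True)

# 60% training / 40% validation partition
train_X, valid_X, train_y, valid_y = train_test_split(
    X, y, test_size=0.4, random_state=1)
```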
a. Run a classification tree, using the default settings of DecisionTreeClassifier. Looking at the validation set, what is the overall accuracy? What is the lift on the first decile? (Illustrative code sketches for parts a through d follow this list.)
b. Run a boosted tree with the same predictors (use AdaBoostClassifier with DecisionTreeClassifier as the base estimator). For the validation set, what is the overall accuracy? What is the lift on the first decile?
c. Run a bagged tree with the same predictors (use BaggingClassifier). For the validation set, what is the overall accuracy? What is the lift on the first decile?
d. Run a random forest (use RandomForestClassifier). Compare the bagged tree to the random forest in terms of validation accuracy and lift on the first decile. How are the two methods conceptually different?
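For part a, a minimal sketch using scikit-learn follows, building on the partition above. The first_decile_lift helper is a hypothetical function defined here (not a library routine); it ranks validation records by the predicted probability of the positive class, assumed to be coded 1, and compares the response rate in the top 10% with the overall rate.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

def first_decile_lift(actual, prob):
    """Lift in the top 10% of records when ranked by predicted probability."""
    order = np.argsort(prob)[::-1]            # highest predicted probability first
    n_top = int(np.ceil(0.1 * len(actual)))   # number of records in the first decile
    top_rate = np.asarray(actual)[order][:n_top].mean()
    return top_rate / np.mean(actual)

# (a) single classification tree with default settings
tree = DecisionTreeClassifier()
tree.fit(train_X, train_y)

# Column 1 of predict_proba is P(class 1), assuming Competitive is coded 0/1
tree_prob = tree.predict_proba(valid_X)[:, 1]
print('Accuracy:', accuracy_score(valid_y, tree.predict(valid_X)))
print('First-decile lift:', first_decile_lift(valid_y, tree_prob))
```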
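For part b, a sketch under the same assumptions, reusing train_X, valid_X, and the first_decile_lift helper from the earlier sketches. Passing an unconstrained tree as the base learner follows the wording of the exercise, though AdaBoost's own default is a depth-1 tree.

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# (b) boosted tree: AdaBoost with a classification tree as the base learner
# (use base_estimator= instead of estimator= on scikit-learn versions before 1.2)
boost = AdaBoostClassifier(estimator=DecisionTreeClassifier(), random_state=1)
boost.fit(train_X, train_y)

boost_prob = boost.predict_proba(valid_X)[:, 1]
print('Accuracy:', accuracy_score(valid_y, boost.predict(valid_X)))
print('First-decile lift:', first_decile_lift(valid_y, boost_prob))
```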
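For part c, a sketch of the bagged tree. BaggingClassifier's default base learner is already a decision tree, so no base estimator needs to be passed.

```python
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import accuracy_score

# (c) bagged trees: the default base learner is a decision tree
bag = BaggingClassifier(random_state=1)
bag.fit(train_X, train_y)

bag_prob = bag.predict_proba(valid_X)[:, 1]
print('Accuracy:', accuracy_score(valid_y, bag.predict(valid_X)))
print('First-decile lift:', first_decile_lift(valid_y, bag_prob))
```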
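For part d, a random forest sketch under the same assumptions.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# (d) random forest with default settings
rf = RandomForestClassifier(random_state=1)
rf.fit(train_X, train_y)

rf_prob = rf.predict_proba(valid_X)[:, 1]
print('Accuracy:', accuracy_score(valid_y, rf.predict(valid_X)))
print('First-decile lift:', first_decile_lift(valid_y, rf_prob))
```

As background for the comparison asked in part d: bagging grows each tree on a bootstrap sample of the records using all predictors at every split, whereas a random forest additionally restricts each split to a random subset of predictors, which de-correlates the individual trees.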