Friday, October 7, 2022 3pm to 4pm
Ensemble decision tree methods such as Random Forest, XGBoost, and Bayesian additive regression trees have gained great popularity in statistical and machine learning prediction and classification tasks. Most existing ensemble decision tree models rely on weak learners with axis-parallel univariate split rules that partition the Euclidean feature space into rectangular regions. In practice, however, many regression problems involve features with multivariate structure (e.g., spatial locations), possibly lying on a manifold, where rectangular partitions may fail to respect the irregular intrinsic geometry and boundary constraints of the structured feature space. In this talk, we propose a new class of Bayesian additive multivariate decision tree models in which each weak learner combines univariate split rules, for handling possibly high-dimensional features without known multivariate structure, with novel multivariate split rules for features that do have such structure. The proposed multivariate split rules are built upon stochastic predictive spanning tree bipartition models on reference knots, which can achieve highly flexible nonlinear decision boundaries on manifold feature spaces while enabling efficient dimension-reduction computations. We demonstrate the superior performance of the proposed method on simulated data and a Sacramento housing price data set.
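To make the split mechanism concrete, the following is a minimal illustrative sketch (not the speakers' implementation) of one spanning-tree bipartition split: a minimum spanning tree is built over a set of reference knots, one tree edge is removed to bipartition the knots into two blocks, and a feature value is routed left or right according to the block of its nearest knot. The function and variable names (`mst_edges`, `spanning_tree_split`, `assign`, `knots`) are hypothetical, and the fixed choice of removed edge stands in for the stochastic step described in the abstract.

```python
import math

def mst_edges(points):
    """Prim's algorithm: minimum spanning tree over Euclidean distances."""
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = min(
            ((i, j) for i in in_tree for j in range(n) if j not in in_tree),
            key=lambda e: math.dist(points[e[0]], points[e[1]]),
        )
        edges.append(best)
        in_tree.add(best[1])
    return edges

def spanning_tree_split(points, removed_edge_idx):
    """Delete one MST edge; the two resulting components define the split."""
    kept = [e for k, e in enumerate(mst_edges(points)) if k != removed_edge_idx]
    adj = {i: set() for i in range(len(points))}
    for i, j in kept:
        adj[i].add(j)
        adj[j].add(i)
    # Flood-fill from knot 0 to recover its connected component.
    block, stack = {0}, [0]
    while stack:
        for nb in adj[stack.pop()]:
            if nb not in block:
                block.add(nb)
                stack.append(nb)
    return block  # knots in the left block; remaining knots form the right

def assign(x, points, left_block):
    """Route a feature value left iff its nearest knot lies in the left block."""
    nearest = min(range(len(points)), key=lambda i: math.dist(x, points[i]))
    return nearest in left_block
```

Because the partition is defined through knot components of a graph rather than coordinate thresholds, the induced decision boundary can follow an irregular region, which is the geometric flexibility the multivariate split rules are designed to provide.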
Green Building, UT Dallas, GR 3.420
Undergraduate Students, Faculty & Staff, Alumni, General Public, Prospective Students, Graduate Students, International Students
UTD strives to create inclusive and accessible events in accordance with the Americans with Disabilities Act (ADA). If you require an accommodation to fully participate in this event, please contact the event coordinator (listed above) at least 10 business days prior to the event. If you have any additional questions, please email the ADA Coordinator at ADACoordinator@utdallas.edu or the AccessAbility Resource Center at accessability@utdallas.edu.