MoCaE: Mixture of Calibrated Experts Significantly Improves Object Detection

CoRR (2023)

Abstract
Combining the strengths of many existing predictors to obtain a Mixture of Experts that is superior to its individual components is an effective way to improve performance without having to develop new architectures or train a model from scratch. Surprisingly, however, we find that naïvely combining expert object detectors, in a manner similar to Deep Ensembles, can often degrade performance. We identify the primary cause of this issue: the confidence of the experts' predictions does not match their actual performance, a mismatch referred to as miscalibration. Consequently, the most confident detector dominates the final predictions, preventing the mixture from leveraging all of the experts' predictions appropriately. To address this, when constructing the Mixture of Experts, we propose to combine their predictions in a manner that reflects the individual performance of the experts, an objective we achieve by first calibrating the predictions and then filtering and refining them. We term this approach the Mixture of Calibrated Experts (MoCaE) and demonstrate its effectiveness through extensive experiments on 5 different detection tasks using a variety of detectors, showing that it: (i) improves object detectors on COCO and instance segmentation methods on LVIS by up to ∼2.5 AP; (ii) reaches state-of-the-art on COCO test-dev with 65.1 AP and on DOTA with 82.62 AP_50; (iii) consistently outperforms single models on recent detection tasks such as Open Vocabulary Object Detection.
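The abstract describes a three-step recipe: calibrate each expert's confidence scores so they reflect that expert's actual performance, pool the calibrated detections, then filter and refine the pooled set. The sketch below illustrates this pipeline under stated assumptions; it is not the authors' implementation. The per-expert calibrator functions, the score threshold, the IoU threshold, and the use of plain class-wise NMS as the refinement step are all illustrative choices, not values or components taken from the paper.

```python
# Minimal sketch of a Mixture of Calibrated Experts pipeline (illustrative,
# not the paper's implementation): calibrate per-expert scores, pool the
# detections, then filter and refine with class-wise NMS.

from typing import Callable, List, Tuple

Box = Tuple[float, float, float, float]   # (x1, y1, x2, y2)
Detection = Tuple[Box, float, int]        # (box, confidence score, class id)


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0


def class_wise_nms(dets: List[Detection], iou_thr: float = 0.5) -> List[Detection]:
    """Greedy non-maximum suppression on calibrated scores, per class."""
    kept: List[Detection] = []
    for det in sorted(dets, key=lambda d: d[1], reverse=True):
        if all(det[2] != k[2] or iou(det[0], k[0]) < iou_thr for k in kept):
            kept.append(det)
    return kept


def mixture_of_calibrated_experts(
    expert_outputs: List[List[Detection]],
    calibrators: List[Callable[[float], float]],
    score_thr: float = 0.05,   # assumed filtering threshold, not from the paper
    iou_thr: float = 0.5,      # assumed NMS threshold, not from the paper
) -> List[Detection]:
    """Calibrate each expert's scores, pool the detections, filter, refine."""
    pooled: List[Detection] = []
    for dets, calibrate in zip(expert_outputs, calibrators):
        for box, score, cls in dets:
            cal_score = calibrate(score)   # per-expert score calibration
            if cal_score >= score_thr:     # drop low-confidence detections
                pooled.append((box, cal_score, cls))
    return class_wise_nms(pooled, iou_thr)
```

In practice the calibrators would be fitted on held-out validation data (e.g., a monotone mapping such as isotonic regression from raw confidence to an accuracy-like quantity), and the refinement step could be any standard suppression or voting scheme; plain class-wise NMS is used here only to keep the sketch self-contained.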
Key words
calibrated experts, detection, object