A quick question: how can we ensure there is no data leakage when using a pretrained model trained on a series of large datasets, including the 80-class COCO dataset and others? Additionally, U-Recall may be misleading, since all objects in the M-OWODB and S-OWODB experiments appear to be known classes to pretrained models such as CLIP.
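For context on why this matters: U-Recall is typically computed as recall over the ground-truth unknown-class instances, so if the pretrained backbone has effectively "seen" those classes, a high score may overstate true open-world performance. A minimal sketch of the metric (the 0.5 IoU threshold, greedy matching, and `(x1, y1, x2, y2)` box format are assumptions for illustration, not the paper's exact protocol):

```python
def iou(a, b):
    # Boxes as (x1, y1, x2, y2); returns intersection-over-union.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def u_recall(pred_unknown_boxes, gt_unknown_boxes, iou_thr=0.5):
    """Fraction of ground-truth unknown objects matched (greedily,
    one-to-one) by some predicted unknown-class box at IoU >= iou_thr."""
    matched, used = 0, set()
    for gt in gt_unknown_boxes:
        for i, p in enumerate(pred_unknown_boxes):
            if i not in used and iou(p, gt) >= iou_thr:
                matched += 1
                used.add(i)
                break
    return matched / len(gt_unknown_boxes) if gt_unknown_boxes else 0.0
```

The metric only measures whether unknowns are localized at all; it says nothing about where the model's notion of "unknown" came from, which is the leakage concern above.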
Hi @hansensamjohn, thank you for your interest. During pretraining we ensured that all COCO data was removed, but as you rightly pointed out, we cannot guarantee that similar data does not appear in other pretraining datasets in some form. This complicates the definition of "unknown" classes and is a common challenge when using pretrained models. We conducted additional tests on a recent real-world benchmark and on web images, and our observations indicate that our method still demonstrates strong generalization. While we recognize the concerns about these evaluation metrics and datasets, we ultimately chose the widely adopted Open-World benchmarks, supplemented with the nuScenes dataset for autonomous-driving scenarios, in our paper. Selecting more appropriate benchmarks is an important direction for our future exploration and improvement. Please feel free to share any additional thoughts or suggestions!
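To make the "COCO data was removed" step concrete, one common way to guard against direct overlap is to drop any pretraining image whose bytes match an evaluation image. This is only a hypothetical sketch of such a filter, not the authors' actual pipeline; note that an exact-hash check misses near-duplicates (resized or re-encoded copies), which would require perceptual hashing on top:

```python
import hashlib

def digest(img_bytes: bytes) -> str:
    """SHA-256 digest of an image's raw bytes."""
    return hashlib.sha256(img_bytes).hexdigest()

def filter_exact_overlap(pretrain_imgs, eval_imgs):
    """Drop pretraining images whose exact bytes appear in the eval set.
    Catches only byte-identical copies; semantic overlap (the same object
    categories photographed elsewhere) cannot be removed this way, which
    is exactly why 'unknown' is hard to define for pretrained models."""
    eval_digests = {digest(b) for b in eval_imgs}
    return [b for b in pretrain_imgs if digest(b) not in eval_digests]
```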