We propose a solution for Active Visual Search of objects in an environment, whose 2D floor map is the only known information. Our solution has three key features that make it more plausible and robust to detector failures compared to state-of-the-art methods: (i) it is unsupervised, as it does not need any training session; (ii) during exploration, a probability distribution over the 2D floor map is updated according to an intuitive mechanism, while an improved belief update increases the effectiveness of the agent's exploration; (iii) we incorporate the awareness that an object detector may fail into this probability modelling by exploiting the success statistics of the specific detector. Our solution is dubbed POMP-BE-PD (POMCP-based Online Motion Planning with Belief by Exploration and Probabilistic Detection). It uses the current pose of an agent and an RGB-D observation to learn an optimal search policy, exploiting a POMDP solved by a Monte-Carlo planning approach. On the Active Vision Database benchmark, we increase the average success rate over all the environments by a significant 35% while decreasing the average path length by 4% with respect to competing methods. Thus, our results are state-of-the-art, even without using any training procedure.
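To illustrate the probabilistic-detection idea in point (iii), the sketch below shows one possible Bayesian update of a per-cell target belief over a 2D floor map after each glimpse, where the detector's reliability enters through assumed true-positive and false-positive rates. All names, the grid layout, and the rate values (`tp_rate`, `fp_rate`, `fov_mask`) are hypothetical placeholders, not the exact quantities used in the paper.

```python
import numpy as np

def update_belief(belief, fov_mask, detected, tp_rate=0.8, fp_rate=0.05):
    """Bayesian update of a per-cell target belief on a 2D floor map.

    belief   : 2D array, prior probability that the target occupies each cell.
    fov_mask : boolean 2D array, cells observed by the RGB-D camera from the current pose.
    detected : whether the object detector reported the target in this view.
    tp_rate  : assumed detector true-positive rate (hypothetical value).
    fp_rate  : assumed detector false-positive rate (hypothetical value).
    """
    # Likelihood of the detector outcome for each cell, assuming the target is in that cell.
    if detected:
        likelihood = np.where(fov_mask, tp_rate, fp_rate)
    else:
        likelihood = np.where(fov_mask, 1.0 - tp_rate, 1.0 - fp_rate)

    posterior = likelihood * belief
    return posterior / posterior.sum()  # renormalise over the whole map


# Minimal usage example on a toy 4x4 map with a uniform prior.
belief = np.full((4, 4), 1.0 / 16)
fov = np.zeros((4, 4), dtype=bool)
fov[1:3, 1:3] = True                                  # cells seen from the current pose
belief = update_belief(belief, fov, detected=False)   # the detector reported nothing
print(belief.round(3))
```

With this kind of update, a missed detection only lowers, rather than zeroes out, the belief in the observed cells, so the agent can revisit areas where an unreliable detector may have failed.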
Qualitative examples: searching for a Coca Cola bottle, Listerine, and Red Bull.
@ARTICLE{taioli_24_UAVS,
author={Taioli, Francesco and Giuliari, Francesco and Wang, Yiming and Berra, Riccardo and Castellini, Alberto and Del Bue, Alessio and Farinelli, Alessandro and Cristani, Marco and Setti, Francesco},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
title={{Unsupervised Active Visual Search with Monte Carlo Planning under Uncertain Detections}},
year={2024},
volume={},
number={},
pages={1-12},
doi={10.1109/TPAMI.2024.3451994}
}