Inferring visual biases in UAV videos from eye movements

Journal article in Drones, 2020

Abstract

Unmanned Aerial Vehicle (UAV) imagery has been gaining momentum lately. Indeed, information gathered from a bird's-eye point of view is particularly relevant to numerous applications, from agriculture to surveillance services. We study visual saliency to verify whether there are tangible differences between this imagery and more conventional content. We first describe typical and UAV content based on their human saliency maps in a high-dimensional space, encompassing saliency map statistics, distribution characteristics, and other specifically designed features. Thanks to a large amount of eye-tracking data collected on UAV videos, we highlight the differences not only between typical and UAV videos but, more importantly, within UAV sequences. We then design a process to extract new visual attention biases from UAV imagery, leading to the definition of a new dictionary of visual biases. Finally, we conduct a benchmark on two different datasets, whose results confirm that the 20 defined biases are relevant as a low-complexity saliency prediction system.
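
To make the feature-based description of saliency maps concrete, here is a minimal Python sketch of the kind of per-map statistics such a high-dimensional description could include (entropy, spatial centre of mass, spread of gaze). The function name saliency_features and the exact feature set are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code): describing a saliency map by a few
# scalar features, in the spirit of the paper's high-dimensional description.
import numpy as np

def saliency_features(smap: np.ndarray) -> dict:
    """Summarise a 2D saliency map with simple distribution statistics."""
    s = smap.astype(np.float64)
    s = s / s.sum()                      # normalise to a probability distribution
    h, w = s.shape

    # Shannon entropy: low for focused maps, high for dispersed ones.
    entropy = -np.sum(s[s > 0] * np.log2(s[s > 0]))

    # Spatial centre of mass, normalised to [0, 1] so that maps of
    # different resolutions remain comparable.
    ys, xs = np.mgrid[0:h, 0:w]
    cx = np.sum(xs * s) / (w - 1)
    cy = np.sum(ys * s) / (h - 1)

    # Spread of attention around the centre of mass: a proxy for how
    # concentrated the gaze distribution is.
    spread = np.sqrt(np.sum(((xs / (w - 1) - cx) ** 2 +
                             (ys / (h - 1) - cy) ** 2) * s))

    return {"entropy": entropy, "centre_x": cx, "centre_y": cy, "spread": spread}

# Example: a synthetic centre-biased Gaussian map.
ys, xs = np.mgrid[0:120, 0:160]
gauss = np.exp(-(((xs - 80) / 30.0) ** 2 + ((ys - 60) / 30.0) ** 2))
print(saliency_features(gauss))
```

A dictionary of bias maps characterised this way can then serve directly as a low-complexity predictor: pick or blend the bias maps closest to a sequence's feature vector instead of running a full saliency model.
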
Main file

Perrin-2020-Inferring Visual Biases in UAV Videos.pdf (4.69 MB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-03002080, version 1 (12-11-2020)

Licence

Attribution (CC BY)

Cite

Anne-Flore Perrin, Lu Zhang, Olivier Le Meur. Inferring visual biases in UAV videos from eye movements. Drones, 2020, 4 (3), pp.1-25. ⟨10.3390/drones4030031⟩. ⟨hal-03002080⟩