Audio-visual Annotation Graphs for Guiding Lens-based Scene Exploration

Moonisa Ahsan, Fabio Marton, Ruggero Pintus, Enrico Gobbetti
Computers & Graphics, 2022
Download the publication: cag2022-avagraphs.pdf [9.1 MB]
We introduce a novel approach for guiding users in the exploration of annotated 2D models using interactive visualization lenses. Information on the interesting areas of the model is encoded in an annotation graph generated at authoring time. Each graph node contains an annotation, in the form of a visual and audio markup of the area of interest, the optimal lens parameters that should be used to explore the annotated area, and a scalar representing the annotation's importance. Directed graph edges, instead, represent preferred ordering relations in the presentation of annotations: each node points to the set of nodes that should be seen before its associated annotation is presented, and a scalar associated with each edge determines the strength of this constraint. At run-time, users explore the scene with the lens, and the graph is exploited to select the annotations to be presented at a given time. The selection is based on the current view and lens parameters, the graph content and structure, and the navigation history. The best annotation under the lens is presented by playing the associated audio clip and showing the visual markup in overlay. When the user releases control, requests guidance, opts for automatic touring, or when no available annotations are under the lens, the system guides the user towards the next best annotation using glyphs, and potentially moves the lens towards it if the user remains inactive. This approach supports the seamless blending of an automatic tour of the data with interactive lens-based exploration. The approach is tested and discussed in the context of the exploration of multi-layer relightable models.
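To make the graph structure concrete, the following is a minimal sketch of the data model and selection step described above. All names (`Annotation`, `AnnotationNode`, `best_annotation`) and the exact scoring rule are illustrative assumptions, not the paper's implementation; in particular, the paper's selection also weighs view and lens parameters, which are reduced here to a precomputed set of annotations under the lens.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    audio_clip: str      # audio markup played when the annotation is presented
    visual_markup: str   # identifier of the visual overlay
    lens_params: tuple   # optimal lens placement, e.g. (x, y, radius)
    importance: float    # scalar annotation importance (authoring-time)

@dataclass
class AnnotationNode:
    annotation: Annotation
    # Directed edges: nodes that should be seen before this one,
    # mapped to the scalar strength of that ordering constraint.
    predecessors: dict = field(default_factory=dict)  # node_id -> strength

def score(node_id, graph, history, under_lens):
    """Hypothetical scoring: importance minus the summed strengths of
    ordering constraints whose predecessor was not yet visited."""
    if node_id not in under_lens:
        return float("-inf")
    node = graph[node_id]
    penalty = sum(s for pred, s in node.predecessors.items()
                  if pred not in history)
    return node.annotation.importance - penalty

def best_annotation(graph, history, under_lens):
    """Pick the best unvisited annotation currently under the lens."""
    candidates = [n for n in under_lens if n not in history]
    if not candidates:
        return None  # nothing left: the system would guide the lens elsewhere
    return max(candidates,
               key=lambda n: score(n, graph, history, under_lens))
```

With this sketch, an annotation whose preferred predecessors have not yet been visited is penalized but not forbidden, which matches the abstract's description of edges as soft ordering preferences rather than hard constraints.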

BibTeX references

@article{ahsan2022audiovisual,
  author       = {Ahsan, M. and Marton, F. and Pintus, R. and Gobbetti, E.},
  title        = {Audio-visual Annotation Graphs for Guiding Lens-based Scene Exploration},
  journal      = {Computers \& Graphics},
  year         = {2022},
  note         = {To appear (currently available online)},
  keywords     = {interactive visualization lenses, annotations, user interfaces, interactive exploration, guidance, guided tour},
  doi          = {10.1016/j.cag.2022.05.003},
}
