Please use this identifier to cite or link to this item: https://dspace.crs4.it/jspui/handle/1138/38
DC Field: Value (Language)
dc.contributor.author: Ahsan, Moonisa (en_US)
dc.contributor.author: Marton, Fabio (en_US)
dc.contributor.author: Pintus, Ruggero (en_US)
dc.contributor.author: Gobbetti, Enrico (en_US)
dc.date.accessioned: 2022-05-13T12:40:17Z
dc.date.available: 2022-05-13T12:40:17Z
dc.date.issued: 2022
dc.identifier.uri: https://dspace.crs4.it/jspui/handle/1138/38
dc.description.abstract: We introduce a novel approach for guiding users in the exploration of annotated 2D models using interactive visualization lenses. Information on the interesting areas of the model is encoded in an annotation graph generated at authoring time. Each graph node contains an annotation, in the form of a visual and audio markup of the area of interest, as well as the optimal lens parameters that should be used to explore the annotated area and a scalar representing the annotation importance. Directed graph edges are used, instead, to represent preferred ordering relations in the presentation of annotations, by having each node point to the set of nodes that should be seen before presenting its associated annotation. A scalar associated to each edge determines the strength of this constraint. At run-time, users explore the scene with the lens, and the graph is exploited to select the annotations that have to be presented at a given time. The selection is based on the current view and lens parameters, the graph content and structure, and the navigation history. The best annotation under the lens is presented by playing the associated audio clip and showing the visual markup in overlay. When the user releases control, requests guidance, opts for automatic touring, or when no available annotations are under the lens, the system guides the user towards the next best annotation using glyphs, and potentially moves the lens towards it if the user remains inactive. This approach supports the seamless blending of an automatic tour of the data with interactive lens-based exploration. The approach is tested and discussed in the context of the exploration of multi-layer relightable models. (en_US) [see the sketch after this record]
dc.language.iso: en (en_US)
dc.publisher: Elsevier (en_US)
dc.relation: Advanced Visual and Geometric Computing for 3D Capture, Display, and Fabrication (en_US)
dc.relation: SVDC (en_US)
dc.relation.ispartof: Computers & Graphics (en_US)
dc.subject: Interactive visualization lenses; Annotations; User interfaces; Interactive exploration; Guidance; Guided tour (en_US)
dc.title: Audio-visual annotation graphs for guiding lens-based scene exploration (en_US)
dc.type: journal article (en_US)
dc.identifier.doi: 10.1016/j.cag.2022.05.003
dc.contributor.affiliation: CRS4 (en_US)
dc.contributor.affiliation: CRS4 (en_US)
dc.contributor.affiliation: CRS4 (en_US)
dc.contributor.affiliation: CRS4 (en_US)
dc.relation.grantno: 813170 (en_US)
dc.relation.grantno: RAS (en_US)
item.openairecristype: http://purl.org/coar/resource_type/c_6501
item.fulltext: With Fulltext
item.languageiso639-1: en
item.openairetype: journal article
item.cerifentitytype: Publications
item.grantfulltext: open
crisitem.author.orcid: 0000-0001-8611-1921
crisitem.author.orcid: 0000-0003-1786-7068
crisitem.author.orcid: 0000-0003-0831-2458
crisitem.project.funder: EC
crisitem.project.projectURL: www.evocation.eu
crisitem.project.fundingProgram: H2020
crisitem.project.openAire: info:eu-repo/grantAgreement/EC/H2020/813170
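The abstract above outlines an annotation-graph data structure (nodes with audio-visual markup, optimal lens parameters, and an importance scalar; directed edges to prerequisite annotations, each weighted by a constraint strength) and a run-time selection step. The Python sketch below is an illustrative reading of that description only, not the authors' implementation: the LensParams fields, the distance-based candidate test, the scoring heuristic, and the helper names (AnnotationGraph, candidates_under_lens, best_under_lens) are assumptions introduced here for clarity.

```python
# Illustrative sketch of the annotation graph described in the abstract.
# The scoring heuristic is a hypothetical stand-in for the paper's selection
# criterion; it is not taken from the published method.

from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple


@dataclass
class LensParams:
    x: float          # lens center (assumed 2D model coordinates)
    y: float
    radius: float     # lens extent
    zoom: float       # magnification / layer-mixing parameter (assumed)


@dataclass
class AnnotationNode:
    node_id: str
    audio_clip: str            # path to the audio markup
    visual_markup: str         # identifier of the overlay markup
    optimal_lens: LensParams   # lens parameters chosen at authoring time
    importance: float          # annotation-importance scalar
    # prerequisites: (predecessor node id, constraint strength), per the
    # directed edges that point to annotations preferably seen first
    prerequisites: List[Tuple[str, float]] = field(default_factory=list)


class AnnotationGraph:
    def __init__(self, nodes: List[AnnotationNode]):
        self.nodes: Dict[str, AnnotationNode] = {n.node_id: n for n in nodes}

    def candidates_under_lens(self, lens: LensParams) -> List[AnnotationNode]:
        """Nodes whose optimal lens position falls inside the current lens."""
        out = []
        for n in self.nodes.values():
            dx, dy = n.optimal_lens.x - lens.x, n.optimal_lens.y - lens.y
            if (dx * dx + dy * dy) ** 0.5 <= lens.radius:
                out.append(n)
        return out

    def score(self, node: AnnotationNode, lens: LensParams,
              seen: Set[str]) -> float:
        """Hypothetical score: importance, minus distance from the current
        lens, minus a penalty for each unseen prerequisite weighted by its
        edge strength (the navigation history enters through `seen`)."""
        if node.node_id in seen:
            return float("-inf")  # do not re-present an annotation
        dist = ((node.optimal_lens.x - lens.x) ** 2 +
                (node.optimal_lens.y - lens.y) ** 2) ** 0.5
        penalty = sum(w for pred, w in node.prerequisites if pred not in seen)
        return node.importance - dist - penalty

    def best_under_lens(self, lens: LensParams, seen: Set[str]):
        """Annotation to present now, or None if nothing scores positively."""
        ranked = sorted(self.candidates_under_lens(lens),
                        key=lambda n: self.score(n, lens, seen), reverse=True)
        if ranked and self.score(ranked[0], lens, seen) > 0:
            return ranked[0]
        return None
```

Under this reading, the guidance behaviour mentioned in the abstract (glyphs pointing towards the next best annotation, and automatic lens motion when the user stays inactive) would reuse the same scoring over all unseen nodes rather than only over the nodes currently under the lens; the actual criterion used by the authors may differ.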
Appears in Collections: CRS4 publications
Files in This Item:
File: cag2022-annograph.pdf
Description: Green Open Access Copy
Size: 8,95 MB
Format: Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.