Improving the user experience during the delivery of immersive content is crucial to its success for both content creators and audiences. Creators can express themselves better with multisensory stimulation, while audiences can experience a higher level of involvement. The rapid development of mulsemedia devices provides better access to stimuli such as olfaction and haptics. Nevertheless, because adding mulsemedia effects requires a manual annotation process, the amount of content available with sensorial effects is still limited. This work introduces an innovative mulsemedia-enhancement solution that uses neural networks to automatically generate olfactory and haptic content from 360° video. Two parallel neural networks are responsible for automatically adding scents to 360° videos: a scene detection network (responsible for static, global content) and an action detection network (responsible for dynamic, local content). A 360° video dataset with scent labels is also created and used to evaluate the robustness of the proposed solution. The solution achieves 69.19% olfactory accuracy and 72.26% haptics accuracy when evaluated on two different datasets.
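The two-branch design described above can be illustrated with a minimal sketch. The label names, confidence scores, and confidence-based fusion rule below are illustrative assumptions for exposition only, not the paper's actual model or fusion strategy:

```python
# Hypothetical sketch of the two-branch architecture: a scene branch
# predicts a global scent for the whole 360° frame, while an action
# branch predicts a local scent for detected dynamic content. All
# labels, scores, and the fusion rule are assumed for illustration.

def fuse_scent_predictions(scene_pred, action_pred):
    """Pick the branch with the higher confidence score.

    Each prediction is a (label, confidence) pair; ties favour the
    action branch, assuming local dynamic cues are more specific.
    """
    scene_label, scene_conf = scene_pred
    action_label, action_conf = action_pred
    return action_label if action_conf >= scene_conf else scene_label

# Toy outputs standing in for the two networks' per-frame predictions.
scene_pred = ("forest", 0.81)   # static, global scene context
action_pred = ("smoke", 0.64)   # dynamic, local action context
print(fuse_scent_predictions(scene_pred, action_pred))  # forest
```

In this sketch the scene branch wins because its confidence is higher; a real system would likely fuse the two branches' outputs with a learned rule rather than a hard maximum.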
This work was supported by the EU Horizon 2020 Research and Innovation programme under Grant Agreement No. 870610 (TRACTION project), and by the Science Foundation Ireland (SFI) Research Centres Programme under Grant Numbers SFI/12/RC/2289_P2 (Insight) and SFI/16/SP/3804 (ENABLE).
ID Code: 26879. Deposited on 01 Apr 2022 by Anderson Augusto Simiscuka. Last modified 24 Mar 2023.