Video4IMX-2025

2nd International Workshop on Video for Immersive Experiences

at the ACM Interactive Media Experiences Conference (IMX 2025), 3-6 June 2025

WORKSHOP AIM AND TOPICS




The aim of the Video4IMX workshop, sponsored by the TRANSMIXR project, is to address the increasing importance and relevance of classical (linear 2D), interactive (non-linear), 360° and volumetric video assets in the creation of immersive experiences, in connection with the use of state-of-the-art Generative AI models. Richly granular and semantically expressive descriptive metadata about video assets is necessary for their discovery, adaptation and re-use in immersive content experiences, both in automated ways (e.g. automated insertion of a journalist's video recordings inside an immersive experience for a breaking news story) and semi-automated ways (e.g. creatives searching for and re-using videos as part of a theatrical or cultural immersive experience).


The descriptive metadata needs to be extracted (adapted to the particular characteristics of interactive, 360° and volumetric video), modelled (according to shared vocabularies and knowledge models) and managed (in appropriate storage tools with expressive query support) before it can be meaningfully used to discover and organise video assets for new, innovative data-driven immersive content experiences. There should also be means to adapt, summarise or remix video content according to its usage purpose in the immersive experience, even to the extent that various modalities could be input to generative AI models to generate, for example, video from text or images, or 3D objects and scenes from video.


The workshop will solicit the latest research and development in all areas around the extraction, modelling and management of descriptive metadata for video as well as approaches to adapt or convert video according to its purpose and use in an immersive experience. It aims to support the growth of a community of researchers and practitioners interested in creating an ecosystem of tools, specifications and best practices for video discovery, adaptation, summarization or generation, particularly in the context of video (re-)use in immersive experiences.


Topics for the workshop include, but are not limited to:


  • Extraction and modelling of descriptive metadata about traditional 2D video, 360° video and volumetric video (decomposition, semantic representation, categorization, annotation, emotion/mood extraction etc.)
  • Tools and algorithms for the (semi-automatic) adaptation, summarisation or remixing of any type of video assets (traditional, interactive, 360°, volumetric), particularly for re-use in immersive content
  • Generative AI (foundation vision models, vision-language models) for visual understanding and the extraction of descriptive metadata from traditional, interactive, 360° or volumetric video
  • Generative AI for the creation of video assets from other modal inputs such as textual prompts or image sets
  • Generative AI for the transformation of or between any types of video, such as generating (possibly multimodal) video summaries, or converting an input video into immersive content (3D objects or scenes)
  • Artificial intelligence and machine learning for volumetric video content analysis, understanding and retrieval to facilitate XR content generation
  • Methods for explainable AI for visual content understanding and for immersive multimedia applications (e.g. game design)
  • Examples and use cases for usage of video (esp. 360° or volumetric) or immersive content generated from video in immersive experiences
  • Evaluations of user experience with video (esp. 360° or volumetric) or immersive content generated from video as part of an immersive experience
  • Multimedia tools and algorithms for multi-modal immersive simulations

Video4IMX-2025 continues from the successful first Video4IMX workshop at IMX 2024 and the DataTV workshops held at IMX 2019 and 2021, where a range of topics related to data-driven personalisation of television were presented, as reported in the workshop proceedings at http://datatv2019.iti.gr and http://datatv2021.iti.gr; these workshops also led to a Special Issue on Data Driven Personalisation of Television Content in the Multimedia Systems journal.


The 2nd Video4IMX Workshop will be hybrid, and you may participate physically or virtually.




CALL FOR PAPERS




Video4IMX invites two types of submission, both handled via the same dedicated Precision Conference Solutions (PCS) page. Full papers will be given an oral presentation at the workshop; short papers may be presented as either a poster or a demo:


  • ACM SIGCHI templates are to be used for the workshop papers.
  • Selected workshop papers may be invited, in extended form, for submission to a special issue of the Springer Quality and User Experience journal.

Full papers


These are to be between 7000 and 9000 words in the SIGCHI Proceedings Format, with a 150-word abstract, describing original research that has been completed or is close to completion and that covers at least one of the workshop topics. Accepted papers will be presented in the oral session.


Short papers


These are 3500-5500 words in the SIGCHI Proceedings Format, with a 150-word abstract. They should describe work in progress or demos, to be included in the poster and demo session. Submitters will be asked to provide links to the work that will be presented, outline in the short paper why it is relevant to a topic of the workshop, and identify whether the submission is for a poster or a demo to be shown at the workshop. We expect new concepts and early work in progress to be reported here.


The submission site will open soon.

IMPORTANT DATES


  • Paper submission by 24 March 2025
  • Notification of Acceptance by 21 April 2025
  • Camera ready submission by 5 May 2025

SCHEDULE

The detailed program will be published here.


CHAIRS

Lyndon Nixon, MODUL Technology GmbH, Austria
Vasileios Mezaris, CERTH-ITI, Greece
Stefanos Vrochidis, CERTH-ITI, Greece
