Morning Session: ME AND BEYOND (Cinema REX)


Paul Labelle: Immersion beyond Belief – Objective, Subjective, and Social Presence

Research on VR in cognitive science and presence studies has pointed towards a new way of understanding the underlying mechanisms of the presence experience. This research and theorizing, together with recent work in the analytical philosophy of perception, makes possible a new and fascinating way of thinking about presence – one that carries significant implications for the study of art as a whole.

This paper will present and defend a naturalist theory of presence both inside and outside VR environments. Presence is understood as a set of distinct perceptual processing operations phenomenally accessible as ‘metacognitive feelings’ (Dokic and Martin, 2017). The theory is grounded in an eliminativist ontology and a corresponding simulation theory of mind. From this position, no recourse can be made to the idealist’s ‘beliefs’ of presence-experiencing subjects, nor to the naïve realist’s ‘brute reality’ of the present-appearing objects of perception. I argue that the resulting theory of presence not only conforms to contemporary cognitive theories (e.g. predictive processing), but also provides a far more fruitful way of discussing the aesthetic implications of presence effects in art.

Paul Labelle is a doctoral researcher in media studies at the Research Training Group DFG-2291 “Gegenwart/Literatur” at the University of Bonn. He studied music at the Royal Northern College of Music, Manchester and musicology at the University of Hamburg. He is currently writing a dissertation on presence techniques in music, film, and video games. His research interests include: media and perception, sound, semiotics, and media theory.


Pia Tikka: Enactive Co-Presence in Narrative Virtual Reality

My talk describes our most recent project, “The State of Darkness” (SOD 2.0; work in progress). SOD 2.0 is an artistic dissemination of the research project “Enactive Co-presence in Narrative Virtual Reality: A Triadic Interaction Model”.

Read more of the project here: http://enactivevirtuality.tlu.ee/the-state-of-darkness-ii/

SOD 2.0 is a virtual reality installation in which human and non-human lives coexist. The first is lived by the participant, while the latter is lived by the non-human Other. The narrative VR system is enactive, that is, all elements of the narrative space are in a reciprocally dependent state with one another. The concept of non-human narrative allows The State of Darkness 2.0 to reflect the human-centric perspective against a non-human one. The intriguing question is whether narratives and the narrative faculty should be considered exclusively characteristic of humans, or whether the idea of narrative can be extended to other domains of life, or even to the domain of artificially humanlike beings.

Dr. Pia Tikka is a professional filmmaker and EU Mobilitas Research Professor at the Baltic Film, Media, and Arts School, Tallinn University, where she leads the Enactive Virtuality Lab associated with the MEDIT Centre of Excellence. She holds the honorary title of Adjunct Professor of New Narrative Media at the University of Lapland, and is a former Director of Crucible Studio, Department of Media, Aalto University (2014-2017). She has published widely on the topics of enactive media, narrative complex systems, and neurocinematics. She is a Fellow of the Society for Cognitive Studies of the Moving Image and a member of the European Film Academy.


Manischa Eichwalder and Manuel van der Veen: How am I? Voids within References to Self and Others in Artistic VR Experiences

In virtual reality, the technological perception and that of the viewers coincide – both are characterized by a common void, namely that of their own position. How consistent is this initial position when, technically speaking, all relations are calculated from a point which is itself the target of this calculation? And what are the consequences for the visitors in a virtual world when instead of their torso they see nothing, while their hands levitate, the proportions are out of sync, and the field of vision is narrowed? This ultimately leads to the fact that the visitors are necessarily thrown back on themselves.

With this observation in mind, we would like to examine the complex relational structure in VR from an initially counter-intuitive relationality to the self. Artistic VR experiences are particularly revealing in this regard, as they make this void productive as a contact zone. We, therefore, want to focus on strategies in which the viewer’s body appears as the intersection of a relational structure with objects and artificial augmentations. Against this background of self-referential correlationality, how are connections between self, world and other reconfigured? How does this affect a socio-critical reflection on the promises of VR’s immediated presence?

Manischa Eichwalder (M.A.) is interested in critical practices in contemporary art. With a focus on virtual art she is currently a research assistant at the CRC 1567 “Virtual Lifeworlds”, Ruhr University Bochum. Before completing her Master’s degree in modern art history, she worked as a curatorial assistant at the Museum Folkwang, as head of art education at Urbane Künste Ruhr and studied Philosophy and Cultural Reflection.

Dr. des. Manuel van der Veen completed his doctorate in 2022 on the topic of “Augmented Reality. Trompe-l’oeil and Relief as Technique and Theory” after studying Fine Arts at the AdBK Karlsruhe and Philosophy at the Albert-Ludwigs-Universität Freiburg from 2012-2017. Since 2022, he has been a research assistant in the art historical sub-project C03 “Virtual Art” of the CRC 1567 “Virtual Lifeworlds” at Ruhr University Bochum. His research interests also include the philosophy of space and technology as well as the theory and history of painting.



Exhibition Session: PERFORMING INTERACTION (Exhibition Space)


Marie-Laure Cazin: Freud’s Last Hypnosis – Validating Emotion-Driven Enactions in Cinematic VR

I propose to share with you the first results from the qualitative and quantitative analysis of user experiences in our 360° VR film “Freud’s Last Hypnosis”, in which the audience can experience the point of view of the patient and that of Freud within the same sequence. The hypothesis is that we find empathy-related indications of presence in the psycho-physiological data and subjective report data, both for the person observed from the chosen point of view and for the person embodied in the subjective viewpoint (be that Freud or his patient). Presence in cinematic VR is understood in terms of reported emotions, character identification, and empathy.

We explored the emotional feedback of 40 participants using psycho-physiological measures and eye-tracking data while they watched the film “Freud’s Last Hypnosis” with a head-mounted display (HMD). In the experiments, we collected behavioral eye-gaze data in cinematic VR and measured heart rhythm (electrocardiography), galvanic skin response (electrodermal activity), and subjective ratings of the participants’ emotional feelings. Participants completed an Empathy Quotient questionnaire before arriving at the experiment, and the ITC-SOPI Presence Inventory and AttrakDiff questionnaires after the experience. The recorded data are analyzed against the annotated film events and interviews conducted with the participants immediately after the experience.

Dr. Marie-Laure Cazin is an artist, filmmaker, and researcher affiliated with the Enactive Virtuality Lab, Tallinn University, Estonia. She teaches at the École Supérieure d’Art et de Design ESAD-TALM in Tours-Angers-Le Mans (France) and at L’École des Arts, Paris 1 University Panthéon-Sorbonne. She develops art-science projects, creating cinematic and VR prototypes that use physiosensors for implicit and emotional interaction between the film and the viewers. Her research concerns the inner activity of the spectator.


Oliver Sahli: The Performative Player: Scales of Embodied Agency in Virtual Reality Games through Gestures

In the context of video games, the act of playing has always had a performative aspect. The players’ performative acts consist of micro-actions enabled by input devices that act as human–computer interfaces (HCIs). These micro-actions are amplified in the game as actions that constitute agency – that is, meaningful interaction possibilities in the game (Murray 1997). Contextualized within wearable virtual reality devices, however, such actions become part of a larger embodied act, gaining a stronger performative quality than that of a mere extension of the device (Viseu and Suchman 2010).

For embodiment (Kilteni et al., 2014) and the sensation of spatial presence (Slater 2018; Nilsson et al., 2016), the coupling of action and gesture is problematic. Gestures cannot merely be suggested; they must be acted out or performed explicitly, otherwise an erosion or rupture of the sense of presence might occur (Weibel et al., 2011). At the same time, such performative acts are essential for fulfilling the totality of VR’s plausibility illusion.

I will examine several works from the Immersive Arts Space at the Zurich University of the Arts, commercial VR games, and current research projects that operate in this field of tension between the capability-enhancing quality of extensions and the embodiment-enhancing quality of gestures, focusing on how enactment through the players’ bodies offers a novel example of a “kinesthetic interface” (Sutherland 1968).

Oliver Sahli, M.A. Game Design is a researcher at the Immersive Arts Space at the Zurich University of the Arts. His research focuses on the boundaries of XR technologies. He works as an Immersive Artist and is a consultant to the Canton of Aargau for XR in culture. He co-founded a Game Design Studio.


Katharina Fuchs: Facing the glass wall: User-AI conversation in VR films

Voice interaction in VR has developed considerably over the last fifteen years, especially in video games, where the player’s voice is used to interact with the game’s menu or with characters in the game, or to trigger an action, supposedly making the game more fluid, immersive, and engaging. Narrative VR productions (VR films) are starting to discover human–AI voice interaction, too. Like the voice-over, which is widely employed in VR films, it is used to create a higher psychological, perceptual, and emotional engagement of the user by establishing a conversational situation between an author’s or protagonist’s voice-over and the speaking user. But is the voice really such a well-adapted tool for breaking the “glass wall” between the user and the virtual environment (VE)? Does it really allow conversational situations in the VR film, and improved engagement?

Reflecting on the ontological difference and recognizing the impossibility of (really) interacting with the voice-over and the virtual contents might cause a distancing from those contents, rather than increased engagement. Some VR films use this distancing in narratives about impossible interaction or the incapacity to influence the virtual environment. Taking “The Passengers” (Ziad Touma, 2021) and “Darkening VR” (Ondřej Moravec, 2021) as examples, I would like to show how these films reflect the desire for, and the impossibility of, connecting with a virtual other, the breaking of the glass wall in AI–human communication, and the spectatorial position in a VR film.

Katharina Fuchs is a PhD student in film studies at Université Vincennes-Saint-Denis (Paris 8). In her work she questions the role of sound in the narrative structure of VR films, and how it might be used to overcome a lack of both interactivity and linearity in this genre. She has also been teaching information and communication studies at Université Savoie Mont Blanc, and has collaborated with the interdisciplinary French-German journal Trajectoires and the cinema journal Théorème.



Afternoon Session: MIXED REALITIES (Exhibition Space/Kino REX)


Ludwig Zeller: OpenSoundLab – A Virtual Sound Laboratory for the Arts (Exhibition Space)

How can a virtual sound laboratory allow for new and exciting ways of sonic interaction in the context of the arts? The project, which I conceived and realized together with Hannes Barfuss, addresses this question by developing the virtual sound lab ‘OpenSoundLab’ (an open-source fork of Logan Olson’s ‘SoundStage VR’), which introduces users to the artistic and musical production of sonic media with the help of the ‘Meta Quest 2’ VR headset.

The aim is to combine the physical experience of working with spatial experimental systems, which is often perceived as positive and productive, with the advantages of digital tools, and thus to enable independent learning and experimentation. The virtual lab allows users to become familiar with the basics of creative sound generation and processing. Specially produced video tutorials, which can be viewed at any time within the virtual environment, play a central role here, making it possible to study in individual lab environments independently of time and place. Furthermore, ‘OpenSoundLab’ may serve as an open-source tool for the professional and academic community of musicians, performers, and artists alike. In our reflection, we develop the notion of ‘cooking’ sound while ‘flowing’ in a mixed environment and apply it to experimental work in a virtual sound laboratory.

Ludwig Zeller (M.A.) has been a lecturer at the Institute Digital Communication Environments IDCE, Academy of Art and Design FHNW since 2011. Next month, he will be defending his PhD project in art and media theory on the topic of “Speculative Artifacts: Aesthetics and Fictionality of Critical and (Meta-)Speculative Design” at the Academy of Media Arts in Cologne. Ludwig led the SNSF Spark project “Sonic Imagination,” which expanded on his research on meta-speculative atmospheres. The project aimed to create speculative scenarios by utilizing binaural headphone renditions of Ambisonics soundbeds in public space.


Chris Elvis Leisi: Virtual Real Rooms: Game Mechanics of Co-Presence in MR in a Full Home Environment

Recently, there has been a shift in the traditional experience of virtual reality (VR) gaming environments towards incorporating more of the exterior physical world. What kinds of new social experiences might emerge in such a mixed reality (MR) context in which we can overlay both the real world and the computationally generated? How can game mechanics and story elements support these layered worlds while using the limitations of the real world? Could new forms of immersion arise that incorporate elements of the real world (the haptic quality of objects or other forms of presence) with those of the virtual?

To address these questions, a framework was developed that enables new ways of conceiving of co-presence in MR-based game spaces by employing a multiroom setting. Through an iterative process with multiple test players (N=120) at several locations, a game entitled “Spacecraft – A New Way Home” was developed further to explore how to structure a variety of spatial game mechanics. In this game the players progressively rediscover their own apartment together, whereupon an increasingly complex, yet always individually shaped, spaceship emerges. Players move around their home, discovering more of the game in and around the apartment. Once the spaceship has been brought under control, the team is ready: unknown worlds lie ahead. The mechanics developed in this process can serve as design principles for future mixed reality games.

Chris Elvis Leisi (MA in Game Design, ZHdK, 2021) teaches Game Design at the ZHdK and conducts research at the Immersive Arts Space in the fields of co-location, co-presence, and mixed reality. Since 2015, Chris Elvis has been working with various XR glasses and experimenting with new interaction possibilities. He also teaches Game Design at the University of Applied Sciences Ravensburg-Weingarten and is co-founder of the company ArchLevel GmbH.


Chris Salter: XR Futures: Co-Presence, Co-Extensive Space and Bodily Experience

The recent and dramatic acceleration of technical research, technological promises, and corporate and public imaginaries in VR and AR (XR) has led computer scientists, Silicon Valley executives, and the media to claim that the emerging “metaverse” “will change human interaction as we know it” (Nardella 2019). At the same time, the emergence of a new set of wearable, semi-transparent AR devices (e.g., Microsoft HoloLens or Magic Leap), together with VR headsets that use a video technology called “passthrough,” is radically reconfiguring not only the experience of users’ presence “between the digitality of VR and the concrete reality of their surroundings” (Saker and Frith 2020) but also embodied understandings of co-presence (Goffman 1959): “the conditions in which human individuals interact with one another face to face from body to body” (Zhao 2003).

Yet the phenomenological experience of co-present, co-located (in the same physical space) sensory interaction with others in new VR/AR environments remains understudied. Indeed, “presence” in XR contexts has long been accepted as the sense of “being there” (Riva et al., 2003), “telepresence” (Minsky 1980), or what has been called the “place illusion”: the experience of being in a place (even though the real place one is in is usually irrelevant to the virtual experience) “in spite of the sure knowledge that you are not there” (Slater 2009). While there is a vast literature on such virtual presence (Biocca and Levy 1995; Biocca et al., 2003; Lombard and Ditton 1997; Slater and Wilbur 1997), its focus is mainly on “the degree to which a virtual environment submerges the perceptual system of the user” away from the “real world” (Biocca and Delaney 1995). At the same time, little of this work addresses concepts of presence that focus on sensing, materiality, and “thingness” from a philosophical or aesthetic perspective (Gumbrecht 2006; Noë 2012). Furthermore, studies of co-present interaction in VR/AR have overwhelmingly focused on purely virtual environments (Bulu 2011; Lankes et al., 2017; Schroeder 2002) with avatars and virtual humans (Wang 2011; Freiwald et al., 2021; Shin and Dongsik 2019), virtual agents (Strojny et al., 2020), and video conferencing (Kim et al., 2014). In other words, the vast majority of research into presence and immersion in VR “has only a limited involvement in concrete space” (Saker and Frith 2020).

Using a recent XR-based theatre project on climate transformation, “Animate,” as a case study, this presentation aims to present concepts and methods for grappling with what Ronald Azuma claims is the fundamental challenge as we move into increasingly mixed-reality-based experiences: “how to enable virtual content that is integrated with the surrounding real world, while users remain engaged with and aware of that ‘real world’” (2016).

Prof. Dr. Christopher Lloyd Salter is an artist, chair of the Immersive Arts Space, Zürich University of the Arts (since 2022), former full professor of computation arts at Concordia University in Montreal, and Co-Director of the Hexagram network for Research-Creation in Media Arts and Technology, also in Montreal. His work has been shown all over the world at venues including the Venice Architecture Biennale and the Barbican Centre, among many others. He is the author of Entangled (MIT Press, 2010), Alien Agency (MIT Press, 2015), and Sensing Machines: How Sensors Shape Our Everyday Life (MIT Press, 2022).


Closed Workshop Session “Netted Letters in Immersive Environments”

Work-in-progress presentation of a VR environment: We are developing a scientific research tool and a cultural history information platform containing metadata from personal archives. With the help of a case study featuring a curated collection of archival materials by the author, sociologist, film theorist, and critic Siegfried Kracauer and his wife Elisabeth Kracauer, we study the epistemological impact of spatial navigation and movement in chronological time.

For more information on the project, please see: https://www.hslu.ch/en/lucerne-university-of-applied-sciences-and-arts/research/projects/detail/?pid=5668