BMO LAB

Creative Lab for the Arts, Performance, Emerging Technologies and AI


Nov 16 2020

Performers-In-Residence Update – Nov 16

Sebastien: Voxel-based cueing of sound, video and lights

Sebastien came in to work with David on converting a scene from a solo play to interactive triggering. We placed triggers for sound cues, a video cue, and a lighting cue in the space to see whether such a system might be usable for actual performances.
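As a rough sketch of the idea (not the lab's actual code; the cue name and coordinates are hypothetical), a trigger can be modelled as an axis-aligned voxel in stage space that fires when the tracked performer position falls inside it:

```python
from dataclasses import dataclass

@dataclass
class TriggerVoxel:
    """An axis-aligned box in stage space that fires a cue when entered."""
    name: str                           # e.g. "sound_cue_1" (hypothetical label)
    min_corner: tuple                   # (x, y, z) in metres
    max_corner: tuple

    def contains(self, pos):
        # True if the tracked position lies inside the box on every axis
        return all(lo <= p <= hi for p, lo, hi in
                   zip(pos, self.min_corner, self.max_corner))

# Example: a 1 m cube near centre stage
voxel = TriggerVoxel("sound_cue_1", (0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
print(voxel.contains((0.5, 0.2, 0.9)))  # True
```

Each frame, the tracking computer would test the performer's position against every placed voxel and fire the associated cue on entry.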

The session revealed some interesting limitations in the current software David has produced. The software is an extension of work David did for a very specific project a few years ago, so it was not designed to address the kinds of needs Sebastien has for his performance. Based on this experiment, David has been able to adjust and reimplement parts of the software to make it better suited to Sebastien's needs. This process of iterative design, in consultation with people who have real-world needs, is an essential part of developing good, usable tools.

One limitation we found is that the sound triggering in David's system was not designed for a situation where a sound clip is triggered by activity in a zone but is then meant to play through to the end. Sebastien had to keep the trigger engaged to keep the clip playing.
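The difference is between a "gated" trigger (the clip plays only while the zone is occupied) and a "latched" one (entry starts the clip, which then runs to completion). A minimal sketch of the latched behaviour, purely for illustration, might look like:

```python
class LatchedTrigger:
    """Fires once on zone entry; the cue then runs to completion,
    unlike a gated trigger that stops when the zone is vacated."""

    def __init__(self):
        self.fired = False

    def update(self, in_zone: bool) -> bool:
        """Call once per tracking frame. Returns True only on the
        frame the cue should start."""
        if in_zone and not self.fired:
            self.fired = True
            return True   # start the clip; playback continues on its own
        return False      # no retrigger, and leaving the zone does nothing

trigger = LatchedTrigger()
print(trigger.update(True))   # True  – performer enters, clip starts
print(trigger.update(False))  # False – performer leaves, clip keeps playing
```

A `reset()` call (or a one-shot flag per cue) would decide whether the trigger can ever fire again within a scene.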

Despite the limitations, we were able to get a rough sketch of the scene working.

Technically, one computer was tracking the movement (as seen in the video). This computer was talking to another computer that was controlling sound cues, lighting cues and video playback. We used Open Sound Control (OSC) as the communications protocol. At the lab we are trying to provide OSC interfaces for all our tools so that they can all be set up to talk to each other.
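An OSC message is just a compact UDP payload: a null-padded address pattern, a type-tag string, and the arguments in big-endian form. As a sketch of what the tracking computer might send to the cue computer (the address, host, and port here are hypothetical), a single-float message can be built with the standard library alone:

```python
import struct

def _osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one float32 argument."""
    return (_osc_pad(address.encode("ascii"))   # address pattern
            + _osc_pad(b",f")                   # type tags: one float
            + struct.pack(">f", value))         # big-endian float32

# Sending it over UDP to the cue computer might look like:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/cue/sound/1", 1.0), ("192.168.1.20", 9000))
```

In practice a library such as python-osc handles this encoding, but the point stands: because the format is this simple and transport-agnostic, any tool given an OSC interface can talk to any other.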

Written by David Rokeby · Categorized: CanStage_BMO

