- The BMO Lab and Canadian Stage announce their new artist in residence, director Bronwen Sharp
  The BMO Lab in Creative Research in the Arts, Performance, Emerging Technologies and AI, in partnership with Canadian Stage, is excited to announce that it has selected director Bronwen Sharp for the 2021-22 artist-in-residence program. Sharp will collaborate with Lab instructors, learn about the technologies and approaches being used in the Lab, and explore their application to theatre and performance. In the winter and spring, Sharp will attend and participate in the CDTPS’s Theatre and Emerging Technologies course and will also take part in a workshop production applying the technologies and performance modalities explored during her residency. The workshop production will be an adaptation…
- Call for Submissions now open for 2021-22 CanStage BMO Lab Residency Program
  The deadline to apply is July 9, 2021. Full details at the Canadian Stage website. For the second year in a row, Canadian Stage and the BMO Lab in Creative Research in the Arts, Performance, Emerging Technologies and Artificial Intelligence at the Centre for Drama, Theatre and Performance Studies, University of Toronto are partnering to host two paid professional artist residencies for the 21.22 Season. Salary will be jointly paid by Canadian Stage and the BMO Lab, with each recipient receiving a total of $10,000 (CAD). About the BMO Lab Residency: the BMO Lab residency is a unique paid opportunity for…
- BMO Lab at the CanadianStage Festival of Ideas and Creation 2021
  We are very excited to be participating in the CanadianStage Festival of Ideas and Creation! On Tuesday May 18, Wednesday May 19 and Thursday May 20, we will be presenting the work of performers-in-residence Sébastien Heins, Ryan Cunningham, Maev Beaty and Rick Miller at the BMO Lab over this past half-year.
  Tuesday May 18 at 2:30: AI Generated Text: GPT-2 and Performance
  Wednesday May 19 at 2:30: Voxels: Invisible motion sensing triggers in performance
  Thursday May 20 at 2:30: Live Motion Capture and Performance
  Full clips of explorations featured in the presentations: here are some videos providing more complete documentation…
- Playful Product Proposal: “BRANDOOS”: Sébastien Heins
  WOULD YOU WEAR THESE? Hi! My name is Sébastien Heins, and I’m one of your BMO Lab residents! After several incredible months with David, Pia, Rick, Maev, and Ryan in the lab, I wanted to share some new tools I think could be useful to artists — and hopefully have a bit of fun. One of my favourite video games growing up was Ratchet & Clank on PlayStation 2. In it, an over-confident green man named Captain Qwark did commercials for ridiculous gadgets ranging from the dangerous Crotchitizer to the Personal Hygenator. In the spirit of fun, I thought I’d share some “products” based on the…
- CanadianStage debuts its short film about the BMO Lab / CanStage Residency
  For the 20.21 season, Ryan Cunningham and Sébastien Heins are participating in the BMO Lab in Creative Research in the Arts, Performance, Emerging Technologies and Artificial Intelligence. Canadian Stage and the Centre for Drama, Theatre and Performance Studies, University of Toronto have partnered to create this unique paid opportunity for two professional actors to immerse themselves in the Lab’s technologies and research possibilities for application to live theatre performance. Video by: https://www.videocompany.ca/ The BMO Lab would like to thank the Video Company for putting together such a fine reflection on our residency in progress despite the challenges of the…
- Performers-in-Residence Update – Nov 16
  Sébastien: Voxel-based cueing of sound, video and lights. Sébastien came in to work with David to try converting a scene from a solo play to interactive triggering. We positioned triggers for sound cues, a video cue and a lighting cue in the space to see whether such a system might be usable for actual performances. The session brought up some interesting limitations in the current software David has produced: it is an extension of work he did for a very specific project a few years ago, and thus was not designed to address the kinds of needs…
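For the technically curious, here is a rough sketch of how spatial triggers of this kind might fire show cues. It is a hypothetical illustration, not the Lab's software: it assumes cue-playback software that listens for OSC messages, using QLab's default port and `/cue/{number}/start` address scheme as one example.

```python
# A hypothetical sketch, not the Lab's own software: when a spatial trigger
# fires, send an OSC message to cue-playback software. QLab's default OSC
# port (53000) and its /cue/{number}/start address are used as one example.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 53000)  # cue software on this machine

def fire_cue(cue_number: str) -> None:
    """Start the sound, video, or lighting cue with the given number."""
    client.send_message(f"/cue/{cue_number}/start", [])

# e.g. a voxel-trigger system would call this when a zone is entered:
fire_cue("3")
```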
- Performers-in-Residence Update – Nov 6
  A few days ago, Sébastien asked whether it would be possible to trigger voices instead of sounds through the system that places sound possibilities in space, so we decided to run an experiment to see how that might feel and what creative possibilities it might open up. First we sat down and came up with a range of words and phrases that were somewhat ambiguous and could be presented in different orders. Then Sébastien and Maev recorded these phrases, doing multiple versions of each with different expression. Then each utterance was converted into a sound file and loaded into the…
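A minimal sketch of the underlying idea, assuming recorded utterances as WAV files and an external source of the performer's tracked position (both hypothetical here): each phrase is tied to a spot in the room and plays once when that spot is entered.

```python
# A sketch of the idea, with hypothetical file names and an abstracted
# position source: each recorded utterance is tied to a spot in the room
# and plays once when the performer's tracked position enters that spot.
import math
import pygame

pygame.mixer.init()

# (x, y, radius) in metres -> a recorded utterance
ZONES = [
    ((1.0, 2.0, 0.5), pygame.mixer.Sound("phrase_maev_01.wav")),
    ((3.0, 1.5, 0.5), pygame.mixer.Sound("phrase_sebastien_01.wav")),
]
inside = [False] * len(ZONES)  # remember occupancy to trigger on entry only

def on_position(x: float, y: float) -> None:
    """Feed this the performer's tracked (x, y) position every frame."""
    for i, ((zx, zy, r), sound) in enumerate(ZONES):
        hit = math.hypot(x - zx, y - zy) < r
        if hit and not inside[i]:
            sound.play()  # entering the zone plays that utterance
        inside[i] = hit
```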
- Performers-in-Residence Update – Oct 29
  Ryan: The Challenge of Indigenous Languages for AI. As the others were unavailable, Ryan and David got together to explore a couple of issues related to Ryan’s interests. First, Ryan brought up the fact that it would be challenging to replicate, in the space of Indigenous cultural production, the kind of AI-generated script experiments we had been exploring with Shakespeare the other day. The system we were using (GPT-2 from OpenAI, fine-tuned on the plays of Shakespeare) was initially trained on a very large body of English-language writing. The sort of ‘learning’ that this system…
- Performers-in-Residence Update – Oct 28
  First we sat down to discuss the experience of last week’s cold read of the AI-generated script: what worked, what didn’t, and how we might adjust the software to improve the experience. Sébastien, Maev and Rick discussed the potential of the tool for training, rehearsal and skill development, as they all felt that the experience was at once pleasurable, very challenging and productive. Then we went further with the GPT-2-generated scripts using a modified version of the program that allows us to mix models trained on different bodies of text. The example…
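The post does not say how the mixing works, but one plausible scheme, sketched below under that assumption with hypothetical checkpoint paths, is to average the next-token logits of two fine-tuned GPT-2 models at every sampling step, with a weight controlling the blend.

```python
# A sketch of one plausible mixing scheme, not the Lab's implementation:
# average the next-token logits of two fine-tuned GPT-2 models at every
# step of sampling. Both checkpoint paths are hypothetical.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model_a = GPT2LMHeadModel.from_pretrained("./gpt2-shakespeare")   # hypothetical
model_b = GPT2LMHeadModel.from_pretrained("./gpt2-other-corpus")  # hypothetical

ids = tok("ROMEO.", return_tensors="pt").input_ids
weight = 0.5  # blend ratio between the two models

for _ in range(100):
    with torch.no_grad():
        logits_a = model_a(ids).logits[:, -1, :]
        logits_b = model_b(ids).logits[:, -1, :]
    mixed = weight * logits_a + (1.0 - weight) * logits_b
    probs = torch.softmax(mixed / 0.9, dim=-1)         # temperature 0.9
    next_id = torch.multinomial(probs, num_samples=1)  # sample one token
    ids = torch.cat([ids, next_id], dim=1)

print(tok.decode(ids[0]))
```

Sliding the weight between 0 and 1 would let the generated text drift from one corpus's voice toward the other's.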
- Performers-in-Residence Update – Oct 22
  Today we went deeper with the AI-generated Shakespeare. We discussed many approaches to using this material, and then decided to put the talk aside and jump in. We set up the lab so that the output of the AI was projected on the wall, where the performers could see the text as it was being produced and read it immediately… with the performers adopting characters on the fly as they turned up in the script. Because the text was formulated anew on the spot, the performers had no idea what was coming, and often did not know where…
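For a sense of how text can be surfaced the moment it is produced, here is a minimal sketch using Hugging Face's TextStreamer, which prints each token as it is generated (to a terminal here, rather than a projection); the fine-tuned checkpoint path is hypothetical.

```python
# A minimal sketch of live, token-by-token display using Hugging Face's
# TextStreamer (printing to a terminal here rather than a projection).
# The fine-tuned checkpoint path is hypothetical.
from transformers import GPT2LMHeadModel, GPT2Tokenizer, TextStreamer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("./gpt2-shakespeare")  # hypothetical
streamer = TextStreamer(tok, skip_special_tokens=True)

ids = tok("SCENE I.", return_tensors="pt").input_ids
# Each token is printed the moment it is sampled:
model.generate(ids, max_length=300, do_sample=True, top_k=50, streamer=streamer)
```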
- Performers-in-Residence Update – Oct 21
  Today we had the pleasure of welcoming Maev Beaty to join in our explorations. We started by extending our exploration of placing spatial interactive cues. Using the Azure Kinect, we created virtual walls and ceilings in the space that triggered sounds when our bodies touched or passed through them. We created multiple layers of these sensitive sound zones above our heads which we could reach into, and then created one that was low enough that we had to crouch or bow to keep from triggering it. We considered how one might build an invisible stage set of constraints that the performer…
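One way such an invisible trigger plane could be implemented is sketched below, assuming the pyk4a Python wrapper for the Azure Kinect: watch a region of the depth image and fire when anything enters a chosen distance band. Zone coordinates, distances, and the cue callback are all illustrative, not the Lab's actual configuration.

```python
# A sketch assuming the pyk4a wrapper for the Azure Kinect: watch one
# rectangular region of the depth image and fire a cue when anything
# enters a chosen distance band (an invisible "wall" in mid-air).
# Zone coordinates, distances, and the cue itself are illustrative.
import numpy as np
from pyk4a import PyK4A

k4a = PyK4A()
k4a.start()

ZONE = (slice(100, 200), slice(300, 400))  # rows, cols of the depth image
NEAR, FAR = 800, 1200                      # distance band in millimetres

def on_trigger() -> None:
    print("zone entered: fire sound cue")  # stand-in for a real cue

was_inside = False
while True:
    capture = k4a.get_capture()
    if capture.depth is None:
        continue
    region = capture.depth[ZONE]           # depth values in mm
    inside = bool(np.any((region > NEAR) & (region < FAR)))
    if inside and not was_inside:
        on_trigger()                       # edge-trigger on entry only
    was_inside = inside
```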
- Performers-in-Residence Update – Oct 15
  Today we dipped our toes into the world of Artificial Intelligence. We played with a neural network (GPT-2) that was trained on an enormous dataset of text and then fine-tuned on the complete plays of William Shakespeare. This led to a lively discussion about the relative meaningfulness or meaninglessness of the resulting text, and the ways this sort of system might be useful in the context of performance: as a tool for improvisation, as a challenge to the actor as interpreter, and as a tool during the workshopping phase of a new play. We considered the fact that…
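For readers curious what this looks like in practice, here is a minimal sketch of sampling from such a model with the Hugging Face transformers library; the Shakespeare-fine-tuned checkpoint path is hypothetical, and the Lab's own setup is not documented here.

```python
# A minimal sketch of the kind of generation described above, using the
# Hugging Face transformers library. The Shakespeare-fine-tuned checkpoint
# path is hypothetical; a stock "gpt2" model also works, just without the
# Shakespearean register.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("./gpt2-shakespeare")  # hypothetical

inputs = tok("HAMLET.", return_tensors="pt")
output = model.generate(
    **inputs,
    max_length=200,
    do_sample=True,  # sample rather than greedy decode
    top_k=50,
    temperature=0.9,
)
print(tok.decode(output[0], skip_special_tokens=True))
```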
- Performers-in-Residence Update – Oct 14
  For the first meeting, we explored various methods of tracking and responding to movement. I set up an interactive sound installation I created in 2003 that translates a performer’s movements into sound. We looked at ways a computer can locate the presence and movement of people within a video image, and played around with these both to get a feel for the technology and to think about how it might be used in performance. Then we looked at newer technologies for capturing the human body through depth sensors. We looked at the Azure Kinect, which creates an image…
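The simplest form of camera-based motion sensing mentioned above can be sketched with OpenCV frame differencing: wherever consecutive frames differ, something moved, and the overall amount of change could then drive sound parameters. This illustrates the general technique, not the 2003 installation's actual code.

```python
# A sketch of the simplest camera-based motion sensing: frame differencing
# with OpenCV. Where consecutive frames differ, something moved; the overall
# level of change could then be mapped to sound parameters.
import cv2

cap = cv2.VideoCapture(0)                   # default webcam
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)          # per-pixel change since last frame
    _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    activity = motion.mean() / 255.0        # rough 0..1 "amount of movement"
    print(f"motion level: {activity:.3f}")
    prev = gray
    if cv2.waitKey(1) == 27:                # Esc to quit
        break
cap.release()
```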