Live-I Workshop Journal

Mark Coniglio
Tue, 27 Jan 1998 10:06:17 -0800

Dear List,

As I mentioned in an email a few months ago, I thought it might be interesting
to write about our experiences giving our Live-I (Live Interactive) Workshop
at California Institute of the Arts this week. This is the first entry of
that proposed journal.

My hope in doing this is to report not so much what Dawn and I are doing to
teach the workshop (though I am sure that will seep in) but instead to
report how these students react to the material and what they do with it.

I think the first thing to report is that the Dance school at CalArts has
made some powerful initiatives to get the dancers working with
technological tools of various kinds. They actually have a class called
"Digital Dancing" that is taught by an interesting gentleman called Noah.
(Sorry, I will get his last name for the next post.) He works with light in
his own performances, and has had the MFA 2's and the BFA 4's creating
small pieces that use very simple technology, like a light bulb and a
battery. What is quite interesting about his approach is that it avoids the
flashy stuff, but in so doing, illustrates some very basic principles and
keeps the focus on content. I hope that I might interview Noah about his
class and his own work and put a report up here.

In any case, another class taught by Noah, "Video for Dance" has the
dancers using software (Adobe Premiere) to digitize and edit
dance-on-camera pieces, as well as using audio editing software (mostly
DigiDesign stuff) to create soundscapes for their compositions. I am hoping
to see or hear some of the results of the students' efforts while we are here.

To me, there is a fairly phenomenal amount of technical hands-on being done
by people who are also talented dancers. (Dawn took ballet and modern
classes in the morning with the students, and had a favorable report of
their abilities in that sphere.)

Yesterday we made our first presentation to the students, showing them four
sensory devices (Alesis D-4, Big Eye, Infra-Red Grid, MidiDancer), four
output devices (Peavey Audio Sampler, STEIM's LiSa, Laserdisc, MIDI Dimmer
Pack), and ways to link the two. Our hope is for each of the students to
create a one-minute long study that uses one sensor to control one output
device. We don't have much time with them: 1.5 to 2 hours each day on Mon,
Tue, Thu, and Fri, with additional hour-long individual meetings with each
student. I suspect, however, that this is often going to be the scenario
when we do this workshop in a University context where students have
classes they must attend each day.

The students asked some very good questions right off the bat. One of the
first came from a student while we were demonstrating BigEye, which can sense
movement in the video image coming from the camera. She asked, "How do you
know what you're doing if you can't see the computer screen?" which Dawn
translated to, "How can I play this instrument if I don't have any feedback?" Also,
another student brought up the question of audience perception of the
interaction, and we discussed this for as long as our pressing schedule allowed.

During the final portion of the class, we gave the students their assignment for
Tuesday: write out a description of a one-minute long study that uses one
sensor to control one output device. During Tuesday's class, we will flesh
out these ideas and start them on implementing (with our help) their short
studies.

More Tomorrow....
Mark Coniglio

Mark Coniglio, Artistic Co-Director
Troika Ranch