RoboCity

Yacov Sharir (sharir@utxvms.cc.utexas.edu)
Mon, 22 Jun 1998 09:34:53 -0500 (CDT)

<x-rich>Dear Comrades,

I wanted to announce the opening of RoboCity at the Austin Children's Museum.

I conceived and directed RoboCity together with a team of multimedia artists, some of whom are students in my Virtual Reality and Cyberspace in the Arts course.

Enclosed you will find the RoboCity narrative that led to major support from AT&T and several other high-tech corporations.

As I am leaving for a long European tour this afternoon, I have not
edited the narrative. (sorry)

I will be checking my e-mail and hope to respond to comments.

If you are planning to visit the area please stop by and visit the
installation.

<bold>Robo-City

</bold>Robo-City is an educational and artistic installation of interactive technologies, conceived and designed for the Austin Children's Museum by choreographer and interactive media artist Yacov Sharir.

This six-week project (spanning June and July 1998) engages 20 area high school student artists and five teachers for design and construction. Selected through a referral and interviewing process, the student artists created an interactive world for Museum visitors to explore, which includes cyberhumans, robots, interactive intelligent lighting, and MIDI interactive surfaces.

In January 1998, Yacov Sharir began to work with area high school teachers and administrators to amass a pool of 25-30 ethnically diverse student artists, representing all geographic areas of the city, who had an interest in participating in this summer project. Selected from this pool, the 20 project participants reflect a myriad of artistic disciplines including dance, visual arts, multimedia, and music, ensuring a multi-disciplinary approach to this project. Teachers were selected based upon the level of interactive technology utilized in their curriculum.


The design phase of the project took place at the Austin Children's Museum in downtown Austin, Texas. The ACM Theatre was equipped with 21 Power Macs with software including LifeForms, Extreme 3D, Macromedia Director, Photoshop, Detailer, After Effects, and Poser II, as well as components for the design and construction of the MIDI surfaces and robotics.


Spanning three weeks, the design phase of Robo-City required the staff and participants to meet Monday through Friday for four hours per day. Week one focused on familiarizing participants with the computer hardware and software necessary for the project's design and construction and introducing them to interactive environments. By the conclusion of week one, participants had created computerized environments and designed QuickTime movies that were used for both the web site and the performance in the physical space. Students were encouraged to think of the computer as a theatre or as a world rather than as a tool. Additionally, the participants began to construct robotics elements by building the mechanics for a remote-control car and then building onto it, or stripping it down, to create interactive elements that became the whimsical population for the physical space of Robo-City.


During week two, participants were introduced to the additional software programs and learned how all these programs talk and relate to each other. The QuickTime movie material and concept were put to use, along with the introduction of the video camera and its importance to the creation of Robo-City's environment, with video-in and video-out demonstrations. Week two saw the conclusion of the robotics element construction, as well as of the other interactive components, including cyberhumans, robots, robotic lighting instruments, and the MIDI interactive musical surfaces.


The third and final week of the design phase was devoted to the completion of the design (space) construction and to project experimentation.


The central components of Robo-City are the MIDI (Musical Instrument Digital Interface) sensored surfaces. The MIDI surfaces are capable of transmitting precise position coordinates, velocity, and pressure information in the form of standard MIDI messages. Some of these surfaces carpet the floor and consist of a large number of Force Sensing Resistors (FSRs) which are attached to heavy-duty plastic sheeting and covered with polyethylene foam. Each FSR is assigned a separate input channel of a Voltage-to-MIDI interface box which has 64 analog inputs plus MIDI Out. The MIDI box incorporates a Motorola MC68HC11 microprocessor and can be programmed to convert input/output analog signals to/from any desired MIDI message, on multiple MIDI channels.
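
As a rough illustration of the kind of conversion such a box performs, here is a minimal sketch in Python. It is hypothetical: the box's actual MC68HC11 firmware is not described here, and the 10-bit ADC resolution, channel spread, and controller-number assignment are assumptions for illustration only.

```python
# Hypothetical sketch of the analog-to-MIDI conversion done by a
# Voltage-to-MIDI interface box with 64 analog inputs. The 10-bit ADC
# range and the controller-number mapping are illustrative assumptions.

def fsr_to_midi_cc(input_index: int, adc_value: int) -> bytes:
    """Convert one FSR reading into a 3-byte MIDI Control Change message.

    input_index -- which of the box's 64 analog inputs fired (0-63)
    adc_value   -- raw pressure reading from the FSR (0-1023)
    """
    if not 0 <= input_index < 64:
        raise ValueError("the box has 64 analog inputs")
    channel = input_index % 16          # spread the inputs across 16 MIDI channels
    controller = input_index            # one CC number per input
    value = min(adc_value, 1023) >> 3   # scale 10-bit ADC down to 7-bit MIDI
    return bytes([0xB0 | channel, controller, value])

# A hard step on input 5 at full pressure:
print(fsr_to_midi_cc(5, 1023).hex())  # -> "b5057f"
```

In the installation, each such message would travel out the box's MIDI Out to whatever synthesizer, sequencer, or show-control software is listening.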


Used in conjunction with an "intelligent" external MIDI processing system, these floor surfaces are ideal for Robo-City, where one or more visitors to the exhibit can affect the music, the lighting, the cyberhumans, and the robots in real time by the nature of their movements and by their precise position(s) on the surfaces. Robo-City also employs the I-Cube system: various tactile objects (constructed by the technicians and the participants) are outfitted with a variety of I-Cube sensors, such as touch and slide sensors, flexible bend sensors, and light-sensitive and G-force sensors. The way in which visitors interact with, react to, and play with these sensored objects controls the environment variables of light, sound, and video projection, and triggers the Disklavier (an electronic "player" piano!)
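
One way to picture that "intelligent" MIDI processing layer is as a dispatcher that routes each incoming sensor message to one environment subsystem. The sketch below is speculative, not Robo-City's actual software: the controller-number ranges and subsystem names are invented for illustration.

```python
# Speculative sketch of an "intelligent" MIDI processing layer: each
# incoming Control Change from the floor or the I-Cube sensors is routed
# to one subsystem. Controller ranges here are illustrative assumptions.

def route_sensor(controller: int, value: int) -> tuple:
    """Map a MIDI CC (controller 0-127, value 0-127) to an environment action."""
    level = value / 127.0                  # normalise pressure/position to 0.0-1.0
    if controller < 32:                    # floor FSRs -> lighting cues
        return ("lighting", controller, level)
    elif controller < 64:                  # floor FSRs -> music parameters
        return ("music", controller - 32, level)
    elif controller < 96:                  # I-Cube bend/slide -> video clips
        return ("video", controller - 64, level)
    else:                                  # touch sensors -> Disklavier notes
        note = 21 + (controller - 96)      # map onto the piano's low register
        return ("disklavier", note, level)
```

Each returned action would then be forwarded to the lighting board, the sequencer, the video projector, or the Disklavier's own MIDI In.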


Participants also addressed the connection between humans and their representational presence in cyberspace as they created computerized cyberhumans. Issues of time, space, physicality, and gravity were visited, as well as the question of how the body is to be represented and inhabited within a virtual space. The cyberhumans inhabit Robo-City via video projection controlled by the MIDI interactive surfaces.


In developing the cyberhumans and their surrounding environment, a multiplicity of questions was raised, and a model for collaboration was proposed. This collaborative technological investigation enhanced the methods of expression in virtual space and uncovered innovative digital processes through experimentation with the various software and 3D rendering programs. First-phase participants experienced true collaboration in technological design, construction, and showcasing, and came away with the profound energy of human discovery.


The showcase phase of Robo-City opens June 19, 1998 at the newly constructed Austin Children's Museum Theatre and continues for three weeks.


As visitors enter Robo-City, they traverse a MIDI floor panel which activates video and audio instructions. With phase-one design participants as their guides, visitors learn that each person's own movement on the floor panels triggers various sensors which control the city's environment of music, lighting, cyberhumans on the video screens, and robotic inhabitants. Visitors are invited to "play" with the sensored objects and to interact with the robotic population to affect the city's environment.


Student ambassadors are trained to assist visitors in exploring the installation and to facilitate their questions comparing the cybercity with the city in which they live.


Both the design and showcase phases of Robo-City are being video documented for educational purposes. The video will be used to introduce educators to new uses of science and technology in exploring social studies and arts content and process. Robo-City is also being transformed for the Austin Children's Museum Website, enabling visitors to explore the technological city on-line.

I hope this is helpful and not too confusing.

</x-rich>