Yacov Sharir
Mon, 22 Jun 1998 09:34:53 -0500 (CDT)

Dear Comrades,

I wanted to announce the opening of RoboCity at the Austin Children's Museum.

I conceived and directed RoboCity together with a team of multimedia artists, some of whom are my students in my Virtual Reality and Cyberspace in the Arts course.

Enclosed you will find the RoboCity narrative that led to major support from AT&T and some other high-tech corporations.

As I am leaving for a long European tour this afternoon, I have not edited the narrative. (Sorry.)

I will be checking my e-mail and will hopefully respond to comments.

If you are planning to visit the area, please stop by and visit the installation.

Robo City

Robo-City is an educational and artistic installation of interactive technologies, conceived and designed for the Austin Children's Museum by choreographer and interactive media artist Yacov Sharir.

This six-week project (spanning June and July 1998) utilizes 20 area high school student artists and five teachers for design and construction. Selected through a referral and interviewing process, the student artists created an interactive world for Museum visitors to explore that includes cyberhumans, robots, interactive intelligent lighting, and MIDI interactive surfaces.

In January 1998, Yacov Sharir began to work with area high school teachers and administrators to amass a pool of 25-30 ethnically diverse student artists, representing all geographic areas of the city, who had an interest in participating in this summer project. Selected from this pool, the project participants reflect a myriad of artistic disciplines, including dance, visual arts, multi-media, and music, ensuring a multidisciplinary approach to this project. Teachers were selected based upon the level of interactive technology utilized in their curriculum.


The design phase of the project took place at the Austin Children's Museum in downtown Austin, Texas. The ACM Theatre was equipped with Power Macs running software including LifeForms, Extreme 3D, Macromedia Director, Photoshop, Detailer, After Effects, and Poser II, as well as components for the design and construction of the MIDI surfaces and robotics.


Spanning three weeks, the design phase of Robo-City required the staff and participants to meet Monday through Friday for four hours per day. Week one focused on familiarizing participants with the computer hardware and software used for the project's design and construction and introducing them to interactive environments. By the conclusion of week one, participants had created computerized environments and designed QuickTime movies that have been used for both the web site and performance in the physical space. Students were encouraged to think of the computer as a theatre or a world rather than a tool. The participants also began to construct robotics elements by building mechanics for a remote control car and then building onto it or stripping it down to create interactive elements that became the whimsical population for the physical space of Robo-City.


During week two, participants were introduced to the additional software programs and learned how all these programs talk and relate to each other. The QuickTime movie material and concept was utilized, along with the use of the video camera and its importance to the creation of Robo-City's environment, with video-in and video-out demonstrations. Week two also saw the continuation of the robotics element construction, as well as that of other interactive components, including cyberhumans, robots, robotics lighting instruments, and the MIDI interactive musical surfaces.


The third and final week of the design phase was devoted to the completion of the design (space) construction and project experimentation.


The central components of Robo-City are the MIDI (Musical Instrument Digital Interface) sensored surfaces. The MIDI surfaces are capable of transmitting precise position coordinates, velocity, and pressure information in the form of standard MIDI messages. Some of these surfaces carpet the floor and consist of a large number of Force Sensing Resistors (FSRs), which are attached to heavy-duty plastic sheeting and covered with polyethylene foam. Each FSR is assigned a separate input channel of a Voltage-to-MIDI Interface Box, which has 64 analog inputs plus MIDI In and Out. The MIDI Box incorporates a Motorola MC68HC11 microprocessor and can be programmed to convert input/output analog signals to/from any desired MIDI message, on multiple MIDI channels.
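The FSR-to-MIDI conversion described above can be sketched in code. This is a hypothetical illustration, not the project's actual MC68HC11 firmware: it assumes a 10-bit analog reading and emits one standard MIDI Control Change message per sensor input, using the input's index as the controller number.

```python
def fsr_to_midi(adc_value, sensor_index, channel=0, adc_max=1023):
    """Map a raw FSR pressure reading to a 3-byte MIDI Control Change.

    adc_value: raw analog-to-digital reading, 0..adc_max (10-bit assumed)
    sensor_index: which of the 64 analog inputs fired (0..63); used here
                  as the MIDI controller number (an illustrative choice)
    channel: MIDI channel 0..15
    """
    pressure = min(max(adc_value, 0), adc_max)
    value = pressure * 127 // adc_max        # scale down to 7-bit MIDI data
    status = 0xB0 | (channel & 0x0F)         # Control Change status byte
    return bytes([status, sensor_index & 0x7F, value])

# A visitor steps hard on floor sensor 12: full-scale reading on channel 1.
msg = fsr_to_midi(1023, 12, channel=1)
```

The same scheme extends naturally to 64 inputs: each FSR keeps its own controller number, so a downstream MIDI processor can tell which part of the floor was pressed and how hard.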


Used in conjunction with an "intelligent" external MIDI processing unit, these floor surfaces are ideal for Robo-City, where one or more visitors to the exhibit can affect the music, the lighting, the cyberhumans, and the robots in real time by the nature of their movements and by their position(s) on the surfaces. Robo-City also employs the I-Cube system: various tactile objects (constructed by the technicians and the student participants) are outfitted with a variety of I-Cube sensors, such as touch and slide sensors, flexible bend sensors, and light-sensitive and G-force sensors. The way in which visitors interact with, react to, and play around with these sensored objects controls the environmental variables of light and video projection, and triggers the Disklavier (an electronic "player" piano).
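The routing step such an external MIDI processor performs might look like the following sketch. All names and mappings here are illustrative assumptions, not the exhibit's actual software: incoming Control Change messages from the sensors are dispatched to handlers for lighting, the Disklavier, and so on, by controller number.

```python
CONTROL_CHANGE = 0xB0  # status nibble for MIDI Control Change messages

def make_router(handlers):
    """Return a function that routes 3-byte MIDI messages by controller number.

    handlers: dict mapping controller number -> callable(value)
    """
    def route(msg):
        status, controller, value = msg[0], msg[1], msg[2]
        if status & 0xF0 != CONTROL_CHANGE:
            return None                      # ignore non-CC messages here
        handler = handlers.get(controller)
        return handler(value) if handler else None
    return route

log = []
route = make_router({
    12: lambda v: log.append(("lighting", v)),    # floor sensor -> lights
    40: lambda v: log.append(("disklavier", v)),  # bend sensor -> player piano
})
route(bytes([0xB1, 12, 127]))   # hard step on floor panel 12
route(bytes([0xB0, 40, 63]))    # mid-range bend-sensor reading
```

Because everything arrives as standard MIDI, one dispatch table can drive music, lighting, and video handlers alike; adding a new sensored object is just another entry in the table.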


Participants also addressed the connection between humans and their representational presence in cyberspace as they created computerized cyberhumans. Issues of time, space, physicality, and gravity were visited, as well as the question of how the body is to be represented and inhabited within a virtual space. The cyberhumans inhabit Robo-City via video projection controlled by the MIDI interactive surfaces.


In developing the cyberhumans and their surrounding environment, a multiplicity of questions was raised, and a model for collaboration was proposed. This collaborative technological investigation enhanced the methods of expression in virtual space and uncovered digital processes through experimentation with the various software and 3D rendering programs. First-phase participants experienced true collaboration in technological design, construction, and showcasing, and came away with the profound energy of human discovery.


The showcase phase of Robo-City opens June 19, 1998 at the newly constructed Austin Children's Museum Theatre and continues for three weeks.


As visitors enter Robo-City, they traverse a MIDI floor panel which activates video and audio instructions. With phase-one design participants as their guides, visitors learn that each person's own movement on the floor panels triggers various sensors which control the city's environment of music, lighting, cyberhumans on the video screens, and the robotics population. Visitors are invited to "play" with the sensored objects and interact with the robotics population to affect the city's environment.


Student ambassadors are trained to assist visitors in exploring the installation and to facilitate their questions comparing the cybercity to the city in which they live.


Both the design and showcase phases of Robo-City are being videotaped for educational purposes. The video will be utilized to introduce educators to new uses of science and technology in exploring social and arts content and process. Robo-City is also being transformed for the Austin Children's Museum Website, enabling visitors to explore the city on-line.

I hope this is helpful and not too confusing.