This is Olivia’s and my final project for MPM33.
1. Outline the project concept;
-create an interactive, exploratory user experience involving the manipulation of sound to generate a visual response in which the environment will change
-relationship between user and the semiotics of sound
-challenge this relationship through auditory, physical, and visual elements
2. Describe its realization (the technology it uses, how it occupies physical or virtual space etc.);
-the main technological component will rely on the user’s physical presence, movement, and touch, with interfaces that recognize these attributes through a touchscreen device
-not inventing new technology but manipulating existing ones in order to create a new interface and experience for the user
3. Explain why this project is important
-commenting on and challenging the user through our interaction, which plays on mainstream digital media technologies of today such as YouTube, Myspace, SoundCloud, and Last.fm
-here, the user is in control of the musical experience they are generating which alters their perception of music and their previous relationships established with it
4. Explain the historical significance of the project.
-evolution of technology and our changing relationship to it
-relevance of audio in different forms/media
-the progression of music software/technologies and how it’s developing for the future
-evolving beyond Google API and projects such as David Rokeby’s Very Nervous System
5. Please make reference to the artist you presented in class and explain how his/her work relates to yours.
-collection of data and regurgitation into musical output
-using the physical and translating it into auditory experience
-mapping space and physical information
-visual feedback from the music for enhanced user experience (light art and motion tracking)
-inspired by the Lemur input device, except manipulating sound, light, and visual effects
-performance aspect to be incorporated? (people take cues by watching other people explore)
-iconic representations of what’s creating the sound -how it’s visualized
-reversal of the implied sound- not always what the user expects, which could generate a better result (the “a-ha” factor)
-space and sound discovery
-ghosting or creation of a larger picture- what will the user strive to create visually?
-music evokes emotional reactions in people through a combination of its elements: rhythm, tempo, melody, themes, chords, and instrumentation
-these provide a good way to achieve an immersive experience; ambiance such as background noises and sounds treated with varying types and degrees of reverb can make a real difference to how involved an audience is with the images, generating an inclusive space
-music especially relies on these connotative meanings: the associative links within each individual evoke emotional reactions tied to that individual’s past experiences of hearing a type of music in a certain context
-sound and visuals also have the ability to transfer their specific ‘qualities’ onto one another
**In an interactive scene it is impossible to predict when the user is going to make a change. In many interactive designs music is just used as a constant background. If music is used to communicate a message to the user, this could involve changing the mood, pace, or impact of the music. When a piece of music starts and the user makes a fairly quick change, there is a good chance that a musical theme or melody is cut short to make way for another piece of music that relates to the new scene. If this happens frequently, the music will lose its ‘hidden powers’ and become rather annoying. However, if a user decides to remain at the same position for a while, the music has to be able to keep interest.
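One way to avoid cutting a theme short on a quick scene change, as described above, is to crossfade between the outgoing and incoming pieces. A minimal sketch of the gain math, assuming a simple linear fade (the function name and fade window are illustrative, not part of the project):

```python
# Hypothetical crossfade sketch: instead of abruptly stopping the old
# theme when the user changes scene, ramp its volume down while the
# new piece ramps up over a short window.
def crossfade_gains(t, fade_duration):
    """Return (old_gain, new_gain) at t seconds into the crossfade.

    Gains are linear volume multipliers in [0.0, 1.0]; t past the
    fade window is clamped so the old track ends fully silent.
    """
    progress = min(max(t / fade_duration, 0.0), 1.0)
    return 1.0 - progress, progress

# Halfway through a 2-second fade, both tracks sit at half volume.
old_gain, new_gain = crossfade_gains(1.0, 2.0)
```

An engine would apply these gains per audio buffer; an equal-power (square-root) curve is often preferred over linear to avoid a perceived dip in loudness mid-fade.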
**Solfège, in musical terms, describes a technique in which the scale is divided into syllables to allow easier comprehension of musical tones
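The solfège idea above amounts to a simple mapping from scale degrees to syllables. A minimal sketch using the standard movable-do syllables (the function name is illustrative):

```python
# Standard movable-do solfège syllables for the seven diatonic degrees.
SOLFEGE = ["do", "re", "mi", "fa", "sol", "la", "ti"]

def degree_to_syllable(degree):
    """Map a 1-based scale degree to its solfège syllable.

    Degrees beyond 7 wrap around, so degree 8 (the octave) is
    'do' again.
    """
    return SOLFEGE[(degree - 1) % 7]

print(degree_to_syllable(1))  # do
print(degree_to_syllable(5))  # sol
print(degree_to_syllable(8))  # do
```

In an interface like the one proposed, such a mapping could label on-screen elements with syllables so users grasp pitch relationships without reading notation.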