MDP Science Fair Oct 31, 2011

A great, successful day at Art Center MDP's annual science fair, with a wide variety of topics ranging from sound, touch, urban space, language, memory, gaming, and gesture to mine on visual perception. My current working title is "Perceptual Exaptations".

The three experiments I displayed, starting from the right, are:

Trails – Transformation of the physical world with your gaze.

In the middle is Mutational Physical Interface, an accelerometer-driven, form-changing physical interface.

On the very left is Extended Sight + Physical Animated Motion Graphic Immersive Space. The glasses consist of an eye tracker (a hacked PS3 IR webcam) and an LCD for the right eye driven by two servos. The camera controlled by the glasses is placed inside a box decorated with dimensional graphics, scripted so the information looks as if it animates onto the screen while the camera pans around.
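
As a rough illustration of the control loop, here is a minimal sketch (my own, not the exhibited code) that maps a normalized gaze point from the eye tracker to pan/tilt angles for the servos carrying the camera and sends them to a microcontroller over serial; the port name, angle ranges, and message format are all assumptions.

```python
# Hypothetical sketch: gaze point -> pan/tilt servo angles -> serial message.
import serial  # pyserial

PAN_RANGE = (40, 140)    # servo limits in degrees (assumed safe range)
TILT_RANGE = (60, 120)

def gaze_to_angles(gx, gy):
    """Map gaze coordinates in [0, 1] to (pan, tilt) servo angles."""
    gx = min(max(gx, 0.0), 1.0)
    gy = min(max(gy, 0.0), 1.0)
    pan = PAN_RANGE[0] + gx * (PAN_RANGE[1] - PAN_RANGE[0])
    tilt = TILT_RANGE[0] + gy * (TILT_RANGE[1] - TILT_RANGE[0])
    return int(pan), int(tilt)

if __name__ == "__main__":
    link = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # assumed port
    pan, tilt = gaze_to_angles(0.25, 0.6)    # example gaze sample
    link.write(f"P{pan}T{tilt}\n".encode())  # assumed message format
```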

Week 5: Hacking the PlayStation 3 Eye Camera for Eye Tracking

The first time I hacked a PS3 Eye camera to use as an eye tracker was a year ago, during my concept year here in MDP, for a class called New Ecology of Things (NET) taught by Phil Van Allen. The warm-up project at the time was called "Useless Networks", for which I created a clever but useless helmet called "The 11th Finger". I learned how to create this device by following the tutorials for creating the Eyewriter. Thanks to the smart people over at the Graffiti Research Lab for their detailed tutorial on their website as well as on Instructables.
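
For reference, the software side of this kind of hack can be approximated with OpenCV. Below is a minimal dark-pupil detection sketch, assuming the modified camera shows up as a normal webcam under IR illumination; it is not the Eyewriter code, and the blur and threshold values are guesses that would need tuning.

```python
# Minimal dark-pupil tracking sketch on frames from the modified camera.
import cv2

cap = cv2.VideoCapture(0)          # the hacked PS3 Eye appears as a regular webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Under IR illumination the pupil is roughly the darkest blob in the image.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        pupil = max(contours, key=cv2.contourArea)       # largest dark blob
        (x, y), r = cv2.minEnclosingCircle(pupil)
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    cv2.imshow("pupil", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```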

To take apart the case, you need to dig in with a flat-head screwdriver and pry it open while twisting.

Week 4: Prototyping / Learning through making

These are the first iterations of my experimentation, implementing sketches from the previous week.
Tangible Motion Graphics – Idea around a Physical Animated Spatial Experience

 

Through these scale models and experiments, this investigation begins to speculate on the distance and perceivable range around the viewer. When this is experienced in the dark, the ability to play with fading becomes very useful. If I continue this investigation, these graphic elements that take up space will be motorized.

 

Transformation
There are countless examples of visual effects in film, and even in our daily lives, that revolve around the idea of transformation. Yes, that includes the Hollywood hit Transformers franchise, along with many other science fiction films. This is the first test of a self-revealing opening. Imagine this container duplicated on all six sides of a cube; where the cube opens, each face could be an independent interface used for a different task.

 

Mutation – Cube Duplication

This rough initial prototype investigates the possibility of hidden compartments or flaps that reveal additional mass to allow for physical duplication or multiplicity. Visual effects often make the nearly impossible idea of virtual mass expansion look routine. For example, in the cartoon The Jetsons, the flying cars fold up and become suitcases. And in the Japanese comic/anime Dragon Ball Z, the characters can click a pill-shaped capsule and throw it onto the ground…then poof! It turns into a motorcycle, a private hotel, preserved food, or some other large object.

 

Synchronization – Mechanical Iris study

The idea of synchronization in visual effects or animation always has a mesmerizing effect on the viewer. This initial experiment is part of my study of simply moving many pieces at the same time to morph or expose. This could be embedded into container openings, or many of these containers could be duplicated to create a physical interface for access and various applications.
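
As a sketch of the synchronization logic (not the code running the prototype), the idea is that a single aperture parameter drives every blade at once, so all the pieces morph together; the blade count, angle range, and easing curve below are assumptions.

```python
# One shared "aperture" value drives all iris blades simultaneously.
import math
import time

BLADES = 8
CLOSED_ANGLE, OPEN_ANGLE = 0, 75   # per-blade rotation in degrees (assumed)

def blade_angles(aperture):
    """aperture in [0, 1]; every blade gets the same eased angle."""
    eased = (1 - math.cos(aperture * math.pi)) / 2     # slow in, slow out
    angle = CLOSED_ANGLE + eased * (OPEN_ANGLE - CLOSED_ANGLE)
    return [round(angle, 1)] * BLADES

if __name__ == "__main__":
    for step in range(11):                 # open the iris over about a second
        angles = blade_angles(step / 10)
        print(angles)                      # each value would go to one servo
        time.sleep(0.1)
```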

Literature Review Rough Draft


How far can we push human sensory perception? Visual perception, for example, is an active process rather than a passive one. In order to understand what we see, our body relies on sensorimotor input (body movement) and other sensory inputs, including phenomenological interpretation, to draw conclusions about the information perceived.(1) Cognitive psychology is a basis for many HCI developments, although much of HCI is designed around functionality rather than visual perception, focusing more on the ability to understand and control through cognition. Motion graphics and visual effects animation operate on the same sensorimotor theory, except that they are restricted to the spatial dimensions within the screen.

Visual effects and motion graphics are based on the idea of active viewing through movement, often introducing a slow multi-dimensional drift to reinforce understanding while the audience absorbs the text and graphical information. Through movement, we also get a spatial understanding of the scripted environment. Therefore, even though screen-based visual effects and animation are displayed on a flat screen, the space within is portrayed with the possibility of greater or even infinite dimensions. There are not many physical interaction projects based on the visual effects language, but existing projects can be interpreted in relation to some aspects of VFX techniques and philosophies.

Spatial recognition through sensorimotor activity is crucial to our perceptual understanding. In the 1960s, Richard Held and Alan Hein conducted a famous study, "Movement-produced stimulation in the development of visually guided behavior." The experiment placed two kittens on either side of a device resembling a classic weight scale: one kitten was allowed to walk with its limbs protruding, while the other was held fully suspended without the affordance of mobility. The kitten that actively explored the environment developed a normal visual system compared to the kitten that received only passive visual stimulation. (2)

Many researchers dealing with perceptual technologies develop ideas geared towards people with disabilities who lack one or more sensory inputs to the brain. As a result, not many of these developments are geared towards art and expression, mainly due to budget, profit, and marketing considerations. I can understand that, but much of this research could be very rich if applied to new ways of interacting with devices or tools for everyday use.

Through technology, one sense can be replaced with another. In psychology this is called sensory substitution, a term coined by Paul Bach-y-Rita in 1969. (3) His experiment used a camera that turned a pixelated image, an interpretation of dark and light pixel contrast, into a braille-like pattern of points pressed against the back, using the sense of touch to understand imagery. Another project implements a similar transfer, this time mapping pixelated images to a sensation on the tongue using small electrical emitters laid out in a grid across the surface of one of the most sensitive locations on the body.
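
The basic pipeline behind these systems can be sketched in a few lines: downsample a camera frame to a coarse grid of on/off cells that a tactile actuator array (back or tongue display) could render. The grid size and threshold below are my own assumptions, not the parameters of Bach-y-Rita's device.

```python
# Camera frame -> coarse binary grid for a hypothetical tactile actuator array.
import cv2
import numpy as np

GRID_W, GRID_H = 20, 20       # resolution of the assumed actuator grid

def frame_to_tactile(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    coarse = cv2.resize(gray, (GRID_W, GRID_H), interpolation=cv2.INTER_AREA)
    return (coarse > 128).astype(np.uint8)   # 1 = actuator on, 0 = off

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if ok:
        grid = frame_to_tactile(frame)
        # Print the grid as text in place of driving real actuators.
        for row in grid:
            print("".join("#" if cell else "." for cell in row))
```
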
Sensory hijack is about borrowing from one sense to apply to another. It is similar to sensory substitution, but with subtle differences. A couple of examples I like to bring in here are "FingerSight" (4), which uses reflected laser beams to give haptic feedback on the side of the fingers, allowing the blind to navigate, and "The vOICe" (5), a device and system developed to let the blind see with sound: the design of the pitch mapping engages the neural pathways in the brain that are designed for sight. Another term for this phenomenon is "artificial synesthesia."
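
A simplified sketch of the kind of image-to-sound mapping The vOICe is built around: scan the image left to right, map a pixel's height to pitch and its brightness to loudness, and mix the columns into a short sound. The frequencies, resolution, and timing here are assumptions for illustration, not the actual vOICe parameters.

```python
# Image-to-sound scan: columns left to right, height -> pitch, brightness -> volume.
import wave
import numpy as np

SR = 22050                 # sample rate
COLS, ROWS = 64, 32        # scan resolution
COL_SECONDS = 0.02         # time spent on each image column

def image_to_audio(img):
    """img: 2D array (ROWS x COLS) of brightness in [0, 1]."""
    freqs = np.linspace(2000, 200, ROWS)        # top of the image = high pitch
    n = int(SR * COL_SECONDS)
    t = np.arange(n) / SR
    sound = []
    for c in range(COLS):
        col = sum(img[r, c] * np.sin(2 * np.pi * freqs[r] * t) for r in range(ROWS))
        sound.append(col / ROWS)
    return np.concatenate(sound)

if __name__ == "__main__":
    img = np.zeros((ROWS, COLS))
    img[ROWS // 2, :] = 1.0                     # a bright horizontal line
    audio = image_to_audio(img)
    data = (audio / np.max(np.abs(audio)) * 32767).astype(np.int16)
    with wave.open("scan.wav", "wb") as f:      # write the scan as a WAV file
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SR)
        f.writeframes(data.tobytes())
```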

The BMW kinetic sculpture by ART+COM takes particle animation, infused with Gestalt psychology's theory of "emergence," and executes the idea in the physical, interactive world. (6) This installation brings animated, synchronized particles into physical space, allowing the viewer to experience the effect in person and to view it from various angles. It is one of the inspirations drawing me toward translating the visual effects language into the real world, rather than keeping it on screen and viewed only from the point of view designated by its creators.
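
The choreography idea can be sketched abstractly: every physical element samples one shared animation function, so individually simple motions read as an emergent whole. The grid size, wave shape, and timing below are my own assumptions, not data from the installation.

```python
# Shared animation function sampled per element: a travelling ripple of heights.
import math

COLS, ROWS = 16, 8          # grid of suspended elements
AMPLITUDE = 0.3             # metres of vertical travel (assumed)

def target_heights(t):
    """Height target for every element at time t."""
    heights = []
    for r in range(ROWS):
        row = []
        for c in range(COLS):
            d = math.hypot(c - COLS / 2, r - ROWS / 2)   # distance from centre
            row.append(AMPLITUDE * math.sin(2 * math.pi * (0.5 * t - 0.15 * d)))
        heights.append(row)
    return heights

if __name__ == "__main__":
    frame = target_heights(t=1.0)
    print([round(h, 2) for h in frame[0]])   # height targets for the first row
```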

Optical illusions are a way to inform us about how human perception works. In a TED interview, neuroscientist/artist Beau Lotto discusses his perception-bending projects.(7) Some are installation art, some are smartphone apps, and others are object-based speculative projects, many of which Lotto describes as synesthetic experiences.

Through these examples, and more out in the world, one can see the vast variety of implementations that manipulate perception with technology. Many are used for medical purposes, and there seems to be a large gap between these technologies and HCI. Yet once a similar "artificial synesthesia" is implemented in a speculative project for the everyday user, or in a communicative art installation, it becomes an impressive experience. The way humans perceive isn't a one-to-one interaction; it involves sensorimotor activity as well as cognition to comprehend the message being communicated through these speculative objects or art installations. Human interactions should not be flat. If an interaction is required to be on a flat surface such as a digital reader or personal computer, the interface should introduce some dimensionality for the audience to fully engage. The ultimate goal is giving dimensional affordances to every possible human-computer interaction technology.

Bibliography

1. Noë, Alva. Action in Perception. Cambridge, MA: MIT, 2004. Print.

2. Held, R. and Hein, A. (1963). "Movement-produced stimulation in the development of visually guided behavior." Journal of Comparative and Physiological Psychology 56(5): 872–876.

3. Bach-y-Rita, P., Collins, C. C., Saunders, F., White, B., and Scadden, L. (1969). "Vision substitution by tactile image projection." Nature 221: 963–964.

4. Stetten, G., Klatzky, R., Nichol, B., Galeotti, J., Rockot, K., Zawrotny, K., Weiser, D., Sendgikoski, N., Horvath, S., 2007. "Fingersight: Fingertip visual haptic sensing and control." In: Haptic, Audio and Visual Environments and Games (HAVE 2007), IEEE International Workshop on, pp. 80–83. http://dx.doi.org/10.1109/HAVE.2007.4371592

5. The vOICe. Synthetic vision through auditory video representations. http://www.seeingwithsound.com/

6. ART+COM, Kinetic Sculpture. BMW Museum, Munich, 2008. http://www.artcom.de/

7. Beau Lotto (neuroscientist and artist) – perception-bending work. http://blog.ted.com/2009/10/08/beau_q_and_a/

Perceptual Exaptations Manifesto

Perceptual Exaptation is:

Is an exploration to invent new uses for human visual perception, whether the sensation and phenomenon of seeing or the act of physically directing the eye. It plays with various physical parts of the human eye, including the iris, retina, fovea, and blind spot, to create new kinds of functional, productive interactions.

Is a series of investigations involving the physical human eye, concepts from cognitive psychology, Gestalt psychology, and the language of visual effects.

Is a translation of visual effects techniques to the tangible world using motors and sensors embedded in mechanical parts.

Is using the structural language of visual effects as a vehicle to create tangible apparatuses that introduce new uses for existing bodily functions.

Is open to experimentation with extending human perceptual functions to other locations on the body or in the physical world.

Is open to experimentation involving two or more modes of interaction and two or more people. Is actively interactive.

Perceptual Exaptation is not:

Is not an optical illusion.

Is not a magic trick to deceive the human eye.

Is not an engineering-dominated project, but rather one that is more expressive with metaphors.

Article Summary

Critique of Alva Noë's book "Action in Perception"
- by Ned Block

This article lays out the fundamental ideas of Noë's book Action in Perception along with the writer's thoughts and criticisms. There are several intriguing and provocative explanations. It combines the neuroscience of perception and the phenomenology of experience with an appreciation of the psychology. Noë believes that the process of visual perception depends on interactions between the observer and the spatial environment and includes multiple senses, not only the eye. There is a great emphasis on the idea that vision is not passive but an active act of seeing, along with touch or body movement to confirm what is perceived with additional information. The act of body movement constructs the experience of visual perception.
Noë advances the idea of the "enactive view," explaining that "Perceptual experience, according to the enactive approach, is an activity of exploring the environment drawing on knowledge of sensorimotor dependencies and thought" (p. 228). Sensorimotor contingency explains the way sensory stimulation varies as you move. Sensorimotor knowledge is knowing how objective appearances change as you move; it is a matter of knowing how rather than knowing that. Mind-body information will inevitably be a focus of attention for the enactive point of view.
The enactive view means that perceptual experience depends on sensorimotor contingencies (body movement and other sensory information used to form hypotheses). Our tactile experience of a solid object depends on sensing its resistance when we push against it, showing a dependency of experience on action.
Humans and other primates have two distinct visual systems: a conscious visual system that begins at the back of the brain and moves toward the bottom and sides (the ventral system), and a much less clearly conscious one running from the back toward the top of the brain (the dorsal system). The conscious ventral system is slow, directed toward long-term visual planning of motion, and uses object-centered observations from a stereotypical view instead of the view from the current position. The egocentric dorsal system is fast, representing distance and orientation without memory or color vision in order to guide action. For example, dribbling a basketball down the court while avoiding obstacles uses the dorsal system. Another example is walking down a sandy beach, where our feet avoid stones that we don't seem to see. The dorsal system is connected more strongly to peripheral vision than the ventral system is.
The author pushes back against Noë's position with two points. First, the information perceived through experience is in the brain and doesn't include the rest of the body. Second, although body movement outputs instructions that affect perceptual experience, the experience that is understood is abstract. I disagree with Block's defense, because I too believe that active perceptual experience requires feedback from multiple senses, not just the passive sensation of the eye.

Abstract V.04

Abstract

Tangible Visual Effects is a thesis investigation into the structural components of screen-based visual effects (e.g., animatronics, multiplicity, scripted depth of field, type animation, mutation/transformation, lighting effects, augmented interfaces) that seeks to explore ways in which they can be deployed to form a new language of tangible interactions. I believe the structural components of visual effects techniques can change future interaction, introducing and developing new uses that inform future perceptive interactions. This entails undertaking experiments that explore the possibility of extending the visual experience to other senses and subverting expectations of "familiar interactions" through the tools of illusion, mutation, discovery, and stimulated imagination.

References

There are certain attributes in devices that, when manipulated, create a sense of illusion. It is a natural phenomenon to delude the mind, just as a chameleon changes colors to deceive predators while sitting in a tree. I am creating illusions to delude the eye, experimenting with perception and the subconscious to allow each individual a variation of the experience.

Maurice Merleau-Ponty observed that "consciousness exists in the world and experience of things in the world exist in the consciousness." Perceptive manipulation devices can alter that consciousness. A recent experiment by Ehrsson in 2007 used head-mounted video to allow users to perceive a variety of out-of-body experiences.

The galvanic vestibular stimulation system by Fitzpatrick investigates a system that can manipulate the sense of balance, reflexes, and proprioception. Startling sensations such as sirens, bright flashes, or even vibrations against the skin can unconsciously divert our attention. George Stetten devised the concept of FingerSight in 1999; it extends human sight to the fingers by converting reflected laser light into vibrations felt on the surface of the finger, allowing the device to mimic the phenomenon of synesthesia.

The Approach

Professionally I have resided in the world of motion-based narratives; in conjunction with this I have also been attracted to breaking and making things. These have informed many of my projects during my two years of master's studies: 11th Finger (extension of the body), Velit (object animism and mythology), and Healthbrush (experience design in our everyday lives). Through experimentation and making, I plan to further develop devices and interfaces, even small single-function objects, around perception manipulation and physical animation. I will also collaborate with other researchers to develop hybrids of thesis concepts and tackle a broader spectrum.

Investigation Questions:

• if visual effects in film have the capability to convincingly deceive us, and draw us into a scripted narrative, what would be the implications of the application of embedded visual effects in the physical world?

• would they become entertainment, or could they have a deeper effect in subverting our expectations?

• what would be the implications if our objects played with inversions of our expectations: on our sensory perception, our relationship to objects, and our entertainment?

• if the frequency of light waves and audio waves can be manipulated enough to cross over in our neurological responses, how can this phenomenon be synthesized into a comprehensible narrative of personal interpretations?

• when the physical interface takes on greater responsiveness to touch and control with hands and feet, beyond digital responses that stimulate human perception, what possibilities can arise for memory, recovery, and discovery?

 

Abstract V.03

Sight, Touch, Mind Bending Apparatus- Thesis abstract by Link Huang

There’s the saying, “eyes are the windows to your soul”. This window is a two way access for massive amounts of quick interactions between the viewed and the viewer. Although to completely rely on sight can often mislead and deliver false information to the viewer, tricking the mind to think one thing is bigger than another, one color looks different than the compared just because its environments are different, or even unexpected reveal of objects. This occurs in real life but even more so on film and Television. My inspirations are driven by motivations from techniques used in visual effects and optical illusions in conjunction with physical mechanics. Techniques such as mechanical transformations, green screen composite, particle animations, sense of 3 dimensional space, or even holographic interfaces. Some methods are by embedding artificial sensors (such as cameras, microphones, temperature, proximity, and touch sensors) in tangible forms to allow the separation of perception and body.

These attributes are embedded in devices to manipulate perception and consciousness, allowing each individual a variation of the experience. Maurice Merleau-Ponty observed that "consciousness exists in the world and experience of things in the world exist in consciousness." Perceptive manipulation devices can alter that consciousness. A recent experiment by Ehrsson in 2007 used head-mounted video to allow users to perceive a variety of out-of-body experiences.

The galvanic vestibular stimulation system by Fitzpatrick investigates a system that can manipulate the sense of balance, reflexes, and proprioception. Startling sensations such as sirens, bright flashes, or even vibrations against the skin can unconsciously divert our attention. George Stetten devised the concept of FingerSight in 1999; it extends human sight to the fingers by converting reflected laser light into vibrations felt on the surface of the finger, allowing the device to mimic the phenomenon of synesthesia.

My professional background ranges from storyboarding to animation in the broadcast motion graphics and film industry; in conjunction with my personal hobby of material craft and mechanical work, it has led me toward my obsession. Example projects I have worked on include 11th Finger, Velit, and Healthbrush. 11th Finger is a hat-like device worn on the head with a finger on top, driven by two servo motors and readouts from an infrared camera that senses the direction of the user's left eye and points the finger in the same direction. Velit was a project investigating object animism: an autonomous wishing machine that takes on a life of its own. The creature turns left and right to follow the movement of the person standing in front of it and opens its mouth at an intimate distance when the person's face comes close, revealing a dandelion-like wishing star made of several tiny LEDs and an audio sensor. The user can also interact with it by feeding it a wish and a breath of air, blowing into the star. Feeding Velit a wish excites the creature and changes its LED tail pattern. When there is no interaction, the creature dozes off and falls asleep. The third project is Healthbrush, a concept demo of a toothbrush that analyzes the user's blood during the daily routine and gives health statistics, assessments, and suggestions.

The thesis investigation I am proposing translates the language of visual effects, as applied in an immersive film experience (e.g., multiplicity, duplication), into tangible real-world things. The goal is to experiment with the possibility of manipulating perception, extending the visual experience of special effects (through theories like Gestalt and cognitive psychology) to other senses, and in effect subverting expectations of "familiar interactions," reintroducing the possibility of discovery, and stimulating imagination.

Investigation Questions:

• if visual effects in film have the capability to convincingly deceive us, and draw us into a scripted narrative, what would be the implications of the application of embedded visual effects in the physical world?

• would they become entertainment, or could they have a deeper effect in subverting our expectations?

• what would be the implications if our objects subverted our expectations: on our sensory perception, our relationship to objects, and our entertainment?

• if the frequency of light waves and audio waves can be manipulated enough to cross over in our neurological responses, how can this phenomenon be synthesized into a comprehensible narrative of personal interpretations?

• when the physical interface takes on greater responsiveness to touch and control with hands and feet, beyond digital responses that stimulate human perception, what possibilities can arise for memory, recovery, and discovery?

 

Abstract V.02

Thesis Workshop 09-17-2011: Thesis Intent Wk02

by Link Huang

Tangible Visual Effects

Interests for Investigation

There’s the saying, “eyes are the windows to your soul”. This window is a two way access for massive amounts of quick interactions between the viewed and the viewer. Although to completely relying on sight can often mislead and deliver false information to the viewer, tricking the mind to think one thing is bigger than another, or one color looks different than the compared just because its environments are different or even unexpected reveal of objects. This occurs in real life but even more so on film. Filmmakers often use illusional techniques to complete the visual compositions displayed for storytelling. These techniques such as mechanical transformations, green screen composited scenes, particle animations, sense of 3 dimensional space within, or even holographic interfaces. It is often technologies science fiction writers talk about in their stories, given them new affordances and functions in their fictional portray of the future. When the book is transformed into a feature film, designers are often required to think about not only the aesthetics but also the way they appear. These movement must feel natural and seamless in the way it appears and disappear. In many ways visual effect animations are illusions created to enhance the story, giving credibility for the the audience to believe everything they are watching actually exist.

My interest began with an investigation of "Hollywood and science fiction's display of technologies, compared to the perspective shift of the crude reality when those technologies are implemented in the real world." At this initial stage, I had the notion of investigating glitches and imperfections as technology's flaws and exploring their possibilities. Then, as I developed my projects during the courses over the summer, I began to realize my interest lies in the visual effects aspect of these fictional technologies. My goal is to look into ways to bring these visual effects to the tangible world. I would like to create inventive and innovative projects while staying away from carbon-copy reflections of existing visual effects used in film.

 

Expertise and Obsession

My professional background derives from a handful of years doing motion graphics, from storyboarding through animation, combined with hobbies such as building RC vehicles and craft work exploring materials. A few example projects that led me to this obsession include 11th Finger, Velit, and Healthbrush. 11th Finger is a hat-like device worn on the head with a finger on top, driven by two servo motors and readouts from an infrared camera that senses the direction of the user's left eye and points the finger in the same direction. Velit was a project investigating object animism: an autonomous wishing machine that takes on a life of its own. The creature turns left and right to follow the movement of the person standing in front of it and opens its mouth at an intimate distance when the person's face comes close, revealing a dandelion-like wishing star made of several tiny LEDs and an audio sensor. The user can also interact with it by feeding it a wish and a breath of air, blowing into the star. Feeding Velit a wish excites the creature and changes its LED tail pattern. When there is no interaction, the creature dozes off and falls asleep. The third project is Healthbrush, a concept demo of a toothbrush that analyzes the user's blood during the daily routine and gives health statistics, assessments, and suggestions. Interaction with a function, realized through visual effects techniques in the tangible world, became a large part of the interest I want to explore and infuse with my expertise.

Questions

I intend to investigate questions like:

• Can animistic characteristics of these devices be utilized in terms of a narrative? Can we find out what sorts of "power" fit these narratives?

• What would be some amazingly powerful information, if the user could only get at it somehow?

• Can the device be some kind of mounted display, something you essentially wear?

• What kind of experience does the user go through while using these tangible interactions?

• How can the sense of discovery play into the experience?

• How can system, network, or social media implications tie in with the design?