Literature Review Rough Draft

How far can we push human sensory perception? Visual perception, for example, is an active process rather than a passive one. To understand what we see, the body relies on sensorimotor input (body movement) and other sensory cues, including phenomenological interpretation, to draw conclusions about what is perceived. (1) Cognitive psychology underpins many HCI developments, yet most interfaces are designed around functionality and cognitive control rather than visual perception. Motion graphics and visual effects animation draw on the same sensorimotor theory, except that they are restricted to the spatial dimensions within the screen.

Visual effects and motion graphics build on this idea of active viewing through movement: a slow, multi-dimensional drift reinforces spatial understanding while the audience absorbs textual and graphical information. Through movement we also gain a spatial sense of the scripted environment, so even though screen-based visual effects and animation are displayed on a flat surface, the space within can be portrayed as having greater or even infinite dimensions. Few physical interaction projects are based explicitly on the visual effects language, but existing projects can be interpreted in relation to VFX techniques and philosophies, as the sketch below illustrates.
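
To ground the claim that movement produces spatial understanding, consider the parallax cue motion designers exploit: under a slow camera drift, layers at different depths shift on screen by different amounts. The following Python sketch is my own illustration, not drawn from any of the cited sources, and every parameter in it is invented.

    import math

    # Illustrative sketch: under a slow sinusoidal camera drift, each
    # flat layer's on-screen offset scales inversely with its assumed
    # depth, which is what makes a stack of flat layers read as deep
    # space. Depths, drift amplitude, and period are assumptions.

    def parallax_offsets(layer_depths, t, drift_px=30.0, period_s=10.0):
        """Horizontal on-screen offset in pixels for each layer at
        time t (seconds), given a slow side-to-side camera drift."""
        cam_x = drift_px * math.sin(2 * math.pi * t / period_s)
        # Nearer layers (smaller depth) shift more than distant ones.
        return [cam_x / depth for depth in layer_depths]

    # parallax_offsets([1.0, 2.0, 8.0], t=2.5) -> [30.0, 15.0, 3.75]:
    # the nearest layer drifts eight times farther than the farthest,
    # and that differential motion is the depth cue.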

Spatial recognition through sensorimotor activity is crucial to our perceptual understanding. In the 1960s, Richard Held and Alan Hein conducted a famous study on "movement-produced stimulation in the development of visually guided behavior." The experiment placed two kittens on either side of an apparatus resembling a classic weighing scale. One kitten could walk with its limbs free; the other was held fully suspended, without the affordance of mobility. The kitten that actively explored the environment developed a normal visual system, while the kitten that received only passive visual stimulation did not. (2)

Many researchers working with perceptual technologies develop ideas geared toward people with disabilities, who lack one or more sensory inputs to the brain. Comparatively few of these developments are aimed at art and expression, largely for reasons of budget, profit, and marketing. That is understandable, but much of this research could be very rich if applied to new ways of interacting with everyday devices and tools.

Through technology, one sense can be replaced with another. In psychology this is called sensory substitution, a term coined by Paul Bach-y-Rita in 1969. (3) His experiment used a camera that turned a pixelated image into a braille-like pattern, an interpretation of dark and light pixel contrast, pressed onto the back, so that touch could be used to understand imagery. Another project implements a similar idea, transferring pixelated images to a sensation on the tongue using a grid of small electrical emitters placed across one of the most touch-sensitive surfaces of the body.
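
To make the dark/light-to-touch mapping concrete, here is a minimal Python sketch of one plausible pipeline. It is my own illustration of the general approach, not Bach-y-Rita's actual system: the camera frame is reduced to the actuator grid's resolution, and each cell's average brightness decides whether its tactor is raised.

    # Hypothetical sketch of a pixel-to-tactor pipeline. Grid size and
    # threshold are invented for illustration; assumes the frame is at
    # least rows x cols pixels.

    def frame_to_tactor_grid(frame, rows=20, cols=20, threshold=128):
        """frame: 2D list of 0-255 grayscale values.
        Returns a rows x cols grid of booleans (True = tactor raised)."""
        h, w = len(frame), len(frame[0])
        grid = []
        for r in range(rows):
            row = []
            for c in range(cols):
                # Average the block of pixels falling in this cell.
                y0, y1 = r * h // rows, (r + 1) * h // rows
                x0, x1 = c * w // cols, (c + 1) * w // cols
                block = [frame[y][x] for y in range(y0, y1)
                                     for x in range(x0, x1)]
                row.append(sum(block) / len(block) > threshold)
            grid.append(row)
        return grid

    # Example: a 4x4 frame with a bright patch in one corner.
    demo = [[255, 255, 0, 0],
            [255, 255, 0, 0],
            [0,   0,   0, 0],
            [0,   0,   0, 0]]
    print(frame_to_tactor_grid(demo, rows=2, cols=2))
    # -> [[True, False], [False, False]]
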
Sensory hijack is about borrowing one sense to serve another. It is similar to sensory substitution, but with slight yet noticeable differences. A couple of examples illustrate this. "Fingersight" (4) uses sound waves or reflected laser beams to give haptic feedback on the side of the fingers, allowing the blind to navigate. "The vOICe" (5), also developed for the blind, is a device and system that lets its users see with sound; the design of the pitch triggers the neural pathways in the brain that are normally devoted to sight. Another term for this phenomenon is "artificial synesthesia."
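
The vOICe's published mapping scans the image column by column, turning vertical position into pitch and brightness into loudness. The sketch below imitates that style of mapping in Python; the frequency range, scan time, and sample rate are illustrative assumptions, not the device's real parameters.

    import math

    # Hedged sketch of a "The vOICe"-style image-to-sound mapping:
    # columns are scanned left to right over time, row height maps to
    # pitch (higher rows -> higher frequency), and pixel brightness
    # maps to loudness. Assumes at least two rows.

    def image_to_soundscape(img, sample_rate=8000, scan_seconds=1.0,
                            f_low=500.0, f_high=5000.0):
        """img: 2D list of 0-255 grayscale values (row 0 = top).
        Returns mono audio samples in [-1, 1]."""
        rows, cols = len(img), len(img[0])
        samples_per_col = int(sample_rate * scan_seconds / cols)
        samples = []
        for c in range(cols):                      # left-to-right scan
            for n in range(samples_per_col):
                t = (c * samples_per_col + n) / sample_rate
                s = 0.0
                for r in range(rows):
                    # Top rows get the highest pitch.
                    freq = f_low + (f_high - f_low) * (rows - 1 - r) / (rows - 1)
                    s += (img[r][c] / 255.0) * math.sin(2 * math.pi * freq * t)
                samples.append(s / rows)           # keep the mix in range
        return samples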

The BMW kinetic sculpture by ART+COM (6) takes particle animation, infused with Gestalt psychology's theory of "emergence," and executes the idea in the physical interactive world. The installation brings animated, synchronized particles into physical space, allowing viewers to experience the effect in person and from various angles rather than only from a point of view designated by the creators. It is one of the inspirations behind my interest in translating the visual effects language into the real world rather than keeping it on screen; a sketch of the underlying control idea follows.
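
As a rough sense of how an animated particle system can drive physical hardware, here is a speculative Python sketch. It is my assumption about the general approach, not ART+COM's actual control software: each sphere chases the height the choreography assigns it, with movement clamped to what a real winch motor could deliver per frame.

    # A minimal sketch (assumed, not ART+COM's code): every sphere in
    # the grid chases the height choreographed for it by the particle
    # animation, with per-frame movement clamped so the physical motor
    # is never asked to exceed its maximum speed.

    def step_sculpture(current, target, max_step=0.01):
        """current, target: 2D lists of sphere heights in metres.
        Returns the next frame's heights, limited to max_step per frame."""
        nxt = []
        for cur_row, tgt_row in zip(current, target):
            row = []
            for cur, tgt in zip(cur_row, tgt_row):
                delta = max(-max_step, min(max_step, tgt - cur))
                row.append(cur + delta)
            nxt.append(row)
        return nxt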

Optical illusions inform us about the way human perception works. In a TED interview, the neuroscientist and artist Beau Lotto (7) discusses his perception-bending projects, some of which are installation art, some smartphone apps, and some object-based speculative projects. Lotto describes many of them as synesthetic experiences.

These examples, and others like them, show the wide variety of ways perception can be manipulated with technology. Many are used medically, and there remains a large gap between these technologies and HCI. Yet once a similar "artificial synesthesia" is implemented in a speculative project for the everyday user, or in a communicative art installation, it becomes an impressive experience. The way humans perceive is not a one-to-one interaction; it involves motor sensing as well as cognition to comprehend the message communicated through these speculative objects or installations. Human interaction should not be flat. If it must live on a flat surface such as a digital reader or a personal computer, the interface should introduce some dimensionality for the audience to fully engage. The ultimate goal is to give dimensional affordances to every possible human-computer interaction technology.

Bibliography

1. Noë, Alva. Action in Perception. Cambridge, MA: MIT, 2004. Print.

2. Held, R., and A. Hein. "Movement-produced stimulation in the development of visually guided behavior." Journal of Comparative and Physiological Psychology 56.5 (1963): 872-876.

3. Bach-y-Rita, P., C. C. Collins, F. Saunders, B. White, and L. Scadden. "Vision substitution by tactile image projection." Nature 221 (1969): 963-964.

4. Stetten, G., R. Klatzky, B. Nichol, J. Galeotti, K. Rockot, K. Zawrotny, D. Weiser, N. Sendgikoski, and S. Horvath. "Fingersight: Fingertip visual haptic sensing and control." Haptic, Audio and Visual Environments and Games (HAVE 2007), IEEE International Workshop on. 2007. 80-83. http://dx.doi.org/10.1109/HAVE.2007.4371592

5. The vOICe: Synthetic Vision through Auditory Video Representations. http://www.seeingwithsound.com/

6. ART+COM. Kinetic Sculpture. BMW Museum, Munich, 2008. http://www.artcom.de/

7. Lotto, Beau (neuroscientist and artist). Interview on perception-bending work. TED Blog, 8 Oct. 2009. http://blog.ted.com/2009/10/08/beau_q_and_a/
