In human–computer interaction, an organic user interface (OUI) is defined as a user interface with a non-flat display.[1] After Engelbart and Sutherland's graphical user interface (GUI), which was based on the cathode ray tube (CRT), and Kay and Weiser's ubiquitous computing, which was based on the flat-panel liquid-crystal display (LCD), OUI represents one possible third wave of display interaction paradigms, pertaining to multi-shaped and flexible displays. In an OUI, the display surface is always the focus of interaction, and may actively or passively change shape upon analog (i.e., as close to non-quantized as possible) inputs.[1] These inputs are provided through direct physical gestures rather than through indirect point-and-click control. The term "organic" in OUI was derived from organic architecture, referring to the adoption of natural form to design a better fit with human ecology. The term also alludes to the use of organic electronics for this purpose.
Organic user interfaces were first introduced in a special issue of the Communications of the ACM in 2008.[1] The first International Workshop on Organic User Interfaces took place at CHI 2009 in Boston, Massachusetts. The second workshop took place at TEI 2011 in Madeira, Portugal. The third workshop was held at MobileHCI 2012 in Monterey, California, and the fourth workshop at CHI 2013 in Paris, France.
According to Vertegaal and Poupyrev,[1] there are three general types of organic user interface:
Flexible (or deformable) user interfaces: When flexible displays are deployed, shape deformation, e.g., through bends, is a key form of input for OUI. Flexible display technologies include flexible OLED (FOLED) and flexible E Ink, or can be simulated through 3D active projection mapping.
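Bend input of this kind is typically an analog sensor signal that the interface discretizes into gestures. The following sketch is purely illustrative and not drawn from any of the cited systems; the normalized sensor range, threshold value, and gesture names are all hypothetical.

```python
def classify_bend(reading, threshold=0.3):
    """Map a normalized bend-sensor reading in [-1, 1] to a gesture label.

    Positive values represent a bend toward the user, negative values a
    bend away; deflections smaller than the threshold are treated as noise.
    """
    if reading >= threshold:
        return "bend-up"
    if reading <= -threshold:
        return "bend-down"
    return "flat"
```

Thresholding like this is what turns a continuously deforming surface into a discrete input vocabulary, while the raw analog value can still drive continuous parameters such as scrolling speed.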
Shaped user interfaces: Displays with a static, non-flat shape. The physical shape is chosen so as to better support the main function of the interface. Shapes may include spheres or cylinders, or take the form of everyday objects.[2]
Actuated (or kinetic) user interfaces: Displays with a programmable shape controlled by a computer algorithm. Here, display shapes can actively adapt to the physical context of the user, the form of the data, or the function of the interface. An extreme example is that of Claytronics: fully physical 3D voxels that dynamically constitute physical 3D images.
Holman and Vertegaal present three design principles that underlie OUI:[2]
Input equals output: In traditional GUIs, input and output are physically separated: output is generated graphically on the screen on the basis of input provided by a control device such as a mouse. A key feature of OUI is that the display surface and its physical deformations are always the locus of user interaction.
Function equals form: This principle, coined by Frank Lloyd Wright, means that the shape of an interface determines its physical functionality, and vice versa. Shapes should be chosen such that they best support the functionality of the interface. An example is a spherical multitouch interface,[3] which is particularly suited to geographic information displays that were previously limited to distorted flat projections of spherical earth data.
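The fit between a spherical display and geographic data can be made concrete: a touch point on the sphere's surface maps directly to latitude and longitude, with no map projection involved. This is an illustrative sketch, not code from the cited spherical multitouch system; it assumes touch coordinates are reported relative to the sphere's center, with the z-axis through the poles.

```python
import math

def sphere_point_to_latlon(x, y, z):
    """Convert a touch point (x, y, z) on a spherical display, measured
    from the sphere's center, to geographic latitude and longitude in
    degrees. No flat projection (and thus no distortion) is needed."""
    r = math.sqrt(x * x + y * y + z * z)
    lat = math.degrees(math.asin(z / r))   # angle above the equator
    lon = math.degrees(math.atan2(y, x))   # angle around the z-axis
    return lat, lon
```

For example, a touch at the "north pole" of the display yields latitude 90°, and a touch on the equator yields latitude 0°.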
Form follows flow: OUIs physically adapt to the context of a user's multiple activities, e.g., by taking on multiple shapes. An example of this is the "clamshell" phone, where the physical metaphor of altering the phone's shape (by opening it) alters the state of the user interface (to open communications). Another example is folding a thin-film tablet PC into a smaller, pocket-sized smartphone for mobility.
Early examples of OUIs include Gummi, a rigid prototype of a flexible credit card display,[4] PaperWindows,[2] featuring active projection-mapped pieces of paper, the Microsoft Sphere, one of the first spherical multitouch computers,[3] and DisplayObjects (rigid objects with displays wrapped around them).[2] PaperPhone[5] was one of the first OUIs to introduce bend gestures on a real flexible screen. It featured a flexible electrophoretic display and an array of 5 bend sensors that allowed for user navigation of content. Examples of actuated OUIs include shape changing prototypes like MorePhone and Morphees.[6] The Nokia Kinetic,[7] a flexible smartphone that allows input techniques such as bend, twist and squeeze, and the Samsung Youm,[8] are early commercial prototypes of OUIs. It is widely expected that OUIs will be introduced on the market by the year 2018.
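A sensor array like PaperPhone's can be reduced to navigation commands by picking the most strongly deflected sensor. The sketch below is a hypothetical illustration of that idea, not PaperPhone's actual gesture recognizer; the sensor ordering, threshold, and command names are assumptions.

```python
def navigation_command(sensors, threshold=0.4):
    """Map an array of five normalized bend-sensor readings (e.g., four
    corners plus the spine of a flexible display) to a navigation command.

    The sensor with the largest absolute deflection wins; if no sensor
    exceeds the threshold, the bend is treated as noise and ignored.
    """
    commands = ["page-forward", "page-back", "zoom-in", "zoom-out", "select"]
    idx = max(range(len(sensors)), key=lambda i: abs(sensors[i]))
    if abs(sensors[idx]) < threshold:
        return None
    return commands[idx]
```

A winner-take-all mapping like this keeps simultaneous small deflections, which are unavoidable when bending a continuous sheet, from triggering multiple commands at once.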
OUIs differ from natural user interfaces (NUIs) in that NUIs are limited to touch or remote gestural interactions with a flat display. Although remote gestural interaction violates the principle of input equals output, OUIs generally subsume NUIs. OUI is also a successor to, and a form of, tangible user interface that always features a bitmapped display skin around its multi-shaped body. Finally, all OUIs are examples of haptic technologies, as their physical shapes, like real objects, provide passive tactile-kinaesthetic feedback even in non-actuated cases.
Original source: https://en.wikipedia.org/wiki/Organic_user_interface