Wednesday, November 25, 2009

Tangible Media for Design and Inspiration

An interesting talk by Hiroshi Ishii at Stanford University about his projects:

Sunday, November 22, 2009

multi touch display



iBar:



Microsoft Surface Application:

http://www.realtime-technology.com/flash/index_eng.php#/251

collaboration

An example of why collaboration is important and how it can change things :D





Message board: a system where people can post messages to the board from their own PCs; others can see them synchronously and comment on them.

http://www.adamfass.com/projects_frames.html

Monday, October 26, 2009

MoPlay

They use a touch display with tokens. Tokens are used to select the object of interest for comparison.

http://this-play.com/de/details.html

TouchLab

TouchLab - London DSEi 2009

TouchLab was developed in cooperation with NewMedia Yuppies GmbH and the Fraunhofer IGD. Spiral Studios was responsible for the overall art direction and created a style guide for the interface design of the multi-touch table and for the projections on the powerwall. In addition, Spiral Studios was responsible for the art direction and production of various motion graphics, 3D models, trailers, and typographic animations.

http://www.spiralstudios.eu/de/industrie/79-eads-touchlab-london-dsei-2009.html

Topology-Aware Navigation in Large Networks

Zooming and panning for navigating large networks, e.g. a flight network.

Thursday, October 15, 2009

10/GUI Interface

A new user-interface paradigm and a new way of multi-touch interaction by Clayton Miller.


Thursday, October 8, 2009

Touch and Write

A project from DFKI with a touch table and semantic zooming:


Friday, September 4, 2009

iCar Remote

Remotely control your car with an iPhone:

Tuesday, August 25, 2009

Friday, June 19, 2009

World's first mobile Augmented Reality browser

AMSTERDAM, Tuesday, June 16th, 2009. Mobile innovation company SPRXmobile launches Layar, the world's first mobile Augmented Reality browser, which displays real-time digital information on top of reality in the camera view of the mobile phone. While looking through the phone's camera lens, a user can see houses for sale, popular bars and shops, jobs, healthcare providers, and ATMs. The first country to launch Layar is The Netherlands. Launch partners are local market leaders ING (bank), funda (realty website), Hyves (social network), Tempo-team (temp agency) and Zekur.nl (healthcare provider).




How it works
Layar is derived from location-based services and works on mobile phones that include a camera, GPS and a compass. Layar is first available for handsets with the Android operating system (the G1 and HTC Magic). It works as follows: starting up the Layar application automatically activates the camera. The embedded GPS automatically knows the location of the phone and the compass determines in which direction the phone is facing. Each partner provides a set of location coordinates with relevant information, which forms a digital layer. By tapping the side of the screen the user easily switches between layers. This makes Layar a new type of browser that combines the digital and the real, offering an augmented view of the world.
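The mechanism described above can be sketched roughly: given the phone's GPS position and compass heading, the browser only has to decide which points of interest from the active layer fall inside the camera's field of view. Here is a minimal illustration in Python (the function names and the 60° field of view are my own assumptions, not from SPRXmobile):

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def visible_pois(phone_lat, phone_lon, heading, pois, fov=60):
    """Return names of POIs whose bearing from the phone lies
    within the camera's horizontal field of view."""
    result = []
    for name, lat, lon in pois:
        b = bearing(phone_lat, phone_lon, lat, lon)
        diff = (b - heading + 180) % 360 - 180  # signed angle difference
        if abs(diff) <= fov / 2:
            result.append(name)
    return result
```

Distance filtering and projecting each POI onto screen coordinates would come on top of this; the sketch only answers "is this point of interest in front of the camera?".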

Sunday, June 14, 2009

magic Interaction

A one-week work on interaction design for an interactive table. Funny :D

Designed in a one-week interaction design project at GSA (Glasgow School of Art), the video shows a concept for an interactive table that responds, via motion sensors, to the volume of people's conversations and to their gestures. OLED lights reflect the amount of input someone has by revealing a pattern on the table surface and by the lamp centerpiece getting brighter. It was initially inspired by intimate situations such as dating, where body language and equal input to a conversation are subconsciously noted and nonetheless very revealing. Both the concept and the video were created within a week.

Body Gesture Interaction Systems

This video shows the results of the PhD thesis project titled "Markerless Full-Body Human Motion Capture and Combined Motor Action Recognition for Human-Computer Interaction", developed by Luis Unzueta (http://luis-unzueta.vndv.com/) at CEIT (http://www.ceit.es/mechanics/index.htm).



THE_SPACE_BETWEEN_US

'The space between us' is a motion-tracking installation that acts as a trigger for interaction between people. Most new-media communication systems today are channeled through the medium of written text, for example email, chat, and SMS. Our bodies remain passive, and the lack of non-verbal cues often leads to misunderstanding.

In this installation, participants are encouraged to use their bodies to explore new ways of expression through new media and body language. This creates a responsive environment which has the potential to generate interaction. The system responds to physical movement, location, proximity, body structure and posture. These parameters are collected in real time and then translated into an audio-visual language. 'The space between us' examines the computer as an agent for communication between people. It offers a playful experience which aims to bring people together and celebrate their non-verbal skills in communication.




XTR

Extreme Reality (XTR) is a world leader in 3D human-machine interaction, providing a software-based solution for 3D computer control using hand and body movements.

XTR developed proprietary real-time high-resolution software that analyzes 3D human motions using a single simple webcam without any additional accessories.

It allows users to interact with computers and gaming consoles, play games, and interact in virtual worlds using natural human motions instead of keyboards, mice, joysticks, or any other hardware device, and to use their hands as a virtual mouse.

XTR's universal interface can be tailored and seamlessly integrated into existing applications without modification. It also runs on mobile devices and may be ported to any embedded platform.



Project Natal Xbox 360: Full body interaction






Connecting the digital and physical worlds:

Friday, June 12, 2009

Multi-User Handheld Projector Demo

Funny interaction using projectors. I don't like the input device, but I like the interaction and some points they considered, like privacy of information and information sharing.

Cross-Dimensional Gestural Interaction Techniques

Interesting way of making information private using gestures:

Multi Display Systems

Multi-user, multi-display interaction:

They are using five displays (three on the walls, one multi-touch table, one tablet PC). The application is Google Earth. Each display shows different information on the map, such as roads or schools. Input is by touch on the multi-touch table and by stylus on the tablet PC.



Jonathan Grudin has worked in this area.


Car display:


Fraunhofer IITB: Multi-Display-Arbeitsplatz für den Krisenstab

Stitching method, from Microsoft Research:




E-conic: a perspective-aware interface for multi-display environments





This could be a good method for our Media Room when using the Microsoft Surface with the cube, with an additional tracking system.

Project Description:
Multi-display environments combine displays that can be at different locations from, and at different angles to, the user; as a result, it can become very difficult to manage windows, read text, and manipulate objects. We investigate the idea of perspective as a way to solve these problems in multi-display environments. We first identify basic display and control factors that are affected by perspective, such as visibility, fracture, and sharing. We then present the design and implementation of E-conic, a multi-display multi-user environment that uses location data about displays and users to dynamically correct perspective. We carried out a controlled experiment to test the benefits of perspective correction in basic interaction tasks like targeting, steering, aligning, pattern-matching and reading. Our results show that perspective correction significantly and substantially improves user performance in all these tasks.
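E-conic derives its corrections from tracked 3D positions of users and displays. As a back-of-the-envelope illustration of why correction helps, here is a minimal sketch (the function names and the single-axis simplification are mine, not from the paper): a display tilted by an angle away from the viewer foreshortens content by the cosine of that angle, so pre-stretching by the reciprocal makes it appear undistorted again.

```python
import math

def perspective_correction_scale(tilt_deg):
    """Factor to pre-stretch content drawn on a surface tilted by
    `tilt_deg` degrees away from the viewer: the tilt foreshortens
    content by cos(tilt), so drawing at 1/cos(tilt) cancels it."""
    if not 0 <= tilt_deg < 90:
        raise ValueError("tilt must be in [0, 90) degrees")
    return 1.0 / math.cos(math.radians(tilt_deg))

def corrected_size(width, height, tilt_deg):
    """Pre-stretched drawing size for a window on a tilted display,
    assuming foreshortening along the vertical axis only."""
    return width, height * perspective_correction_scale(tilt_deg)
```

At 60° of tilt the content must be drawn twice as tall to look undistorted from the user's viewpoint; the real system applies a full perspective warp rather than a single-axis scale.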







Interesting Working Groups in this area:

Advanced Interaction for Multi-display Environments:
http://hci.usask.ca/research/mdes.shtml


i-Land Roomware Second Generation

"We describe the i-LAND environment which constitutes an example of our vision of the workspaces of the future, in this case supporting cooperative work of dynamic teams with changing needs. i-LAND requires and provides new forms of human-computer interaction and new forms of computer-supported cooperative work. Its design is based on an integration of information and architectural spaces, implications of new work practices and an empirical requirements study informing our design. i-LAND consists of several roomware components, i.e. computer-augmented objects integrating room elements with information technology. We present the current realization of i-LAND in terms of an interactive electronic wall, an interactive table, two computer-enhanced chairs, and two bridges for the Passage-mechanism. This is complemented by the description of the creativity support application and the technological infrastructure. The paper is accompanied by a video figure in the CHI99 video program"




WeSpace:


People and Groups working in the area of Multi Display Environments:

Desney S. Tan:
http://research.microsoft.com/en-us/um/people/desney/projects.htm






CRISTAL on multitouch coffee table (Control of Remotely Interfaced Systems using Touch-based Actions in Living spaces)


This system enables remote control of interactive displays using a touch table.






More information can be found at:
http://mi-lab.org/projects/cristal/



Shared design space: Office of Tomorrow:

http://www.officeoftomorrow.org/index.php?id=8


A good workshop in the field of MDE with the title:
Beyond the Laboratory: Supporting Authentic Collaboration with Multiple Displays
at CSCW2008
http://workshops.fxpal.com/cscw2008/AcceptedPapers.aspx


Using a tablet PC with a large vertical display for editing and viewing photos. They use the concept of a Chinese dinner table.




Project Deskotheque:

It is about using multiple displays in offices and meeting rooms.

http://studierstube.icg.tu-graz.ac.at/deskotheque/

They are using the Caleydo system, an information-visualization framework for biomedical data:



http://www.caleydo.org/

Dynamo


Acer Tempo M900

This mobile phone has:
- 3.8" display (the iPhone has 3.5"; the Toshiba TG01 has 4.1", the largest to date)
- Fingerprint sensor (multi-functional: protects the device, locks account information (PIM), works as an optical mouse cursor)
- Touch display
- 5-megapixel camera
- Slide-out keyboard
- HSDPA, WVGA screen (800×480, the highest resolution Windows Mobile can support), WiFi, GPS, Bluetooth
- Software

Unboxing:


Hardware:



Software:
The Acer shell is a 3D view that tries to emulate a physical desktop.

Wednesday, June 3, 2009


Tuesday, May 12, 2009

iPhone Live video

Two applications for streaming live video from the iPhone:
http://bambuser.com/
http://qik.com

Monday, February 16, 2009

iPhone Vibration

It seems that the iPhone SDK limits control over the vibration pattern, duration and intensity. There is just one way to trigger a vibration, which is always the same, using the following code (the AudioToolbox framework must be imported):

#import <AudioToolbox/AudioToolbox.h>

AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

More about this can be found at:


http://blogs.oreilly.com/iphone/2009/01/on-vibration.html

Tuesday, February 10, 2009

Mobile free form gesture

This work uses gestures to jump to specific menus.
I personally don't like it :( it is not intuitive, at least not in this context.


Pointscreen Interaction

This is a work in progress at the University of Erlangen, which uses gestures for drawing and interacting with images:

Monday, February 9, 2009

iPhone for Blind

Slide Rule uses multi-touch interaction techniques to make the iPhone accessible to blind users (Shaun K. Kane et al., University of Washington, DUB Group).

Saturday, January 31, 2009

shape Touch

A research demo from Microsoft Research about the future of multi-touch interaction.

Wednesday, January 21, 2009

virtual muscles for haptic feedback

In this video, virtual muscles are used to simulate real physical feedback, which a touch display then provides when clicking an on-screen button.

multitouch table

It is interesting that objects moved off the top of the display reappear from the bottom.
Also the gestures for playing a video ...

Wednesday, January 7, 2009

UiRemote: The Universal Infrared Remote for iPhone

A universal controller with an iPhone + hardware for infrared:



more about it here.