Projects

ChordCubes (Master's Thesis)

Florian Sprengel submitted his Master’s Thesis “ChordCubes”, a music application incorporating the Tangible Active Objects (TAOs).

Abstract: “For a musician to grow and become more proficient as an instrumentalist it takes a fair amount of practice. But to also develop musicality requires more than the numb and repetitive exercising of scales. In order to make musical practice both efficient and entertaining, we propose a new musical practice tool that is based on a Tangible User Interface (TUI).”

[Read More]
Active Home EntertAinment Desk (Student Project)

The Active Home EntertAinment Desk (AHEAD) is a universal multi-user tabletop media navigation and control system for the living room. It consists of actuated (robotic) and passive tangible objects used on a large interactive surface (Samsung SUR40). With AHEAD, users can browse, navigate, and control media content and devices that implement the Universal Plug and Play (UPnP) standard, such as modern televisions and other media devices. Our system is thereby compatible with most current television sets and home entertainment systems and acts as a universal control for several multimedia devices.
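UPnP devices such as the ones AHEAD controls announce themselves via SSDP (Simple Service Discovery Protocol). As a minimal sketch of that discovery step, the following builds the standard M-SEARCH multicast request and collects the `LOCATION` URLs of responding devices (function names here are illustrative, not AHEAD's actual code):

```python
import socket

SSDP_ADDR = ("239.255.255.250", 1900)  # standard SSDP multicast address/port

def build_msearch(search_target="ssdp:all", mx=2):
    """Build an SSDP M-SEARCH request used for UPnP device discovery."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}\r\n"
        'MAN: "ssdp:discover"\r\n'
        f"MX: {mx}\r\n"
        f"ST: {search_target}\r\n"
        "\r\n"
    )

def discover(timeout=3.0):
    """Multicast the M-SEARCH request and collect responders' LOCATION URLs."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch().encode("ascii"), SSDP_ADDR)
    locations = []
    try:
        while True:
            data, _ = sock.recvfrom(65507)
            for line in data.decode(errors="replace").splitlines():
                if line.lower().startswith("location:"):
                    locations.append(line.split(":", 1)[1].strip())
    except socket.timeout:
        pass
    finally:
        sock.close()
    return locations
```

Each returned URL points to a device description document, from which a controller like AHEAD can learn the device's services and control endpoints.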

[Read More]
Integrated Multi-modal actuated TUI (Feasibility and Design Study)

In this paper we showcase an integrative approach that extends our actuated Tangible Active Objects (TAOs) into a versatile and comprehensive dynamic user interface with multi-modal feedback and support for distributed collaboration. We incorporated physical actuation, visual projection in 2D and 3D, and vibro-tactile feedback. We demonstrate this approach in a furniture-placement scenario where users can interactively change the furniture model represented by each TAO using a dial-based tangible actuated menu, and we demonstrate virtual constraints between TAOs that automatically maintain spatial relations.
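The idea of a virtual constraint that maintains spatial relations can be sketched very simply: when one object is moved, the actuated objects bound to it receive new target positions that preserve their relative offsets. This is a minimal illustration under the assumption of rigid 2D offset constraints (names and data layout are hypothetical, not the paper's implementation):

```python
# Sketch: rigid offset constraints between tabletop objects.
# When the anchor object is moved, each constrained object gets a new
# target position that preserves its offset to the anchor.

def constrained_targets(anchor_old, anchor_new, followers):
    """Compute target positions so each follower keeps its offset to the anchor.

    anchor_old, anchor_new: (x, y) positions of the moved object.
    followers: dict mapping object id -> current (x, y) position.
    Returns a dict mapping object id -> target (x, y).
    """
    dx = anchor_new[0] - anchor_old[0]
    dy = anchor_new[1] - anchor_old[1]
    return {oid: (x + dx, y + dy) for oid, (x, y) in followers.items()}
```

In an actuated TUI, the returned targets would be sent to the motion controllers of the respective objects, which then drive themselves into place.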

[Read More]
Embodied Social Networking (User Study)

In this paper we present a novel approach for Tangible User Interfaces (TUIs) that incorporates small mobile platforms to actuate Tangible User Interface Objects (TUIOs). We propose an application of Tangible Active Objects (TAOs) in combination with gestural interaction for social networking: TUIOs represent messages, while gestural input with these objects triggers actions on those messages. We conducted a case study, present its results, and demonstrate interaction with a working social networking client.

[Read More]
Saving and Restoring Mechanisms (Design Study)

In this paper we present a proof of concept for saving and restoring mechanisms for Tangible User Interfaces (TUIs). We describe our actuated Tangible Active Objects (TAOs) and explain the design, which gives all users equal access to a dial-based, fully tangible actuated menu metaphor. We present a new application that extends an existing TUI for interactive sonification of process data with saving and restoring mechanisms, and we outline a further application proposal for family therapists.

[Read More]
TAOgotchis (Student Project)

Abstract from the Students’ report:

“This project is about extending the Tangible Active Objects (TAOs) with an active, self-contained marker to become independent from constraints like ambient light and shadows. By using infrared LEDs, the active marker can be tracked directly via an infrared camera, which makes it possible to project backgrounds onto the plane on which the TAOs are located without influencing the tracking performance. After design and construction of the marker, the existing tracker software was extended and tested. In parallel to the hardware part, a scenario was programmed for usage in an intelligent room. The scenario contains an agent-based system for simulating Tamagotchis with embodiment.”
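Tracking an active IR marker reduces, in its simplest form, to finding the bright spot in the infrared camera image. The following is a minimal sketch of that idea (brightness-threshold centroid on a grayscale frame); the project's actual tracker is certainly more involved, and the threshold and frame layout here are illustrative:

```python
# Sketch: locate an active infrared marker as the centroid of bright
# pixels in a grayscale IR camera frame.

def find_marker(frame, threshold=200):
    """Return the (x, y) centroid of pixels at or above `threshold`, or None.

    frame: 2D list of brightness values (rows of ints, 0-255).
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no marker visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Because projected (visible-light) backgrounds contribute little energy in the infrared band, they stay below the threshold and do not disturb this kind of tracking, which is exactly the property the active marker exploits.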

[Read More]
Interactive Auditory Scatter Plot (User Study)

In this paper we present an approach that enables visually impaired people to explore multivariate data through scatter plots. Our approach combines Tangible Active Objects (TAOs) and Interactive Sonification into a non-visual, multi-modal data exploration interface, thereby translating the visual experience of scatter plots into the audio-haptic domain. We explain our system and the developed sonification techniques and present a first user study.
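A common building block of such sonifications is parameter mapping, e.g. mapping one data dimension to pitch. The sketch below shows one standard way to do this (not necessarily the paper's exact technique): interpolate in MIDI note space, which is perceptually even, before converting to frequency.

```python
def value_to_frequency(value, vmin, vmax, midi_lo=48, midi_hi=84):
    """Linearly map a data value onto a MIDI pitch range, then to Hz.

    Pitch perception is roughly logarithmic in frequency, so interpolating
    in MIDI note numbers (equal steps in pitch) before converting to Hz
    yields a perceptually even scale across the data range.
    """
    t = (value - vmin) / (vmax - vmin)          # normalise to [0, 1]
    midi = midi_lo + t * (midi_hi - midi_lo)    # interpolate in pitch space
    return 440.0 * 2.0 ** ((midi - 69) / 12.0)  # MIDI note -> frequency (Hz)
```

With the defaults, the data range spans three octaves (MIDI 48 to 84), so equal steps in the data sound as equal musical intervals.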

[Read More]
Interactive Mobile Seat (Student Project)

The task was to create a self-driving seat that can detect people in a room and offer itself for seating. A camera installed under the ceiling performs visual search and marker tracking. Tangible objects placed on a table that represents the room are monitored via a webcam; each object corresponds to an actual mobile seat in the room. The seat is controlled via a wireless connection using coordinates calculated from the webcam images.
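The core of this pipeline is mapping a tangible object's pixel position on the table to a target position in the room. As a minimal sketch, assuming an axis-aligned camera view and a simple per-axis calibration (a real setup would estimate a full homography from at least four point correspondences):

```python
def make_table_to_room(table_px, room_m):
    """Build an affine map from table pixel coordinates to room coordinates.

    table_px: ((x0, y0), (x1, y1)) opposite pixel corners of the table region.
    room_m:   ((X0, Y0), (X1, Y1)) matching room corners in metres.
    Assumes the webcam view is axis-aligned with the room.
    """
    (px0, py0), (px1, py1) = table_px
    (rx0, ry0), (rx1, ry1) = room_m
    sx = (rx1 - rx0) / (px1 - px0)  # metres per pixel, x axis
    sy = (ry1 - ry0) / (py1 - py0)  # metres per pixel, y axis

    def to_room(px, py):
        return (rx0 + (px - px0) * sx, ry0 + (py - py0) * sy)

    return to_room
```

The resulting room coordinates would then be sent over the wireless link as the seat's driving target, while the ceiling camera tracks the seat's actual pose for closed-loop control.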

[Read More]


Support

This work was supported by the
Cluster of Excellence Cognitive Interaction Technology (CITEC)
https://www.cit-ec.de/