Motivation: Remote controllers now come with almost every domestic appliance, yet people are increasingly interested in richer, more intuitive interactions with objects using touch, voice, or gestures. As we move towards a world of "smart homes" and "smart offices", more and more objects are bound to become "smart" and capable of diverse bidirectional interaction. From an end-user perspective on smart-home research, easy interaction with faraway objects becomes extremely important, so we were keen to study "point and control" interaction in a smart home. Most existing technologies, however, share a few common drawbacks: IR-enabled devices, for example, suffer from limited range and directionality, and most existing solutions only consider objects in the user's direct vicinity. We therefore set out to find a solution that lets us control objects both inside our own room and in an adjacent room, simply by pointing at them.
The idea was to develop a hand-held controller that could serve several purposes at once.
My role: My role varied as the project progressed. In the initial days I read as many research papers as possible and discussed with my supervisors to formulate the right research question and decide the path to take. Once that was done, we decided to use the Advanced Realtime Tracking (ART) system to judge the direction of pointing, so I started learning to use the motion-tracking system. Following an iterative prototype-design process, I conducted a few small experiments at first and then extended them to include more functionality. Together with Mr. Patrick Busch of the hardware lab, we built the final prototype. With help from my supervisors, I developed the different interfaces intended for the different objects to be placed in the room. Later, when we started using a camera, a wireless lamp, and a laptop simulating a TV, my colleague Mr. Robin Despouys helped me set up the networking between the different objects, enabling information exchange with the main computer.
Working of the prototype: The prototype of the handheld controller had an Arduino Mega board, an XBee module, a slider potentiometer, and a few buttons. Inside it was an off-the-shelf Philips pocket projector, and on top of it we fixed a tree target: a collection of small balls coated with retro-reflective material, kept in a fixed arrangement at known distances so that the motion-tracking system can track both their position and their orientation. Using this controller, the system could recognize the object at which we were pointing. After pointing at an object and pressing the "select" button, the system displayed an interface suitable for the selected object. This interface was passed from the main computer to the projector over WiFi, and the projector projected it onto the selected object. The buttons were connected to the Arduino, and communication between the Arduino and the main computer was achieved using the XBee module. Further interaction with the object was possible using the buttons and dragging gestures. The slider potentiometer came into use for the other part of the project, which involved controlling objects kept in an adjacent room.
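The core of the selection step is a pointing test: given the controller's tracked pose, decide which known object the user is aiming at. The following is a minimal sketch in Python, assuming the tracker reports the controller's position and forward direction in room coordinates and that object positions are known in the same frame; the object names, coordinates, and the 10-degree threshold are illustrative assumptions, not details from the actual system.

```python
import math

# Hypothetical object positions in room coordinates (metres); in the
# real prototype the controller's pose came from the ART tracker.
OBJECTS = {
    "lamp":   (2.0, 1.0, 0.5),
    "camera": (0.0, 2.5, 1.8),
    "tv":     (-1.5, 2.0, 1.0),
}

def pointed_object(controller_pos, pointing_dir, max_angle_deg=10.0):
    """Return the name of the object closest to the pointing ray, or None.

    controller_pos: (x, y, z) of the tree target on the controller.
    pointing_dir:   the controller's forward axis as a direction vector.
    """
    norm = math.sqrt(sum(c * c for c in pointing_dir))
    d = tuple(c / norm for c in pointing_dir)
    best, best_angle = None, max_angle_deg
    for name, pos in OBJECTS.items():
        to_obj = tuple(p - c for p, c in zip(pos, controller_pos))
        dist = math.sqrt(sum(c * c for c in to_obj))
        if dist == 0.0:
            continue
        # Angle between the pointing ray and the direction to the object.
        cos_a = sum(a * b for a, b in zip(d, to_obj)) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

In the prototype this kind of decision ran on the main computer, which received poses from the tracking system; the small angular threshold avoids selecting an object the user is merely pointing near.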
The SuperMan Vision
The aim was to exploit the fact that, in familiar surroundings such as a home or an office, users are very likely to remember the spatial locations of objects in adjacent rooms. They could therefore still point at such an object behind a wall, even though it was not visible. To enable interaction with objects hidden from view, we placed a camera in an adjacent room. The feed from this camera was passed to the projector, which projected it onto the wall, giving the effect of looking through the wall. The camera was capable of panning, and its direction could be controlled based on our pointing direction. The slider potentiometer fixed on the hand-held controller drove a small graphical animation that created the illusion of passing through the bricks of the wall to view the other room. For example, when the slider was at its minimum position, we could interact with the objects inside the current room. Upon moving the slider up, the projected view changed: it first showed the wall texture, then the texture ruptured to reveal a small hole in the wall. On further increasing the slider value, the hole widened to reveal the part of the room directly behind it. Using this virtual hole in the wall, we could locate objects behind the wall and interact with them in the same way as with those inside the room.
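The slider-driven animation can be thought of as a mapping from the slider reading to a view stage. Below is a small sketch of that mapping in Python; the stage names and slider breakpoints are my own illustrative assumptions, not the original calibration.

```python
def wall_view_stage(slider, hole_start=0.2, full_open=0.8):
    """Map a normalised slider reading (0.0-1.0) to a projected view stage.

    Below hole_start the user interacts with the current room; between
    hole_start and full_open a hole in the projected wall texture grows
    linearly; above full_open the adjacent room's camera feed fills the
    view. Returns (stage_name, hole_opening_fraction).
    """
    if slider < hole_start:
        return ("current_room", 0.0)
    if slider < full_open:
        # Fraction of the maximum hole radius, growing linearly.
        opening = (slider - hole_start) / (full_open - hole_start)
        return ("opening_wall", opening)
    return ("adjacent_room", 1.0)
```

At each frame the main computer would read the potentiometer value (relayed over XBee), evaluate a mapping like this, and render the corresponding wall texture, rupture animation, or camera feed.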
Example interaction with a lamp
Example applications: Apart from controlling lamps, other applications developed include setting the temperature of a heater and changing channels on a television. For a television, the area around the screen can be used to display channel logos, from which we can select a new channel. This way, choosing another channel does not disturb the view of the current one.