Source: www.inavateonthenet.net
It aims to streamline the flow of this information by detecting where the user has their eyes – and attention – focused. The company describes this as a means of overcoming the ‘last meter bandwidth’ problem: the bottleneck between human and machine that occurs when the benefits of technology are slowed by the cognitive limitations of the human user.
The user interface combines cognitive modeling, proximity detection, gaze tracking and task automation. It aims to present the user with the tools they need while fully or partly automating repetitive tasks.
For instance, if the system detects the user’s gaze on a notification window, the duration of that gaze triggers the relevant controls to appear beneath the user’s hands.
If the notification is not visible to the user, bRIGHT brings it into view. If the user does not show interest in the notification, as judged by the duration of their gaze, the system registers disinterest and hides the window without further action.
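The gaze-driven behaviour described above can be summed up in a short sketch. Everything here – the notification, gaze, hand and clock objects, their method names, and the dwell thresholds – is an assumption for illustration only; bRIGHT’s actual implementation is not public.

```python
# Minimal sketch of the dwell-time logic, assuming hypothetical objects:
# `notification` (is_visible, bring_into_view, show_controls_at, hide, bounds),
# `gaze` (dwell_time_on), `hands` (positions) and `clock` (now).

INTEREST_DWELL_S = 0.8   # assumed dwell time that signals interest
DISMISS_AFTER_S = 3.0    # assumed window after which disinterest is inferred


def handle_notification(notification, gaze, hands, clock):
    """React to a notification based on where, and for how long, the user looks."""
    if not notification.is_visible():
        notification.bring_into_view()          # surface it if off-screen

    deadline = clock.now() + DISMISS_AFTER_S
    while clock.now() < deadline:
        if gaze.dwell_time_on(notification.bounds()) >= INTEREST_DWELL_S:
            # Sustained gaze: place the relevant controls beneath the hands.
            notification.show_controls_at(hands.positions())
            return "engaged"

    # No sustained gaze: register disinterest and hide without further action.
    notification.hide()
    return "dismissed"
```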
Task automation is also possible using CALO (Cognitive Assistant that Learns and Organises) software, versions of which have appeared in several Apple products. bRIGHT can turn a request in an incoming email into a notification that the user can accept or reject – updating the calendar if needed.
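As a rough illustration of that email-to-calendar flow, the sketch below uses an assumed request pattern, a notify callback and a calendar object; it is not the CALO-derived code itself, which is not publicly documented.

```python
import re
from datetime import datetime

# Assumed pattern: requests of the form "meeting on 2024-05-03 at 14:30".
MEETING = re.compile(r"meet(?:ing)? on (\d{4}-\d{2}-\d{2}) at (\d{2}:\d{2})", re.I)


def on_incoming_email(body, notify, calendar):
    """Turn a request found in an email into an accept/reject notification."""
    match = MEETING.search(body)
    if match is None:
        return                                   # nothing actionable in this email
    when = datetime.fromisoformat(" ".join(match.groups()))
    if notify(f"Add meeting on {when:%Y-%m-%d %H:%M} to your calendar?"):
        calendar.add_event("Meeting", when)      # update the calendar if accepted
```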
In the future, bRIGHT will boast a control interface consisting of a polymorphic layer over a traditional plasma or LCD display. The current opaque prototype of this dynamic control surface has 60 taxels, or keys, that can be raised or lowered and latched into position. The surface acts as a keyboard when all keys are raised and as a touchpad when all keys are lowered.
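A toy model of that surface might track the latched position of each of the 60 taxels and derive the current mode from them. The class and mode names below are assumptions for illustration, not part of the prototype’s software.

```python
from enum import Enum


class SurfaceMode(Enum):
    KEYBOARD = "keyboard"   # all taxels raised
    TOUCHPAD = "touchpad"   # all taxels lowered
    MIXED = "mixed"         # partially raised, e.g. a few dedicated buttons


class TaxelSurface:
    """Toy model of a dynamic control surface with 60 latching taxels."""

    NUM_TAXELS = 60

    def __init__(self):
        self.raised = [False] * self.NUM_TAXELS   # latched position of each key

    def latch(self, index, raised=True):
        self.raised[index] = raised               # raise or lower one taxel

    def mode(self):
        if all(self.raised):
            return SurfaceMode.KEYBOARD
        if not any(self.raised):
            return SurfaceMode.TOUCHPAD
        return SurfaceMode.MIXED
```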
The framework has been implemented across a range of platforms, from tablets to large multitouch displays. The prototype was built with a standard HD display, webcams, an IR gaze-tracking system and a workstation consisting of a table-height touch panel. Beneath the display, several IR sensors detect multiple touches, sense proximity and register gestures.
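The sensing described here could be approximated with a simple classifier over the IR readings. The thresholds and event names below are guesses for illustration only, and real gesture recognition would also track readings across successive frames.

```python
# Assumed readings: a list of (x, y, distance_mm) tuples from the IR array.
TOUCH_MM = 2        # assumed: effectively in contact with the surface
PROXIMITY_MM = 150  # assumed: a hand hovering near the surface


def classify_ir_frame(points):
    """Label one frame of IR readings as touch, multitouch, proximity or idle."""
    touches = [(x, y) for x, y, d in points if d <= TOUCH_MM]
    nearby = [(x, y) for x, y, d in points if TOUCH_MM < d <= PROXIMITY_MM]
    if len(touches) > 1:
        return ("multitouch", touches)
    if touches:
        return ("touch", touches)
    if nearby:
        return ("proximity", nearby)   # hands approaching, not yet touching
    return ("idle", [])
```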