Master's Thesis (I)

Topic: Development of interfaces for the use of communication and entertainment applications by people with severe motor and verbal disabilities.

Responsible: Nicolas Simon

The aim of this project is to develop a user interface for people who cannot move any part of their body below the neck and who are, in addition, severely limited in their verbal communication. For these people, using everyday applications such as writing an e-mail, browsing the web, watching videos, and listening to music becomes nearly impossible without supplementary tools. Since a keyboard and a mouse cannot be used, other approaches must be considered. Every interaction can be performed with the sweep-and-click method introduced by Lopes [1]: GUI elements like buttons, lists, etc. are visually highlighted one after another (sweeping), and when the desired GUI element is highlighted, the user performs a selection (click). The sweeping is done automatically by the program, so the important question is how a user who cannot move any body part below the neck can execute this selection.
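To illustrate the sweep-and-click method, here is a minimal sketch of such a scanning loop in Python. The element names, the dwell time, and the event queue are assumptions chosen for illustration; the post does not describe the actual implementation.

    import itertools
    import queue
    import threading

    # Hypothetical element names; the real GUI elements are buttons, lists, etc.
    ELEMENTS = ["alarm", "food", "video", "music", "browser", "email"]

    events = queue.Queue()  # filled by an event generator (EEG, EMG, ...)

    def sweep(dwell_seconds=1.5):
        """Cycle the highlight over the GUI elements (sweeping) until a
        'selection' event arrives while the desired element is highlighted."""
        for element in itertools.cycle(ELEMENTS):
            print("highlighting:", element)
            try:
                event = events.get(timeout=dwell_seconds)
            except queue.Empty:
                continue  # no input: move the highlight to the next element
            if event == "selection":
                return element  # the user "clicked" the highlighted element

    # Example: simulate a user who triggers a selection after 4 seconds,
    # i.e. while the third element is highlighted.
    threading.Timer(4.0, lambda: events.put("selection")).start()
    print("selected:", sweep())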
For this selection problem, many different solutions have been proposed, falling into the following categories:

  • Using vision-based approaches, such as a camera with object-recognition algorithms,
  • Using BCI approaches that analyze EEG signals,
  • Using the remaining muscle activity with EMG, or
  • Using other creative ideas like the IntegraMouse [2], where mouse clicks are simulated by sucking and blowing.
The problem with these applications is that they consider only one of these categories, or a small subset of them: the user can, for example, navigate and control applications only by EEG, or only by EMG.
In this approach, an event layer is added between the hardware (e.g. an EEG neuroheadset) and the software (e.g. a multimedia and communication center). The software only receives events, such as a selection event or an event to focus the next GUI element. The hardware, together with its signal-processing application, provides these events: for example, a selection event each time the user imagines an arm movement.
Figure 1 shows the whole concept.

[Figure 1: Overall concept of the event-based architecture]

The GUI application is the multimedia and communication center, and the event-generator application does the signal processing and sends the events. The configuration client updates the configuration used by the GUI application and handles settings like font size, account information for the e-mail application, etc.
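To make the event layer concrete, the following is a minimal sketch of the event-generator side. The post does not describe the actual transport or message format, so the UDP socket, the port number, and the plain-text event names are assumptions for illustration.

    import socket

    # Assumed transport: plain-text event names over UDP to a hypothetical
    # local port; the real protocol between the event-generator application
    # and the GUI application is not specified in the post.
    GUI_ADDRESS = ("127.0.0.1", 5005)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_event(name: str) -> None:
        """Emit one event to the GUI application."""
        sock.sendto(name.encode("utf-8"), GUI_ADDRESS)

    # e.g. the signal processing has detected an imagined arm movement:
    send_event("selection")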

Figure 2 shows the main user interface. The following entertainment and communication applications are implemented:

  • An alarm function to call for help.
  • A food application to select breakfast, lunch, and supper.
  • A video player to watch locally stored movies.
  • A communication app to show configured sentences in a dialog box.
  • A music player to play locally stored music and listen to configured online radio stations.
  • A web browser for surfing the internet.
  • An e-mail client to receive and send simple text e-mails.
  • A gaming application; currently, only Windows Pinball is playable.

[Figure 2: Start screen of the GUI application]

The whole interface is controlled by events; the following events are supported (a dispatch sketch follows the list):

  • The “selection” event to select a GUI element like a button or a list.
  • The “next” event to highlight the next GUI element.
  • The “previous” event to highlight the previous GUI element.
  • The “back” event to quit dialogs.
  • The “up” and “down” events for games and for scrolling in lists and the browser.
  • The “alarm” event to immediately trigger the alarm function.
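On the GUI side, each received event name can be mapped to one UI action. This is only a sketch: the handler bodies below are placeholders standing in for the real multimedia-center code.

    # Hypothetical mapping from received event names to UI actions.
    HANDLERS = {
        "selection": lambda: print("activate the highlighted GUI element"),
        "next":      lambda: print("highlight the next GUI element"),
        "previous":  lambda: print("highlight the previous GUI element"),
        "back":      lambda: print("quit the current dialog"),
        "up":        lambda: print("scroll up / game input"),
        "down":      lambda: print("scroll down / game input"),
        "alarm":     lambda: print("trigger the alarm function"),
    }

    def dispatch(event_name: str) -> None:
        handler = HANDLERS.get(event_name)
        if handler is not None:
            handler()  # unknown event names are simply ignored

    dispatch("selection")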

One big advantage of the event-based approach is that not just one but many event-generator applications can be connected, even simultaneously. For example, it becomes possible to have one device (e.g. an EMG) just for triggering the “alarm” event and a second device (e.g. an EEG) for the other events, like the “selection” event.
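Continuing the UDP assumption from the sketch above: because UDP is connectionless, a single receiving socket in the GUI application accepts datagrams from any number of event-generator applications at once, so connecting several devices simultaneously needs no extra multiplexing logic.

    import socket

    # Same hypothetical port as in the sender sketch above.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", 5005))

    while True:  # runs until the process is stopped
        data, sender = sock.recvfrom(64)
        event_name = data.decode("utf-8").strip()
        print("event", repr(event_name), "from generator at", sender)
        # hand the event to the dispatcher sketched above:
        # dispatch(event_name)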

The application was first tested with able-bodied people to remove bugs and correct wrong behavior. A next step will be testing with disabled people, e.g. people with locked-in syndrome, to evaluate the real-world use of this application.

References:

  1. João Brisson Lopes, “Designing User Interfaces for Severely Handicapped Persons”, 2001, http://dl.acm.org/citation.cfm?id=564553
  2. “IntegraMouse”, http://www.rehavista.de/?at=Produkte&p=40870