New device translates breath into words

September 1, 2015

Scientists at Loughborough University have invented a device that translates breath patterns into words.

Existing Augmentative and Alternative Communication (AAC) devices rely on cues such as sniffing or blinking. The new prototype aims to help paralysed patients who have lost voluntary muscle control and therefore cannot blink or sniff.

Patients breathe into a mask connected to a computer. A specific breath pattern can then be programmed as a word. A speech synthesizer then translates the pre-programmed breath pattern and “reads it aloud.” The device learns with the patient, programming more breath patterns to increase the vocabulary.
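The article does not describe the matching algorithm itself, but the idea of pairing stored breath patterns with words can be sketched roughly in Python. The class name, the fixed-length resampling and the distance threshold below are illustrative assumptions, not details of the Loughborough prototype.

    # Illustrative sketch only: a minimal template-matching approach to mapping
    # breath-pressure signals to pre-programmed words. The real prototype's
    # algorithm is not described in the article; all names and the
    # distance-based matching here are assumptions.
    import numpy as np

    class BreathVocabulary:
        def __init__(self, n_samples=100):
            self.n_samples = n_samples   # common length used for comparison
            self.templates = {}          # word -> stored breath pattern

        def _normalise(self, signal):
            """Resample to a fixed length and scale to zero mean, unit variance."""
            signal = np.asarray(signal, dtype=float)
            resampled = np.interp(
                np.linspace(0, len(signal) - 1, self.n_samples),
                np.arange(len(signal)),
                signal,
            )
            return (resampled - resampled.mean()) / (resampled.std() + 1e-9)

        def teach(self, word, breath_signal):
            """Programme a new breath pattern as a word, growing the vocabulary."""
            self.templates[word] = self._normalise(breath_signal)

        def recognise(self, breath_signal, threshold=5.0):
            """Return the closest stored word, or None if nothing matches well."""
            pattern = self._normalise(breath_signal)
            best_word, best_dist = None, float("inf")
            for word, template in self.templates.items():
                dist = np.linalg.norm(pattern - template)
                if dist < best_dist:
                    best_word, best_dist = word, dist
            return best_word if best_dist < threshold else None

In a set-up like this, a recognised word would be handed to the speech synthesizer to be read aloud, and teach() would be called whenever the patient programmes a new pattern, so the vocabulary grows over time.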

The prototype was developed by Dr David Kerr, Senior Lecturer in the School of Mechanical and Manufacturing Engineering, Dr Kaddour Bouazza-Marouf, Reader in Mechatronics in Medicine, and Dr Atul Gaur, Consultant Anaesthetist at Glenfield Hospital.

“What we are proposing is a system that learns with the user to form an effective vocabulary that suits the person rather than the machine,” said Dr Kerr.

“When it comes to teaching our invention to recognise words and phrases, we have so far recorded a 97.5% success rate. Current AAC devices are slow and range from paper-based tools to expensive, sophisticated electronic devices. Our AAC device uses analogue signals in continuous form, which should give us a greater speed advantage because more information can be collected in a shorter space of time.”
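Dr Kerr's point about continuous analogue signals can be made concrete with a rough back-of-the-envelope comparison. The sampling rate and resolution below are hypothetical figures chosen purely to illustrate why a continuously sampled signal carries more raw information per second than an occasional discrete cue such as a blink.

    # Hypothetical figures, not taken from the article, used only to
    # illustrate the quoted claim about continuous analogue signals.
    SAMPLE_RATE_HZ = 100     # assumed sampling rate of the breath sensor
    BITS_PER_SAMPLE = 8      # assumed ADC resolution
    CUE_RATE_HZ = 1          # roughly one blink/sniff event per second

    analogue_bits_per_s = SAMPLE_RATE_HZ * BITS_PER_SAMPLE   # 800 bits/s of raw signal
    discrete_bits_per_s = CUE_RATE_HZ * 1                    # ~1 bit/s (event or no event)

    print(f"analogue: {analogue_bits_per_s} bit/s, discrete cue: {discrete_bits_per_s} bit/s")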

“This device could transform the way people with severe muscular weakness or other speech disorders communicate. In an intensive care setting, the technology has the potential to be used to make an early diagnosis of locked-in syndrome (LIS), by allowing patients, including those on ventilators, to communicate effectively for the first time by breathing – an almost effortless act which requires no speech, limb or facial movements,” added Dr Gaur.
