SwiftKey launches communication app for people who are non-verbal
Americas, December 14 2015
SwiftKey, the predictive smartphone keyboard company, wants to help people who are non-verbal communicate with others. The company launched a new “assistive” symbol-based communication app called ‘SwiftKey Symbols’, which it says can be used to build sentences using pictures.
SwiftKey staff who have family members with autism came up with the idea of developing an assistive app.
The app, which is free and available on Android, makes use of SwiftKey’s core contextual language prediction technology to suggest symbols that might be used to finish a sentence.
The team visited Riverside School in southwest England earlier this year, where the pupils have a range of learning disabilities and many are on the autistic spectrum.
Many of the communication tools currently on the market are too slow for a child to select a particular image. SwiftKey's core prediction and personalization technology, which learns from each individual as they use it, is a natural fit for people with autism, who often respond particularly well to routine-based activity. Although other apps make it easy to define favorites, only SwiftKey Symbols attempts to simplify finding the right symbols through machine-learned prediction. Offering the technology free is also a huge benefit to a community for whom assistive tools can be costly and inaccessible.
Users of SwiftKey Symbols can build a sentence by choosing images, hand-drawn by a SwiftKey team member, from a set of categories or from a prediction slider powered by the SwiftKey SDK. Employing SwiftKey’s input and predictive technology, the app’s tech complements routine-based activity and learns from each individual’s behavior to surface images relevant to them quickly.
One key feature of SwiftKey Symbols is that it factors in the time of day and day of the week, so symbol predictions are as accurate and personalized as possible. For example, if the child has music class on Tuesdays at 11:00am, and has previously selected symbols during that time, these will appear as predicted symbols in the sentence strip. SwiftKey Symbols can also be deeply customized to be even more useful; users may add their own images and categories from their device and use audio playback, a text-to-speech feature that can read out the sentence that is formed for a child who has verbal impairments.
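SwiftKey has not published the details of how Symbols ranks its predictions, but the time-of-day behavior described above can be illustrated with a toy model: count how often each symbol is chosen in a given (weekday, hour) slot, then surface the most frequently chosen symbols for the current slot first. All names here are hypothetical, not SwiftKey's actual API.

```python
from collections import defaultdict
from datetime import datetime

class SymbolPredictor:
    """Toy sketch of time-aware symbol prediction (not SwiftKey's real model)."""

    def __init__(self):
        # Maps (weekday, hour) -> {symbol: selection count}
        self.counts = defaultdict(lambda: defaultdict(int))

    def record_selection(self, symbol, when):
        """Log that `symbol` was chosen at datetime `when`."""
        self.counts[(when.weekday(), when.hour)][symbol] += 1

    def predict(self, when, top_n=3):
        """Return the most-selected symbols for this weekday/hour slot."""
        slot = self.counts[(when.weekday(), when.hour)]
        return sorted(slot, key=slot.get, reverse=True)[:top_n]

# Example matching the article: music class on Tuesdays at 11:00am.
p = SymbolPredictor()
tuesday_11 = datetime(2015, 12, 8, 11, 0)  # 8 Dec 2015 was a Tuesday
for _ in range(3):
    p.record_selection("music", tuesday_11)
p.record_selection("lunch", tuesday_11)
print(p.predict(tuesday_11))  # 'music' ranks first in this slot
```

A production system would of course smooth across neighboring time slots and blend in overall frequency, but the core idea, keying usage history on time context, is the same.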
The team worked closely with members of staff at Riverside School including Charlotte Parkhouse, Speech & Language Therapist. She says,
“The communication opportunities that this app will provide are amazing. The flexible use of symbols will allow pupils with severe communication difficulties to express themselves in meaningful ways and the predictive symbol function means that it can be truly personalised. Brilliant!”
SwiftKey says it wants this technology to help people with communication disabilities as much as possible. SwiftKey Symbols follows the company's ongoing work with Professor Stephen Hawking and Israeli start-up Click2Speak, both of which use SwiftKey's core technology to enable people with mobility issues to communicate more easily.