
 

Alistair Cruickshank explains how he has embraced technological changes to improve his day-to-day listening experiences as a hearing aid user. He describes the importance of experimenting with different approaches and how much he values working closely with his audiologist to make the hearing technology work for him.

 

We decided to interview Alistair because he has some interesting experiences using electronic resources to improve his listening experiences. Alistair is an avid music listener and performer (guitar, keyboard and singing in choir), who suddenly lost most of the hearing in both ears 12 years ago. He attended a ‘Hearing Futures’ event at the Victoria & Albert Museum, organised by Imperial College, and since then he has been in regular contact to discuss how digital technologies could be used to improve his music listening experience.

Together we tried the Musiclarity web-app (www.musiclarity.com), an interactive music listening service dedicated to improving the sound of music for hearing aid users by allowing them to customise the position and distance of the various instruments within the music track. We asked Alistair a few questions about his experience with hearing aids, music listening and other hearing technologies.

 

 

What hearing aid changes have you made over the years to improve your hearing? What did you do yourself, and how did your audiologist help you?

In the 12 years since I lost my hearing, there have been significant advances in hearing aid technology. I have had four different hearing aids in that time and each generation has been an improvement on the last. For each new hearing aid, my audiologist has been an integral part of this improvement, programming it to suit my hearing loss. I make sure to understand the new features of each hearing aid so I get the most from them. The addition of specialist programmes for music and for speech in noise has also helped greatly.

“I don’t think that finding better ways of listening to music has helped my speech understanding. I do feel, however, that the more time spent on both activities, the better my abilities become”

I have done a lot of experimentation myself, without the help of an audiologist, looking at different solutions to find what is best for me, such as noise-cancelling headphones, direct input audio, hearing loops, assistive hearing systems, and using the Roger system.

 


 

What sounds are improved? For example, you have made changes to improve music listening, but how do they affect your speech understanding?

From what I have read and personally experienced, the development of hearing aid technology has been focused on improving speech understanding. This is understandable as the biggest impact of hearing loss is to limit communication, which can be very damaging to a person’s wellbeing. Advancements in hearing aids and assistive technology have improved things dramatically, but the biggest problem of understanding speech in noise is still far from being solved.

I don’t think that finding better ways of listening to music has helped my speech understanding. I do feel, however, that the more time spent on both activities, the better my abilities become. Music and speech are very distinct concepts. Music, for me, is a more direct, personal, and abstract form. I don’t need to hear or understand every single part of a piece of music to be able to enjoy it. With speech, it is much more important to understand all parts, as one word not heard properly can mean a whole sentence or even a whole conversation is misunderstood.

 

Screenshot of the Musiclarity app, showing how it can be used to adjust the musical scene.

 

What was the single change that you believe made the most difference for improving music listening?

For personal music listening via hearing aids, in the way someone else would use headphones, I have found that streaming technology has made the biggest difference. I use a neck-worn device to stream high-quality stereo audio directly to my ears. The latest hearing aids can do this without a secondary device, which is an exciting advancement.

Live music has been a much bigger challenge, as the acoustic environment can be far more dynamic and unpredictable. I worked with Lorenzo Picinali to try out different systems for live music listening, including very high-quality microphones and headphones. The biggest breakthrough came with having direct access to the mixing desk audio signal. Again using streaming technology, I was able to receive the signal coming directly from the desk, without the room acoustics, audience background noise, and the bass-heavy sound of the venue speakers altering the audio reaching my ears.

“Finding the best audiologist was a key factor to help me live as well as possible with my hearing loss”

I feel I should point out that this is a much more sophisticated solution than the use of a hearing loop system, resulting in a far better listening experience. I have found that loop systems are great for speech understanding in isolation, but not so good for music listening.

How easy is the technology to use? Do you have recommendations for how it could be easier for other hearing device wearers to use?

Streaming technology is widespread now and pretty much all hearing aids can access it, either directly or via a secondary device. For live music, if venues were made aware that they could offer streaming, most hearing aid users would take advantage of it and benefit. It could be as simple as the mixing desk having several wireless (e.g. Bluetooth) connections (low latency would be a very important feature) that the hearing aid wearer could hook up to.

What do you think about giving people the opportunity to self-fit their own hearing aids?

I feel that the audiologist’s role in hearing loss treatment is enormously important. Finding the best audiologist was a key factor to help me live as well as possible with my hearing loss. For this reason, I’m wary of the idea of a fully self-fit solution. I do think that flexibility of hearing solutions is important, and this goes beyond just the hearing aids themselves and includes assistive technologies. I have several different programmes on my hearing aids in addition to the automatic settings, including speech in loud noise, speech in quiet, and music listening. Each of these has been modified separately to give the best results. The development of smartphone apps to give even more control over your hearing aids in different, complex environments is also a great step forward.

Do you have recommendations to help hearing professionals improve satisfaction with hearing devices?

I think a personalised approach is very important, tailoring the solution to each patient. Also, it is important to take enough time to listen to the needs of the hearing aid user, both pre-fit and post-fit, to ensure they are getting the most from their devices. Providing information on assistive devices and guidance on their use would also help many people gain the benefit of technology beyond the hearing aids themselves.

What made you think about trying out Musiclarity and, more generally, spatial audio to improve the way you listen to music?

I’m interested in all developments relating to the treatment of hearing loss. As a music lover, losing most of my hearing in 2009 was quite a shock and it was a case of knowing that I’d have to learn to listen to music again, pretty much from scratch. When I first began listening to music again, it was a very distorted mess. Now, after 12 years of re-learning, I can hear the varying timbres of different musical instruments, melody, rhythm, and dynamics.

The interesting thing for me with Musiclarity was that, although my hearing loss looks fairly symmetrical on my audiogram, I hear certain things better with my left ear than my right and vice versa. So, with Musiclarity I can put the drums, bass, and percussion on the left side, where my hearing is better suited to the lower frequencies. Guitars, strings, woodwind, etc. can be placed to the right, where my hearing is better with higher frequencies and harmonic content. Vocals can be put centre stage and closer to the listener if the lyrical content is important. I’m sure Musiclarity could help others with hearing loss too.
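For readers who want a concrete picture of the kind of adjustment Alistair describes, the short Python sketch below shows one possible way of placing instrument stems in a stereo scene, using constant-power panning and a simple distance-based gain. It is an illustration only, not Musiclarity’s actual implementation, and the stem names and settings are hypothetical.

# Illustrative sketch only: each instrument stem is given a left/right position and a
# distance, then everything is summed into a stereo mix.
import numpy as np

def place_stem(mono, pan, distance):
    # pan: -1.0 = hard left, +1.0 = hard right; distance >= 1.0, larger = quieter
    theta = (pan + 1.0) * np.pi / 4.0   # constant-power pan: map [-1, 1] to [0, pi/2]
    gain = 1.0 / max(distance, 1.0)     # simple distance attenuation
    left = mono * np.cos(theta) * gain
    right = mono * np.sin(theta) * gain
    return np.stack([left, right])      # shape: (2, samples)

def mix_scene(stems, placements):
    # stems: dict of name -> mono array; placements: dict of name -> (pan, distance)
    mix = None
    for name, mono in stems.items():
        placed = place_stem(mono, *placements[name])
        mix = placed if mix is None else mix + placed
    return mix

# Example scene echoing the layout described above: drums/bass left,
# guitar right, vocals centre and closest to the listener.
sr = 44100
t = np.linspace(0.0, 2.0, 2 * sr, endpoint=False)
stems = {
    "drums_bass": 0.3 * np.sign(np.sin(2 * np.pi * 2.0 * t)),  # placeholder signals
    "guitar":     0.3 * np.sin(2 * np.pi * 440.0 * t),
    "vocals":     0.3 * np.sin(2 * np.pi * 220.0 * t),
}
placements = {"drums_bass": (-0.8, 2.0), "guitar": (0.8, 2.0), "vocals": (0.0, 1.0)}
stereo = mix_scene(stems, placements)
print(stereo.shape)  # (2, 88200)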

 

CONTRIBUTOR
Alistair Cruickshank

BA Hons Dip Arch, Hives Architects, Reading, UK.

CONTRIBUTOR
Lorenzo Picinali

Imperial College London, UK. www.axdesign.co.uk

CONTRIBUTOR
Deborah Vickers

PhD, University of Cambridge, UK. Twitter: @SOUNDLabCam / @DebiVickers_ / @BEARS_CI
