Rationale:
Hearing loss affects millions of people, but the brain does not respond to it in a uniform way. In some individuals the auditory pathways continue to function relatively well, while others compensate by recruiting different parts of the brain. These differences in how the brain adapts help explain why people with similar levels of hearing difficulty can experience very different outcomes, ranging from communication problems and social withdrawal to cognitive decline. Most brain imaging studies of hearing loss have been small or have grouped people into broad categories (“normal” versus “impaired”), making it hard to see differences between individuals. Even in large datasets such as UK Biobank, the hearing test (the Digit Triplet Test) has rarely been combined with brain imaging and other health data to uncover the brain pathways and behavioural factors that explain why hearing loss affects people so differently.
Research question: Can patterns in brain structure and connectivity predict how well people understand speech in noise? Do individuals with similar hearing test scores rely on different brain pathways to achieve this, and what lifestyle, cognitive, or health factors help explain these differences?
Aims and objectives:
1. To test whether supervised machine learning models (a form of artificial intelligence) can find patterns in brain structure and connectivity that predict hearing test performance (Digit Triplet Test scores).
2. To find out whether people who score the same on the Digit Triplet Test can differ in brain structure and function – for example, some relying on auditory pathways, others recruiting additional brain regions to compensate.
3. To describe these groups in terms of cognitive, lifestyle, and health measures, in order to identify protective and risk factors that shape how hearing loss impacts daily life.
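The analysis pipeline implied by Aims 1 and 2 can be sketched in code. This is a minimal illustration, not the study's actual method: the feature names, model choice (random forest regression), clustering step (k-means on a narrow score band), and all data here are assumptions, with synthetic arrays standing in for UK Biobank imaging measures and Digit Triplet Test (DTT) scores.

```python
# Sketch of the proposed analysis, under stated assumptions:
# synthetic data stand in for brain imaging features and DTT scores.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 500, 20                       # participants x brain features (placeholder sizes)
brain = rng.normal(size=(n, p))      # placeholder structure/connectivity measures
# Placeholder outcome: DTT score driven by a few features plus noise.
dtt = brain[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=n)

# Aim 1: supervised model predicting DTT scores from brain features,
# evaluated with cross-validation rather than in-sample fit.
model = RandomForestRegressor(n_estimators=200, random_state=0)
r2 = cross_val_score(model, brain, dtt, cv=5, scoring="r2").mean()

# Aim 2: restrict to participants with similar DTT scores, then cluster
# their brain profiles to look for distinct neural strategies.
similar = (dtt > np.quantile(dtt, 0.4)) & (dtt < np.quantile(dtt, 0.6))
profiles = StandardScaler().fit_transform(brain[similar])
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)

print(f"cross-validated R^2: {r2:.2f}; subgroup sizes: {np.bincount(groups)}")
```

Aim 3 would then compare the resulting subgroups on cognitive, lifestyle, and health variables (e.g. with standard group-difference tests), which is omitted here for brevity.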