Neural network analysis of sleep stages enables efficient diagnosis of narcolepsy

Jens B. Stephansen, Alexander Neergaard Olesen, Mads Olsen, Aditya Ambati, Eileen B. Leary, Hyatt E. Moore, Oscar Carrillo, Ling Lin, Fang Yan, Yun L. Sun, Yves Dauvilliers, Sabine Scholz, Lucie Barateau, Birgit Hogl, Ambra Stefani, Seung Chul Hong, Tae Won Kim, Fabio Pizza, Giuseppe Plazzi, Stefano Vandi, Elena Antelmi, Dimitri Perrin, Samuel T. Kuna, Paula K. Schweitzer, Clete Kushida, Paul E. Peppard, Helge Bjarup Dissing Sørensen, Poul Jennum, Emmanuel Mignot*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

Analysis of sleep for the diagnosis of sleep disorders such as type 1 narcolepsy (T1N) currently requires visual inspection of polysomnography records by trained scoring technicians. Here, we used neural networks in approximately 3,000 normal and abnormal sleep recordings to automate sleep stage scoring, producing a hypnodensity graph, a probability distribution conveying more information than classical hypnograms. Accuracy of sleep stage scoring was validated in 70 subjects assessed by six scorers. The best model performed better than any individual scorer (87% versus consensus). It also reliably scores sleep down to 5-s instead of 30-s scoring epochs. A T1N marker based on unusual sleep stage overlaps achieved a specificity of 96% and a sensitivity of 91%, validated in independent datasets. Addition of HLA-DQB1*06:02 typing increased specificity to 99%. Our method can reduce the time spent in sleep clinics and automates T1N diagnosis. It also opens the possibility of diagnosing T1N using home sleep studies.
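As an illustration of the hypnodensity idea described in the abstract, the minimal Python/NumPy sketch below converts hypothetical per-segment classifier logits into a probability distribution over the five sleep stages and derives a toy stage-overlap feature. The segment count, the softmax readout, and the overlap definition are assumptions made for illustration; this is not the authors' model or their T1N marker.

```python
# Illustrative sketch only (not the authors' implementation): a
# "hypnodensity" is a per-segment probability distribution over the five
# sleep stages (Wake, N1, N2, N3, REM) rather than a single hard label per
# 30-s epoch. Hypothetical classifier logits for 5-s segments are turned
# into probabilities with a softmax, and a toy "stage overlap" feature
# (simultaneous probability mass on two stages) stands in for the kind of
# mixed-stage signal the abstract associates with T1N.
import numpy as np

STAGES = ["Wake", "N1", "N2", "N3", "REM"]

def hypnodensity(logits: np.ndarray) -> np.ndarray:
    """Softmax over stage logits; shape (n_segments, 5) -> probabilities."""
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def stage_overlap(probs: np.ndarray, a: str, b: str) -> np.ndarray:
    """Per-segment product of two stage probabilities (illustrative only)."""
    return probs[:, STAGES.index(a)] * probs[:, STAGES.index(b)]

# Random logits stand in for a model's output on 5-s segments of one night
# (assumption: roughly 8 h of recording -> 5,760 segments).
rng = np.random.default_rng(0)
logits = rng.normal(size=(5760, len(STAGES)))
probs = hypnodensity(logits)

hypnogram = probs.argmax(axis=1)  # classical hard-labelled hypnogram
print("segments hard-labelled Wake:", int((hypnogram == 0).sum()))
print("mean REM/N2 overlap:", float(stage_overlap(probs, "REM", "N2").mean()))
```

Note that the classical hypnogram falls out of the hypnodensity as its per-segment argmax, which is why the probability representation carries strictly more information than hard-labelled staging.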
Original language: English
Article number: 5229
Journal: Nature Communications
Volume: 9
Number of pages: 15
ISSN: 2041-1723
DOI
Publication status: Published - 2018
