Objective: Sleep stages can provide valuable insights into an individual's sleep quality. By leveraging movement and heart rate data collected by modern smartwatches, it is possible to enable a sleep staging feature and enhance users' understanding of their sleep and health conditions.
Method: In this paper, we present and validate a recurrent neural network-based model with 23 input features extracted from accelerometer and photoplethysmography sensor data for both healthy and sleep apnea populations. We designed a lightweight and fast solution that predicts a sleep stage for each 30-s epoch. This solution was developed using a large dataset of 1522 night recordings collected from a highly heterogeneous population across different Samsung smartwatch models.
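The abstract does not detail the network architecture, so the following is only a minimal illustrative sketch of a per-epoch recurrent classifier in PyTorch; the GRU layer, hidden size, and class names are assumptions, while the 23 features per 30-s epoch and the four output stages come from the abstract.

```python
# Illustrative sketch, NOT the authors' implementation: a lightweight GRU
# that maps per-epoch feature vectors to sleep stage logits.
import torch
import torch.nn as nn

NUM_FEATURES = 23   # accelerometer + PPG features per 30-s epoch (from the abstract)
NUM_STAGES = 4      # wake, light, deep, REM (from the abstract)

class SleepStageRNN(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        # A single GRU layer keeps the model small and fast, in the spirit
        # of an on-watch solution; the size is a hypothetical choice.
        self.gru = nn.GRU(NUM_FEATURES, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, NUM_STAGES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, epochs_per_night, NUM_FEATURES)
        out, _ = self.gru(x)
        # One prediction per 30-s epoch: (batch, epochs_per_night, NUM_STAGES)
        return self.head(out)

# Example: an 8-hour night contains 960 epochs of 30 s each.
model = SleepStageRNN()
night = torch.randn(1, 960, NUM_FEATURES)
logits = model(night)            # shape (1, 960, 4)
stages = logits.argmax(dim=-1)   # predicted stage index per epoch
```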
Results: In the classification of four sleep stages (wake, light, deep, and rapid eye movement sleep), the proposed solution achieved a balanced accuracy of 71.6% and a Cohen's kappa of 0.56 on a test set of 586 recordings.
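As a reference for how these two metrics are typically computed, the sketch below uses scikit-learn on hypothetical per-epoch labels (0=wake, 1=light, 2=deep, 3=REM); it is not derived from the paper's data.

```python
# Sketch of the reported metrics on toy labels; y_true/y_pred are made up.
from sklearn.metrics import balanced_accuracy_score, cohen_kappa_score

y_true = [0, 1, 1, 2, 3, 1, 0, 2]   # reference hypnogram (e.g., PSG scoring)
y_pred = [0, 1, 2, 2, 3, 1, 1, 2]   # model output for the same 30-s epochs

# Balanced accuracy averages per-class recall, which matters because
# hypnograms are imbalanced (light sleep usually dominates).
print("balanced accuracy:", balanced_accuracy_score(y_true, y_pred))
# Cohen's kappa measures chance-corrected agreement with the reference scorer.
print("Cohen's kappa:", cohen_kappa_score(y_true, y_pred))
```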
Conclusion: The results presented in this paper validate our proposal as a competitive wearable solution for sleep staging. Additionally, the use of a large and diverse dataset contributes to the robustness of our solution and supports the validation of the algorithm's performance. Additional analyses performed for the healthy and sleep apnea populations demonstrated that the algorithm's performance has low correlation with demographic variables.
Keywords: Machine learning; Sleep; Sleep stages; Smartwatch; Wearable.