Publication: Fall Detection in Low-illumination Environments from Far-infrared Images Using Pose Detection and Dynamic Descriptors
Date
2024-03-18
Authors
Rodriguez, Victor
Albiol, Sergio
Medrano, Carlos
Martinez, Javier
Access rights
info:eu-repo/semantics/openAccess
Publisher
IEEE Xplore
Abstract
In an increasingly aging world, automating tasks associated with the care of elderly dependent individuals becomes ever more relevant if quality care is to be provided at sustainable cost. One of the tasks susceptible to automation in this field is the automatic detection of falls. The research effort devoted to automatic fall detection has been substantial and has produced reliable systems. However, the individuals who could benefit from these systems only consider their use in certain scenarios. A relevant one involves semi-supervised patients who wake up during the night and get out of bed, usually disoriented and feeling an urgent need to go to the toilet. Under these circumstances the person is usually not supervised, and a fall could go unnoticed until the next morning, delaying urgently needed assistance. In this nighttime scenario the patient prioritizes comfort, so the body-worn sensors typical of wearable systems are not a good option. Environmental systems, particularly vision-based ones with cameras deployed in the patient's surroundings, could be the ideal option, but in the low-light conditions of this environment it is necessary to work with far-infrared (FIR) images. This work develops and implements, for the first time, a fall detection system that works with FIR imagery. The system integrates the output of a human pose estimation neural network with a detection methodology that uses the relative movement of the body's most important joints to determine whether a fall has taken place. The pose estimation networks used represent the most relevant architectures in this field and have been trained on the first large public labeled FIR dataset. Thus, we have developed the first vision-based fall detection system working on FIR imagery that can operate in conditions of absolute darkness, with performance indexes equivalent to those of comparable systems working on conventional RGB images.
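The abstract describes the pipeline only at a high level: a pose-estimation network extracts body joints from each FIR frame, and dynamic descriptors of the relative movement of the main joints decide whether a fall has occurred. The sketch below is a minimal, illustrative version of that idea, not the authors' implementation. It assumes COCO-style keypoints already extracted per frame by some pose estimator, and the joint indices, frame rate, and thresholds (vy_thresh, collapse_ratio) are placeholder assumptions rather than values from the paper.

```python
# Illustrative sketch (not the paper's method): given per-frame body keypoints
# extracted from FIR frames by any pose-estimation network, compute simple
# dynamic descriptors of the torso and flag a fall when it drops quickly.
import numpy as np

# COCO-style keypoint indices, assumed for illustration
L_SHOULDER, R_SHOULDER, L_HIP, R_HIP = 5, 6, 11, 12

def torso_center(keypoints):
    """Mean of shoulder and hip keypoints, shape (2,), in pixel coordinates."""
    idx = [L_SHOULDER, R_SHOULDER, L_HIP, R_HIP]
    return keypoints[idx].mean(axis=0)

def fall_descriptors(kp_sequence, fps=10.0):
    """Dynamic descriptors over a sequence of (num_joints, 2) keypoint arrays:
    vertical velocity of the torso center and shoulder-hip vertical extent."""
    centers = np.array([torso_center(kp) for kp in kp_sequence])  # (T, 2)
    dt = 1.0 / fps
    vy = np.gradient(centers[:, 1], dt)  # +y points downward in image coords
    heights = np.array([
        abs(kp[[L_HIP, R_HIP]].mean(axis=0)[1] -
            kp[[L_SHOULDER, R_SHOULDER]].mean(axis=0)[1])
        for kp in kp_sequence
    ])
    return vy, heights

def detect_fall(kp_sequence, fps=10.0, vy_thresh=150.0, collapse_ratio=0.5):
    """Flag a fall when the torso moves downward fast (pixels/s) and the
    shoulder-hip vertical extent collapses. Thresholds are placeholders."""
    vy, heights = fall_descriptors(kp_sequence, fps)
    fast_drop = vy.max() > vy_thresh
    collapsed = heights[-1] < collapse_ratio * heights[0]
    return bool(fast_drop and collapsed)

# Example usage with synthetic keypoints (30 frames, 17 COCO joints):
# seq = [np.random.rand(17, 2) * 480 for _ in range(30)]
# print(detect_fall(seq, fps=10.0))
```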
Description
This is the accepted manuscript of the article. The registered version was first published in IEEE Access, vol. 12, pp. 41659-41675, 2024, and is available online on the publisher's website: https://doi.org/10.1109/ACCESS.2024.3375767
Keywords
computer vision, convolutional neural networks, fall detection, infrared imaging
Citation
J. Gutiérrez et al., "Fall Detection in Low-Illumination Environments From Far-Infrared Images Using Pose Detection and Dynamic Descriptors," in IEEE Access, vol. 12, pp. 41659-41675, 2024, https://doi.org/10.1109/ACCESS.2024.3375767
Center
Faculties and Schools::E.T.S. de Ingenieros Industriales
Department
Ingeniería Eléctrica, Electrónica, Control, Telemática y Química Aplicada a la Ingeniería