Person:
Plaza Agudo, Inmaculada

ORCID
0000-0002-2295-4361
Surname
Plaza Agudo
First name
Inmaculada

Search results

Showing 1 - 1 of 1
  • Publication
    Fall Detection in Low-illumination Environments from Far-infrared Images Using Pose Detection and Dynamic Descriptors
    (IEEE Xplore, 2024-03-18) Martín Gutiérrez, Sergio; Rodriguez, Victor; Albiol, Sergio; Plaza Agudo, Inmaculada; Medrano, Carlos; Martinez, Javier
    In an increasingly aging world, the effort to automate tasks associated with the care of elderly dependent individuals becomes ever more relevant if quality care is to be provided at sustainable costs. One of the tasks amenable to automation in this field is the automatic detection of falls. The research effort devoted to developing automatic fall detection systems has been substantial and has produced reliable systems. However, individuals who could benefit from these systems only consider their use in certain scenarios. A particularly relevant one involves semi-supervised patients who wake up during the night and get out of bed, usually disoriented and feeling an urgent need to go to the toilet. Under these circumstances the person is usually not supervised, and a fall could go unnoticed until the next morning, delaying the arrival of urgently needed assistance. In this scenario, associated with nighttime rest, the patient prioritizes comfort, so the body-worn sensors typical of wearable systems are not a good option. Environmental systems, particularly vision-based ones with cameras deployed in the patient's environment, could be the ideal option. However, in the low-light conditions of this environment it is necessary to work with far-infrared (FIR) images. This work develops and implements, for the first time, a fall detection system that works with FIR imagery. The system integrates the output of a human pose estimation neural network with a detection methodology that uses the relative movement of the body's most important joints to determine whether a fall has taken place. The pose estimation networks used represent the most relevant architectures in this field and have been trained on the first large public labeled FIR dataset. Thus, we have developed the first vision-based fall detection system working on FIR imagery that is able to operate in absolute darkness, with performance indices equivalent to those of comparable systems working on conventional RGB images.
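
    A minimal sketch of the kind of pipeline the abstract describes: a pose network produces per-frame joint coordinates from FIR frames, and a simple dynamic descriptor of relative joint movement is thresholded to decide whether a fall occurred. The keypoint indices, the descriptor (normalized hip drop speed), the threshold, and the function names below are illustrative assumptions, not the paper's actual definitions.

        import numpy as np

        # Hypothetical COCO-style keypoint indices; the paper's exact joint set and
        # descriptors are not specified in the abstract, so these are illustrative.
        L_SHOULDER, R_SHOULDER, L_HIP, R_HIP = 5, 6, 11, 12

        def torso_length(kp):
            # Shoulder-midpoint to hip-midpoint distance, used as a body-scale reference.
            shoulders = (kp[L_SHOULDER] + kp[R_SHOULDER]) / 2.0
            hips = (kp[L_HIP] + kp[R_HIP]) / 2.0
            return np.linalg.norm(shoulders - hips)

        def hip_drop_speed(pose_seq, fps=15.0):
            # pose_seq: array of shape (T, K, 2) with per-frame (x, y) keypoints in
            # image coordinates (y grows downward), e.g. the output of a pose network
            # run on consecutive FIR frames.
            hips = (pose_seq[:, L_HIP] + pose_seq[:, R_HIP]) / 2.0
            scale = np.median([torso_length(kp) for kp in pose_seq]) + 1e-6
            # Frame-to-frame vertical displacement, normalized by body scale and
            # expressed in body lengths per second; the peak value is the descriptor.
            vy = np.diff(hips[:, 1]) * fps / scale
            return float(vy.max())

        def detect_fall(pose_seq, fps=15.0, speed_thresh=3.0):
            # Flag a fall when the hips drop faster than speed_thresh body lengths
            # per second; the threshold is an arbitrary illustrative value.
            return hip_drop_speed(pose_seq, fps) > speed_thresh

    In a real system such descriptors would be computed over a sliding window of pose detections and combined across several joints; the sketch only illustrates how relative joint movement can be turned into a binary fall decision.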