Authors: Villaplana Moreno, Aitana; Martínez Unanue, Raquel; Montalvo Herranz, Soto
Date deposited: 2025-02-07
Date of issue: 2023
Citation: Villaplana, A., Martínez, R., & Montalvo, S. (2023). Improving Medical Entity Recognition in Spanish by Means of Biomedical Language Models. Electronics, 12(23), 4872. https://doi.org/10.3390/electronics12234872
ISSN: 2079-9292
DOI: https://doi.org/10.3390/electronics12234872
Handle: https://hdl.handle.net/20.500.14468/25850
Publisher's note: The registered version of this article, first published in "Electronics, 12, 2023", is available online at the publisher's website: MDPI, https://doi.org/10.3390/electronics12234872
Abstract: Named Entity Recognition (NER) is an important task for extracting relevant information from biomedical texts. Pre-trained language models have recently made great progress on this task, particularly in English. However, the performance of pre-trained models in the Spanish biomedical domain has not been evaluated in an experimental framework designed specifically for the task. We present an approach to named entity recognition in Spanish medical texts that uses pre-trained models from the Spanish biomedical domain. We also apply data augmentation techniques to improve the identification of less frequent entities in the dataset. The domain-specific models improved the recognition of named entities in the domain, outperforming all the systems evaluated in the eHealth-KD 2021 challenge. Language models from the biomedical domain appear to be more effective at characterizing the specific terminology involved in this NER task, where most entities belong to the "concept" type, which covers a large number of medical concepts. Regarding data augmentation, only back translation slightly improved the results.
Clearly, the most frequent entity types in the dataset are better identified. Although the domain-specific language models outperformed most of the other models, the multilingual general-purpose model mBERT obtained competitive results.
Language: en
Access: open access (info:eu-repo/semantics/openAccess)
Subject: 33 Technological Sciences
Title: Improving Medical Entity Recognition in Spanish by Means of Biomedical Language Models
Type: article
Keywords: biomedical natural language processing; Spanish biomedical entity recognition; pre-trained language models; data augmentation
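The abstract's per-type findings (frequent entity types are identified better than rare ones) rest on entity-level evaluation over BIO-tagged sequences, the standard scoring scheme for NER shared tasks such as eHealth-KD. The following is a minimal, self-contained sketch of that kind of per-type scoring; the helper names and the "Concept"/"Action" tag labels are illustrative assumptions, not code from the paper.

```python
from collections import defaultdict

def extract_entities(tags):
    """Collect (type, start, end) spans from one BIO tag sequence.

    A span ends when an O tag, a new B- tag, or an I- tag of a
    different type is seen; a stray I- tag opens a new span.
    """
    spans, etype, start = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel flushes the last span
        if tag == "O" or tag.startswith("B-") or (tag.startswith("I-") and tag[2:] != etype):
            if etype is not None:
                spans.append((etype, start, i))
            etype, start = None, None
        if tag.startswith("B-") or (tag.startswith("I-") and etype is None):
            etype, start = tag[2:], i
    return spans

def per_type_f1(gold_sents, pred_sents):
    """Entity-level precision/recall/F1 per entity type (exact span match)."""
    tp, fp, fn = defaultdict(int), defaultdict(int), defaultdict(int)
    for gold_tags, pred_tags in zip(gold_sents, pred_sents):
        gold = set(extract_entities(gold_tags))
        pred = set(extract_entities(pred_tags))
        for etype, _, _ in pred & gold:
            tp[etype] += 1
        for etype, _, _ in pred - gold:
            fp[etype] += 1
        for etype, _, _ in gold - pred:
            fn[etype] += 1
    scores = {}
    for etype in set(tp) | set(fp) | set(fn):
        prec = tp[etype] / (tp[etype] + fp[etype]) if tp[etype] + fp[etype] else 0.0
        rec = tp[etype] / (tp[etype] + fn[etype]) if tp[etype] + fn[etype] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        scores[etype] = (prec, rec, f1)
    return scores

# Illustrative usage: the predictor finds the frequent "Concept" span
# but misses the rarer "Action" span, so only "Concept" scores F1 = 1.0.
gold = [["B-Concept", "I-Concept", "O", "B-Action"]]
pred = [["B-Concept", "I-Concept", "O", "O"]]
scores = per_type_f1(gold, pred)
```

Scoring per type rather than micro-averaging over all entities is what exposes the gap the abstract describes: a high overall F1 can hide near-zero recall on infrequent types, which is exactly what data augmentation (e.g., back translation) tries to address.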