Publication:
Transformers analyzing poetry: multilingual metrical pattern prediction with transformer-based language models

dc.contributor.author: Rosa, Javier de la
dc.contributor.author: Pérez Pozo, Álvaro
dc.contributor.author: Sisto, Mirella De
dc.contributor.author: Hernández Lorenzo, Laura
dc.contributor.author: Díaz Paredes, Aitor
dc.contributor.author: Ros Muñoz, Salvador
dc.contributor.author: González Blanco, Elena
dc.date.accessioned: 2024-07-31T09:05:34Z
dc.date.available: 2024-07-31T09:05:34Z
dc.date.issued: 2023
dc.description.abstract (en): The splitting of words into stressed and unstressed syllables is the foundation for the scansion of poetry, a process that aims at determining the metrical pattern of a line of verse within a poem. Intricate language rules and their exceptions, as well as poetic licenses taken by the authors, make calculating these patterns a nontrivial task. Some rhetorical devices shrink the metrical length, while others might extend it. This opens the door to interpretation and further complicates the creation of automated scansion algorithms useful for analyzing corpora in a distant reading fashion. In this paper, we compare the automated metrical pattern identification systems available for Spanish, English, and German against fine-tuned monolingual and multilingual language models trained on the same task. Although these models were initially conceived for semantic tasks, our results suggest that transformer-based models retain enough structural information to perform reasonably well for Spanish in a monolingual setting, and to outperform the existing systems for both English and German when using a model trained on all three languages, showing evidence of the benefits of cross-lingual transfer between the languages.
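To make the setup concrete, below is a minimal sketch of how the metrical pattern prediction task described in the abstract could be framed: fine-tuning a multilingual transformer as a classifier that maps a line of verse to a whole-line stress pattern. This is an illustration only, not the authors' code; the checkpoint name (bert-base-multilingual-cased), the label inventory, and the example line are assumptions.

# Minimal sketch (not the authors' code): metrical pattern prediction framed as
# text classification with a fine-tuned multilingual transformer.
# The checkpoint, label set, and example line below are illustrative assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical label inventory: each class is a whole-line stress pattern,
# with "+" marking stressed and "-" unstressed syllables.
PATTERNS = ["-+---+---+-", "--+--+---+-"]
label2id = {p: i for i, p in enumerate(PATTERNS)}

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=len(PATTERNS)
)

# Toy Spanish line of verse paired with a hypothetical gold pattern.
line = "Un soneto me manda hacer Violante"
inputs = tokenizer(line, return_tensors="pt", truncation=True)
labels = torch.tensor([label2id[PATTERNS[0]]])

# Single training step; in practice this would run over a labeled corpus of
# lines, either monolingual or mixing Spanish, English, and German.
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
print(f"toy training loss: {outputs.loss.item():.4f}")

Treating the entire pattern as a single class keeps the sketch short; a syllable- or token-level sequence labeling formulation would be an equally plausible way to set up the same task.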
dc.description.version: published version
dc.identifier.citation: de la Rosa, J., Pérez, Á., de Sisto, M. et al. Transformers analyzing poetry: multilingual metrical pattern prediction with transformer-based language models. Neural Comput & Applic 35, 18171–18176 (2023). https://doi.org/10.1007/s00521-021-06692-2
dc.identifier.doi: https://doi.org/10.1007/s00521-021-06692-2
dc.identifier.issn: 1433-3058
dc.identifier.uri: https://hdl.handle.net/20.500.14468/23168
dc.journal.title: Neural Computing and Applications
dc.journal.volume: 35
dc.language.iso: es
dc.page.final: 18176
dc.page.initial: 18171
dc.publisher: Springer
dc.relation.center: Facultades y escuelas::Facultad de Filología
dc.relation.department: Literatura Española y Teoría de la Literatura
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.license: Atribución 4.0 Internacional
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/deed.es
dc.subject: 57 Lingüística::5701 Lingüística aplicada::5701.07 Lengua y literatura
dc.subject: 55 Historia::5505 Ciencias auxiliares de la historia::5505.10 Filología
dc.subject.keywords (en): Natural language processing
dc.subject.keywords (en): Language models
dc.subject.keywords (en): Digital humanities
dc.subject.keywords (en): Poetry
dc.title (es): Transformers analyzing poetry: multilingual metrical pattern prediction with transformer-based language models
dc.type (es): artículo
dc.type (en): journal article
dspace.entity.type: Publication
relation.isAuthorOfPublication: be1bfc00-6641-4ae9-91be-440c0945f461
relation.isAuthorOfPublication: d374be8f-46ec-4bc2-9ea1-934f86b2251f
relation.isAuthorOfPublication: 89f01009-fad2-49e0-bed5-fc907ccb16ef
relation.isAuthorOfPublication: d25ad74f-42fc-47ac-911d-1e5515319a58
relation.isAuthorOfPublication.latestForDiscovery: be1bfc00-6641-4ae9-91be-440c0945f461
Files
Original bundle
Name: HernandezLorenzo_Laura_Transformers.pdf
Size: 234.85 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 3.62 KB
Format: Item-specific license agreed to upon submission