Publication: Transformers analyzing poetry: multilingual metrical pattern prediction with transfomer-based language models
dc.contributor.author | Rosa, Javier de la | |
dc.contributor.author | Pérez Pozo, Álvaro | |
dc.contributor.author | Sisto, Mirella De | |
dc.contributor.author | Hernández Lorenzo, Laura | |
dc.contributor.author | Díaz Paredes, Aitor | |
dc.contributor.author | Ros Muñoz, Salvador | |
dc.contributor.author | González Blanco, Elena | |
dc.date.accessioned | 2024-07-31T09:05:34Z | |
dc.date.available | 2024-07-31T09:05:34Z | |
dc.date.issued | 2023 | |
dc.description.abstract | The splitting of words into stressed and unstressed syllables is the foundation for the scansion of poetry, a process that aims at determining the metrical pattern of a line of verse within a poem. Intricate language rules and their exceptions, as well as poetic licenses taken by the authors, make calculating these patterns a nontrivial task. Some rhetorical devices shrink the metrical length, while others might extend it. This opens the door to interpretation and further complicates the creation of automated scansion algorithms useful for analyzing corpora in a distant reading fashion. In this paper, we compare the automated metrical pattern identification systems available for Spanish, English, and German against fine-tuned monolingual and multilingual language models trained on the same task. Although initially conceived as models suited to semantic tasks, our results suggest that transformer-based models retain enough structural information to perform reasonably well for Spanish in a monolingual setting, and to outperform the existing systems for both English and German when using a model trained on the three languages, showing evidence of the benefits of cross-lingual transfer between the languages. | en |
dc.description.version | published version | |
dc.identifier.citation | de la Rosa, J., Pérez, Á., de Sisto, M. et al. Transformers analyzing poetry: multilingual metrical pattern prediction with transfomer-based language models. Neural Comput & Applic 35, 18171–18176 (2023). https://doi.org/10.1007/s00521-021-06692-2 | |
dc.identifier.doi | https://doi.org/10.1007/s00521-021-06692-2 | |
dc.identifier.issn | 1433-3058 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14468/23168 | |
dc.journal.title | Neural Computing and Applications | |
dc.journal.volume | 35 | |
dc.language.iso | en | |
dc.page.final | 18176 | |
dc.page.initial | 18171 | |
dc.publisher | Springer | |
dc.relation.center | Facultades y escuelas::Facultad de Filología | |
dc.relation.department | Literatura Española y Teoría de la Literatura | |
dc.rights | info:eu-repo/semantics/openAccess | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/deed.es | |
dc.subject | 57 Lingüística::5701 Lingüística aplicada::5701.07 Lengua y literatura | |
dc.subject | 55 Historia::5505 Ciencias auxiliares de la historia::5505.10 Filología | |
dc.subject.keywords | Natural language processing | en |
dc.subject.keywords | Language models | en |
dc.subject.keywords | Digital humanities | en |
dc.subject.keywords | Poetry | en |
dc.title | Transformers analyzing poetry: multilingual metrical pattern prediction with transfomer-based language models | es |
dc.type | artículo | es |
dc.type | journal article | en |
dspace.entity.type | Publication | |
relation.isAuthorOfPublication | be1bfc00-6641-4ae9-91be-440c0945f461 | |
relation.isAuthorOfPublication | d374be8f-46ec-4bc2-9ea1-934f86b2251f | |
relation.isAuthorOfPublication | 89f01009-fad2-49e0-bed5-fc907ccb16ef | |
relation.isAuthorOfPublication | d25ad74f-42fc-47ac-911d-1e5515319a58 | |
relation.isAuthorOfPublication.latestForDiscovery | be1bfc00-6641-4ae9-91be-440c0945f461 |
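The abstract above describes fine-tuning monolingual and multilingual language models to predict the metrical (stress) pattern of a verse line. The sketch below is a rough illustration only of how such a setup could look with the Hugging Face Transformers library: the checkpoint name (bert-base-multilingual-cased), the toy verse lines, their stress annotations, and the whole-pattern-as-a-single-label framing are assumptions made here for illustration, not necessarily the configuration used in the paper.

```python
# Illustrative sketch only: fine-tune a multilingual transformer to map a verse
# line to its metrical (stress) pattern. Checkpoint, data, and task framing are
# assumptions, not the paper's actual setup.
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Toy examples: hypothetical stress annotations ("+" stressed, "-" unstressed).
lines = ["En tanto que de rosa y azucena",
         "Shall I compare thee to a summer's day?"]
patterns = ["-+---+---+-", "-+-+-+-+-+"]

# Treat each full pattern string as one class label.
label2id = {p: i for i, p in enumerate(sorted(set(patterns)))}
id2label = {i: p for p, i in label2id.items()}

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased",
    num_labels=len(label2id), label2id=label2id, id2label=id2label,
)

class VerseDataset(torch.utils.data.Dataset):
    """Wraps (verse line, pattern) pairs as tokenized classification examples."""
    def __init__(self, texts, labels):
        self.enc = tokenizer(texts, truncation=True, padding=True)
        self.labels = [label2id[l] for l in labels]
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="metrical-bert",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=VerseDataset(lines, patterns),
)
trainer.train()
```

Framing the whole pattern as a single label keeps the sketch short; a per-syllable (token-level) labeling scheme would be an equally plausible alternative framing.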
Files
Original bundle
- Name:
- HernandezLorenzo_Laura_Transformers.pdf
- Size:
- 234.85 KB
- Format:
- Adobe Portable Document Format
License bundle
- Name:
- license.txt
- Size:
- 3.62 KB
- Format:
- Item-specific license agreed to upon submission