Publication:
Is Anisotropy Really the Cause of BERT Embeddings not being Semantic?

dc.contributor.author: Fuster Baggetto, Alejandro
dc.contributor.director: Fresno Fernández, Víctor
dc.date.accessioned: 2024-05-20T12:26:37Z
dc.date.available: 2024-05-20T12:26:37Z
dc.date.issued: 2022-09-01
dc.description.abstract: We conduct a set of experiments aimed at improving our understanding of the lack of semantic isometry (correspondence between the embedding and meaning spaces) of the contextual word embeddings of BERT. Our empirical results show that, contrary to popular belief, anisotropy is not the root cause of the poor performance of these contextual models' embeddings in semantic tasks. What does affect both anisotropy and semantic isometry is a set of biased tokens that distort the space with non-semantic information. For each bias category (frequency, subword, punctuation, and case), we measure its magnitude and the effect of its removal. We show that these biases contribute to, but do not completely explain, the anisotropy and lack of semantic isometry of these models. Therefore, we hypothesise that the discovery of new biases will contribute to the goal of correcting the representation degradation problem. Finally, we propose a new similarity method aimed at smoothing the negative effect of biased tokens on semantic isometry and increasing the explainability of semantic similarity scores. We conduct an in-depth experimental study of this method, analysing its strengths and weaknesses, and propose future applications for it.
dc.description.version: final version
dc.identifier.uri: https://hdl.handle.net/20.500.14468/14260
dc.language.iso: en
dc.publisher: Universidad Nacional de Educación a Distancia (España). Escuela Técnica Superior de Ingeniería Informática. Departamento de Inteligencia Artificial
dc.relation.center: Facultades y escuelas::E.T.S. de Ingeniería Informática
dc.relation.degree: Máster universitario en Investigación en Inteligencia Artificial
dc.relation.department: Inteligencia Artificial
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/deed.es
dc.subject.keywords: semantic textual similarity
dc.subject.keywords: sentence embeddings
dc.subject.keywords: transformers
dc.subject.keywords: natural language processing
dc.subject.keywords: deep learning
dc.title: Is Anisotropy Really the Cause of BERT Embeddings not being Semantic?
dc.type: master thesis
dspace.entity.type: Publication
Files:
- Fuster_Baggetto_Alejandro_TFM.pdf (1.67 MB, Adobe Portable Document Format)