Publication:
Reviewing and analyzing peer review Inter-Rater Reliability in a MOOC platform

dc.contributor.author: García Loro, Félix
dc.contributor.author: Martín Gutiérrez, Sergio
dc.contributor.author: Ruipérez Valiente, José A.
dc.contributor.author: Sancristobal, Elio
dc.contributor.author: Castro Gil, Manuel Alonso
dc.date.accessioned: 2024-12-19T13:08:00Z
dc.date.available: 2024-12-19T13:08:00Z
dc.date.issued: 2020-09
dc.description: This is the accepted manuscript of the article. The registered version was first published in Computers & Education, Volume 154, 2020, 103894, ISSN 0360-1315, and is available online at the publisher's website: https://doi.org/10.1016/j.compedu.2020.103894
dc.description.abstract: Peer assessment activities might be one of the few personalized assessment alternatives to the implementation of auto-graded activities at scale in Massive Open Online Course (MOOC) environments. However, teachers' motivation to implement peer assessment activities in their courses might go beyond the most straightforward goal (i.e., assessment), as these activities also have side benefits, such as providing evidence of, and enhancing, students' critical thinking, comprehension, and writing capabilities. One of the main drawbacks of implementing peer review activities, however, especially when the scoring is meant to be used as part of the summative assessment, is that it adds a high degree of uncertainty to the grades. Motivated by this issue, this paper analyzes the reliability of all the peer assessment activities performed on UNED-COMA, the MOOC platform of the Spanish University for Distance Education (UNED). The study analyzed 63 peer assessment activities from the different courses on the platform, comprising a total of 27,745 validated tasks and 93,334 peer reviews. Based on Krippendorff's alpha statistic, which measures the agreement reached between reviewers, the results clearly point to the low reliability, and therefore the low validity, of this dataset of peer reviews. We did not find that factors such as the topic of the course, the number of raters, or the number of criteria to be evaluated had a significant effect on reliability. We compare our results with other studies, discuss the potential implications of this low reliability for summative assessment, and provide some recommendations for maximizing the benefit of implementing peer activities in online courses.
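
The agreement statistic named in the abstract can be illustrated in a few lines. Below is a minimal Python sketch of Krippendorff's alpha for interval-level scores with missing ratings; the function name krippendorff_alpha_interval and the toy score matrix are illustrative assumptions, as the article's own analysis code is not part of this record.

import numpy as np

# Illustrative sketch (not the authors' code): Krippendorff's alpha for
# interval-level scores, alpha = 1 - D_o / D_e, where D_o is the observed
# and D_e the expected disagreement under the squared-difference metric.
def krippendorff_alpha_interval(ratings):
    # ratings: 2-D array of shape (raters, items); np.nan marks a missing score.
    ratings = np.asarray(ratings, dtype=float)
    # Keep only items scored by at least two raters (pairable values).
    units = [col[~np.isnan(col)] for col in ratings.T]
    units = [u for u in units if len(u) >= 2]
    n = sum(len(u) for u in units)  # total number of pairable values
    # Observed disagreement: squared differences within each item.
    d_o = sum(((u[:, None] - u[None, :]) ** 2).sum() / (len(u) - 1)
              for u in units) / n
    # Expected disagreement: squared differences across all pairable values.
    values = np.concatenate(units)
    d_e = ((values[:, None] - values[None, :]) ** 2).sum() / (n * (n - 1))
    return 1.0 - d_o / d_e

# Hypothetical data: 3 reviewers scoring 4 tasks, with two missing scores.
scores = np.array([
    [8.0, 6.0, np.nan, 5.0],
    [8.0, 7.0, 4.0,    5.0],
    [7.0, 6.0, 4.0,    np.nan],
])
print(krippendorff_alpha_interval(scores))  # 1.0 = perfect agreement; ~0 = chance-level

A maintained third-party implementation, which also covers nominal, ordinal, and ratio data, is available as the krippendorff package on PyPI.
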
dc.description.version: final version
dc.identifier.citation: Felix Garcia-Loro, Sergio Martin, José A. Ruipérez-Valiente, Elio Sancristobal, Manuel Castro, Reviewing and analyzing peer review Inter-Rater Reliability in a MOOC platform, Computers & Education, Volume 154, 2020, 103894, ISSN 0360-1315, https://doi.org/10.1016/j.compedu.2020.103894.
dc.identifier.doi: https://doi.org/10.1016/j.compedu.2020.103894
dc.identifier.issn: 0360-1315
dc.identifier.uri: https://hdl.handle.net/20.500.14468/25005
dc.journal.title: Computers & Education
dc.journal.volume: 154
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.center: Faculties and schools::E.T.S. de Ingenieros Industriales
dc.relation.department: Ingeniería Eléctrica, Electrónica, Control, Telemática y Química Aplicada a la Ingeniería
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/deed.es
dc.subject: 33 Technological Sciences
dc.subject.keywords: Peer assessment
dc.subject.keywords: MOOCs
dc.subject.keywords: Krippendorff's alpha
dc.subject.keywords: Inter-rater reliability (IRR)
dc.subject.keywords: Reliability
dc.title: Reviewing and analyzing peer review Inter-Rater Reliability in a MOOC platform
dc.type: article
dc.type: journal article
dspace.entity.type: Publication
relation.isAuthorOfPublication: 91c8c018-1ced-44f0-882e-636353db3118
relation.isAuthorOfPublication: 634efe13-d6e9-4ddb-b4e6-565d60469512
relation.isAuthorOfPublication: 6b0b2577-77e7-4764-ba63-f9df1a54ebff
relation.isAuthorOfPublication.latestForDiscovery: 91c8c018-1ced-44f0-882e-636353db3118
Files
Original bundle
Name: Reviewing-and-Analyzing-Peer-Review_Garcia-Loro.pdf
Size: 3.13 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 3.62 KB
Format: Item-specific license agreed to upon submission