Paper details

Title: A Comparative Study of Typing and Speech For Map Metadata Creation

Authors: Pei-Chun Lai, Auriol Degbelo

Abstract (obtained from CrossRef):

Metadata is key to effective knowledge organization, and designing user interfaces that maximize user performance and user experience during metadata creation would benefit several areas of GIScience. Yet, empirically-derived guidelines for user interfaces supporting GI-metadata creation are still scarce. As a step towards mitigating that gap, this work has implemented and evaluated a prototype that produces semantically-rich metadata for web maps via one of two input modalities: typing or speech. A controlled experiment (N=12) to investigate the merits of both modalities has revealed that (i) typing and speech were comparable as far as input duration time is concerned; and (ii) they received opposed ratings concerning their pragmatic and hedonic qualities. Combining both might thus be beneficial for GI-metadata creation user interfaces. The findings are useful to ongoing work on semantic enablement for spatial data infrastructure and note-taking during visual analytics.

Codecheck details

Certificate identifier: 2021-005

Codechecker names: Frank Ostermann, Daniel Nüst

Time of codecheck: 2021-06-10 12:00:00

Repository: https://osf.io/7fqtm

Codecheck report: https://doi.org/10.17605/osf.io/7fqtm

Summary:

The paper presents the results of a user experiment comparing typing and speech for GI-metadata creation. A complete reproduction is practically impossible to achieve, since it would require re-running the experiment with human participants. This reproducibility report therefore investigated two components: first, whether sufficient information is provided to replicate the experiment elsewhere with a different group of participants; second, whether sufficient information is provided to reproduce the analysis of the experimental results. The conclusion is positive for both. The original prototype is accessible online at the time of this writing. The provided input data, R code, and Excel spreadsheet yield the same results as reported in the paper, and the prototype used in the experiment could be run locally.


https://codecheck.org.uk/ | GitHub codecheckers

© Stephen Eglen & Daniel Nüst

Published under CC BY-SA 4.0

DOI of Zenodo Deposit

CODECHECK is a process for independent execution of computations underlying scholarly research articles.