Size does matter. Comparing the results of a lab and a crowdsourcing file download QoE study

Andreas Sackl, Bruno Gardlo, Raimund Schatz


Over the last couple of years, crowdsourcing has become a widely used method for conducting subjective QoE experiments over the Internet. However, the scope of crowdsourced QoE experiments so far has been mostly limited to video and image quality testing, despite the existence of many other relevant application categories. In this paper we demonstrate the applicability of crowdsourced QoE testing to the case of file downloads. We conducted several campaigns in which participants had to download large (10–50 MB) media files (with defined waiting times) and subsequently rate their QoE. The results are compared with those of a lab-based file download QoE study featuring an equivalent design. Our results show that crowdsourced QoE testing can also be applied to file downloads with a size of 10 MB, as the rating results are very similar to those obtained in the lab. However, beyond user reliability checks and filtering, we found the study design to be a highly critical element, as it exerted a strong influence on overall participant behavior. For this reason we also present a discussion of valuable lessons learned in terms of test design and participant behavior.


DOI: 10.21437/PQS.2016-27

Cite as: Sackl, A., Gardlo, B., Schatz, R. (2016) Size does matter. Comparing the results of a lab and a crowdsourcing file download QoE study. Proc. 5th ISCA/DEGA Workshop on Perceptual Quality of Systems (PQS 2016), 127-131, DOI: 10.21437/PQS.2016-27.


@inproceedings{Sackl2016,
  author={Andreas Sackl and Bruno Gardlo and Raimund Schatz},
  title={Size does matter. Comparing the results of a lab and a crowdsourcing file download QoE study},
  year=2016,
  booktitle={Proc. 5th ISCA/DEGA Workshop on Perceptual Quality of Systems (PQS 2016)},
  pages={127--131},
  doi={10.21437/PQS.2016-27},
  url={http://dx.doi.org/10.21437/PQS.2016-27}
}