QualityCrowd - Crowdsourcing for Subjective Video Quality Tests

Abstract

Despite continuing research on the development of better quality metrics, subjective tests are still indispensable for the assessment of video quality. These tests are both time-consuming and expensive and require a suitable laboratory that fulfills the corresponding ITU recommendations. This thesis examines the use of crowdsourcing for conducting such tests over the internet, comparing the results of an internet-based test with the results of conventional laboratory tests. For this purpose the web-based software QualityCrowd was developed, which allows the simple planning and conducting of subjective tests. The software uses Amazon’s crowdsourcing platform Mechanical Turk to assign the assessment of the videos to the crowd; Amazon provides the infrastructure for distributing large numbers of almost any task and for paying the workers afterwards. Another aspect is the evaluation of the technical issues that arise from an internet-based video test. In particular, the problems concerning the compression, delivery, and playback of the videos in the participants’ browsers are discussed. After considering the various options, a decision is made in favour of lossless compression using H.264/AVC and playback with Adobe’s Flash Player. The gathered data show very high correlation with the data from the laboratory tests they are compared with. Although there are some significant deviations, the results are generally promising and indicate that crowdsourcing is suitable for subjective video tests. Even though the test could not be conducted publicly with paid workers, the costs of such a test are estimated; compared with conventional laboratory tests, a substantial cost reduction can be achieved.

Type