Paper
Parameterized framework for the analysis of visual quality assessments using crowdsourcing
17 March 2015
Anthony Fremuth, Velibor Adzic, Hari Kalva
Proceedings Volume 9394, Human Vision and Electronic Imaging XX; 93940C (2015) https://doi.org/10.1117/12.2080661
Event: SPIE/IS&T Electronic Imaging, 2015, San Francisco, California, United States
Abstract
The ability to assess the quality of new multimedia tools and applications relies heavily on the perception of the end user. To quantify that perception, subjective tests are required to evaluate the effectiveness of new technologies. However, standard subjective user studies require a highly controlled test environment and are costly in both money and time. To circumvent these issues we utilize crowdsourcing platforms such as CrowdFlower and Amazon's Mechanical Turk. The reliability of the results depends on factors that are not controlled and can be considered "hidden". We use a pre-test survey to collect responses from subjects that reveal some of these hidden factors. Using statistical analysis, we build a parameterized model that allows proper adjustment of the collected test scores.
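The adjustment step described above can be pictured as a regression of raw scores on survey-derived factors, followed by removal of the estimated factor-induced bias. The sketch below is a minimal Python illustration of that idea; the linear form, the factor names, and all numbers are assumptions for demonstration, not the paper's actual model or data.

```python
import numpy as np

# Hypothetical sketch: each worker's raw score is modeled as the true
# quality plus a linear combination of "hidden" factors reported in a
# pre-test survey. Factor names and the linear form are assumptions.
rng = np.random.default_rng(0)

n_workers = 200
true_quality = 3.5  # assumed ground-truth MOS of a test clip (1-5 scale)

# Survey factor matrix X: columns = [small screen, bright room] (made up)
X = rng.integers(0, 2, size=(n_workers, 2)).astype(float)
bias = np.array([-0.6, -0.3])  # assumed per-factor score offsets

raw_scores = true_quality + X @ bias + rng.normal(0, 0.4, n_workers)

# Fit the parameterized model: scores ~ intercept + X @ beta (ordinary
# least squares), then subtract the estimated factor-induced bias.
A = np.column_stack([np.ones(n_workers), X])
coef, *_ = np.linalg.lstsq(A, raw_scores, rcond=None)
beta = coef[1:]
adjusted_scores = raw_scores - X @ beta

print(f"raw mean      : {raw_scores.mean():.2f}")
print(f"adjusted mean : {adjusted_scores.mean():.2f} (true MOS = {true_quality})")
```

On this synthetic data the adjusted mean recovers the assumed true MOS; the paper's framework generalizes the same adjustment across the factors actually observed in its surveys.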
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Anthony Fremuth, Velibor Adzic, and Hari Kalva "Parameterized framework for the analysis of visual quality assessments using crowdsourcing", Proc. SPIE 9394, Human Vision and Electronic Imaging XX, 93940C (17 March 2015); https://doi.org/10.1117/12.2080661
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Video, Statistical analysis, Data modeling, Process modeling, Visualization, Reliability, Visual analytics