In this work, we propose a supervised no-reference (NR) Image Quality Assessment (IQA) model for the objective evaluation of the perceptual quality of 3D virtual reality (VR) images. To build such a practical algorithm, we first study the scene statistics of the saliency maps of the individual left and right views of VR images, and empirically model these statistics with a Univariate Generalized Gaussian Distribution (UGGD). We compute the UGGD model parameters over a multi-scale, multi-orientation steerable subband decomposition and use these parameters as distortion-discriminative features. We then compute the entropy and normalized root mean square value of each subband, and use these values as weights to pool the individual-view features. In addition, we apply the popular supervised 2D BRISQUE model to the individual views to estimate the overall spatial quality of the VR images. In the final step of the algorithm, the predicted saliency-based score and the spatial BRISQUE score are pooled to derive the final quality score of a VR image. The performance of the proposed model is evaluated on the popular LIVE 3D VR IQA database. The results indicate robust and competitive performance against off-the-shelf 2D full-reference and NR supervised algorithms.
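The UGGD feature extraction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it fits a zero-mean generalized Gaussian to subband coefficients by the standard moment-matching search (as popularized by BRISQUE-style NSS features), and computes the entropy and normalized RMS of a subband as candidate pooling weights. The function names, the histogram bin count, and the exact normalization of the RMS term are assumptions made for illustration.

```python
import numpy as np
from scipy.special import gamma as gamma_fn


def fit_uggd(x):
    """Moment-matching fit of a zero-mean univariate GGD (UGGD).

    Returns (alpha, sigma): shape and scale parameters. The shape is
    found by matching the moment ratio E[x^2] / E[|x|]^2 against a
    fine grid of candidate shapes (standard NSS/BRISQUE practice).
    """
    x = np.asarray(x, dtype=np.float64).ravel()
    sigma_sq = np.mean(x ** 2)
    e_abs = np.mean(np.abs(x))
    rho = sigma_sq / (e_abs ** 2 + 1e-12)
    alphas = np.arange(0.2, 10.0, 0.001)
    # theoretical moment ratio r(alpha) = G(1/a) G(3/a) / G(2/a)^2
    r = gamma_fn(1.0 / alphas) * gamma_fn(3.0 / alphas) / gamma_fn(2.0 / alphas) ** 2
    alpha = alphas[np.argmin((r - rho) ** 2)]
    return alpha, np.sqrt(sigma_sq)


def subband_weights(sub):
    """Entropy and normalized RMS of a subband, usable as pooling weights.

    Illustrative only: the paper's exact entropy estimator and RMS
    normalization are not specified in the abstract.
    """
    sub = np.asarray(sub, dtype=np.float64).ravel()
    hist, _ = np.histogram(sub, bins=64)
    p = hist / (hist.sum() + 1e-12)
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    nrms = np.sqrt(np.mean(sub ** 2)) / (np.abs(sub).max() + 1e-12)
    return entropy, nrms
```

As a sanity check, Gaussian-distributed coefficients should recover a shape parameter near 2 and Laplacian-distributed ones a shape near 1, which is why the GGD family is flexible enough to track distortion-induced changes in subband statistics.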