
Two-Level Approach for No-Reference Consumer Video Quality Assessment.

Jari Korhonen.   

Abstract

Smartphones and other consumer devices capable of capturing video content and sharing it on social media in nearly real time are widely available at a reasonable cost. Thus, there is a growing need for no-reference video quality assessment (NR-VQA) of consumer-produced video content, typically characterized by capture impairments that are qualitatively different from those observed in professionally produced video content. To date, most NR-VQA models in the prior art have been developed for assessing coding and transmission distortions, rather than capture impairments. In addition, the most accurate NR-VQA methods known in the prior art are often computationally complex, and therefore impractical for many real-life applications. In this paper, we propose a new approach for learning-based video quality assessment, based on the idea of computing features on two levels: low-complexity features are computed for the full sequence first, and then high-complexity features are extracted from a subset of representative video frames, selected by using the low-complexity features. We have compared the proposed method against several relevant benchmark methods using three recently published annotated public video quality databases, and our results show that the proposed method can predict subjective video quality more accurately than the benchmark methods. The best-performing prior method achieves comparable accuracy, but at substantially higher computational cost.
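The two-level idea in the abstract, i.e. cheap features computed over the whole sequence that guide where to spend expensive computation, can be sketched as follows. This is only an illustrative outline: the specific features (temporal difference, contrast, gradient magnitude), the selection rule, and all function names are assumptions for the example, not the feature set used in the paper.

```python
import numpy as np

def low_complexity_features(frames):
    """Cheap per-frame statistics over the full sequence.
    Illustrative proxies (temporal activity, contrast), not the paper's features."""
    feats, prev = [], None
    for f in frames:
        temporal = 0.0 if prev is None else float(np.mean(np.abs(f - prev)))
        contrast = float(np.std(f))
        feats.append((temporal, contrast))
        prev = f
    return np.array(feats)

def select_representative_frames(feats, k):
    """Pick the k frames with highest temporal activity; a hypothetical
    stand-in for the paper's low-complexity selection rule."""
    order = np.argsort(-feats[:, 0])
    return sorted(order[:k].tolist())

def high_complexity_feature(frame):
    """Placeholder for an expensive descriptor, here mean gradient magnitude."""
    gx, gy = np.gradient(frame)
    return float(np.mean(np.hypot(gx, gy)))

def two_level_features(frames, k=2):
    """Level 1 on every frame, level 2 only on the selected subset."""
    low = low_complexity_features(frames)
    idx = select_representative_frames(low, k)
    high = [high_complexity_feature(frames[i]) for i in idx]
    return low, idx, high
```

In a learning-based pipeline such as the one the abstract describes, the pooled low- and high-level features would then be fed to a regressor trained on subjective quality scores; the point of the structure above is that the expensive descriptor runs on only k frames instead of all of them.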

Year:  2019        PMID: 31247551     DOI: 10.1109/TIP.2019.2923051

Source DB:  PubMed          Journal:  IEEE Trans Image Process        ISSN: 1057-7149            Impact factor:   10.856


  2 in total

1.  Research on Video Quality Evaluation of Sparring Motion Based on BPNN Perception.

Authors:  Zhao Changbi; Wang Jinjuan; Ke Li
Journal:  Comput Intell Neurosci       Date:  2021-12-27

2.  Critical analysis on the reproducibility of visual quality assessment using deep features.

Authors:  Franz Götz-Hahn; Vlad Hosu; Dietmar Saupe
Journal:  PLoS One       Date:  2022-08-16       Impact factor: 3.752


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.