Working Group Details
HFVE_WG - Human Factors for Visual Experiences Working Group
P3333.1.3 - Standard for the Deep Learning-Based Assessment of Visual Experience Based on Human Factors
This standard defines deep learning-based metrics for content analysis and quality-of-experience (QoE) assessment of visual contents. It extends IEEE Std 3333.1.1 (Standard for the Quality of Experience (QoE) and Visual-Comfort Assessments of Three-Dimensional (3D) Contents Based on Psychophysical Studies) and IEEE 3333.1.2 (Standard for the Perceptual Quality Assessment of Three Dimensional (3D) and Ultra High Definition (UHD) Contents). The scope covers the following:

* Deep learning models for QoE assessment (multilayer perceptrons, convolutional neural networks, deep generative models)
* Deep metrics of visual experience for High Definition (HD), UHD, 3D, High Dynamic Range (HDR), Virtual Reality (VR), and Mixed Reality (MR) contents
* Deep analysis of clinical data (electroencephalogram (EEG), electrocardiogram (ECG), electrooculography (EOG), and so on) and psychophysical data (subjective tests and the simulator sickness questionnaire (SSQ)) for QoE assessment
* Deep personalized preference assessment of visual contents
* Building image and video databases for performance benchmarking, if necessary
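To make the first bullet concrete, the sketch below shows a minimal multilayer perceptron mapping a handful of content features to a QoE score. It is purely illustrative: the feature names, layer sizes, and random weights are assumptions for demonstration and are not defined by P3333.1.3; a real model would be trained on subjective-test data.

```python
import numpy as np


def relu(x):
    return np.maximum(0.0, x)


class QoEMLP:
    """Toy multilayer perceptron mapping content features to a QoE score.

    Illustrative only: the architecture and the (untrained, random)
    weights are assumptions, not part of IEEE P3333.1.3.
    """

    def __init__(self, in_dim=4, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.5, (hidden, 1))
        self.b2 = np.zeros(1)

    def predict(self, features):
        h = relu(features @ self.W1 + self.b1)
        # Sigmoid squashes the raw output into [0, 1]; in practice this
        # would be rescaled to a 1-5 mean-opinion-score range.
        return 1.0 / (1.0 + np.exp(-(h @ self.W2 + self.b2)))


# Hypothetical input features (e.g., sharpness, contrast, disparity, flicker).
features = np.array([0.8, 0.6, 0.3, 0.1])
score = QoEMLP().predict(features)
```

A trained version of such a regressor would replace the random weights with parameters fitted to subjective scores collected under the psychophysical protocols the standard references.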
P3333.1.1 - Standard for Quality of Experience (QoE) and Visual-Comfort Assessments of Three-Dimensional (3D) Contents Based on Psychophysical Studies
This standard establishes various traditional and deep learning-based methods for visual saliency prediction, visual content analysis, and subjective assessment for quantifying the visual discomfort and quality of experience (QoE) of 3D images and videos.
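As an example of the traditional (non-deep) end of the saliency-prediction methods mentioned above, the sketch below implements the classical spectral-residual approach of Hou and Zhang (2007). It is a baseline technique, not a method specified by P3333.1.1; the image size and smoothing-kernel width are illustrative choices.

```python
import numpy as np


def spectral_residual_saliency(image, kernel=3):
    """Spectral-residual saliency (Hou & Zhang, 2007).

    A classical non-deep baseline: the saliency map is recovered from the
    residual of the log-amplitude spectrum after local averaging.
    """
    f = np.fft.fft2(image.astype(np.float64))
    log_amp = np.log1p(np.abs(f))
    phase = np.angle(f)

    # Box-filter the log-amplitude spectrum to get its local average.
    pad = kernel // 2
    padded = np.pad(log_amp, pad, mode="edge")
    avg = np.zeros_like(log_amp)
    for dy in range(kernel):
        for dx in range(kernel):
            avg += padded[dy:dy + log_amp.shape[0], dx:dx + log_amp.shape[1]]
    avg /= kernel * kernel

    # The residual keeps the "unexpected" spectral content; inverting it
    # with the original phase yields the saliency map.
    residual = log_amp - avg
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / sal.max()


# Usage with a synthetic image (a real pipeline would load a frame here).
rng = np.random.default_rng(0)
sal_map = spectral_residual_saliency(rng.random((64, 64)))
```

Deep saliency predictors covered by the standard replace this hand-crafted spectral model with learned convolutional features, but baselines like this remain useful for benchmarking.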
P3333.1.2 - Standard for the Perceptual Quality Assessment of Three Dimensional (3D), Ultra High Definition (UHD) and High Dynamic Range (HDR) Contents
This standard establishes methods for quality assessment of 3D, UHD, and HDR contents based on physiological mechanisms such as perceptual quality and visual attention. It identifies and quantifies the following:

- Causes of perceptual quality degradation, and the associated visual attention, for 3D, UHD, and HDR image and video contents:
  - Compression distortion, such as multi-view image and video compression
  - Interpolation distortion from intermediate-view rendering, such as 3D, UHD, and HDR warping and view synthesis
  - Structural distortion, such as bit errors from wireless/wired transmission
- Visual attention according to the quality degradation
- Deep learning-based models for saliency detection and QoE assessment

Key items are needed to characterize the 3D, UHD, and HDR databases in terms of the human visual system. These key factors are constructed in conjunction with the visual factors used to evaluate perceptual quality and visual attention.
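Quantifying compression and transmission distortion of the kind listed above starts from full-reference fidelity metrics. The sketch below computes PSNR between a reference and a distorted frame; it is a generic baseline metric, not one defined by P3333.1.2, and the noise model in the usage example is a stand-in for real compression or bit-error distortion.

```python
import numpy as np


def psnr(reference, distorted, peak=255.0):
    """Peak signal-to-noise ratio: a baseline full-reference quality metric.

    Higher is better; identical images give infinity.
    """
    diff = reference.astype(np.float64) - distorted.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)


# Usage: a synthetic 8-bit frame with additive Gaussian noise standing in
# for compression/transmission distortion.
rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
noisy = np.clip(ref + rng.normal(0.0, 5.0, ref.shape), 0, 255).astype(np.uint8)
quality_db = psnr(ref, noisy)
```

Perceptual metrics of the kind the standard targets go beyond such pixel-wise fidelity by weighting errors with human-visual-system models (e.g., visual attention), but PSNR remains a common reference point in benchmarking.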
IEEE 3333.1.2-2017 - IEEE Standard for the Perceptual Quality Assessment of Three-Dimensional (3D) and Ultra-High-Definition (UHD) Contents
The world is witnessing rapid advances in stereoscopic 3D (S3D) and ultra-high-definition (UHD) technology. As a result, the need for accurate quality and visual-comfort assessment techniques has grown, both to foster the display-device industry and to advance the signal-processing field. In this standard, thorough assessments of S3D and UHD contents with respect to the human visual system (HVS) are presented. Moreover, several image and video databases are publicly provided for any research purpose.