Working Group Details
HFVE_WG - Human Factors for Visual Experiences Working Group
P3333.1.3 - Standard for the Deep Learning-Based Assessment of Visual Experience Based on Human Factors
This standard defines deep learning-based metrics for content analysis and quality of experience (QoE) assessment of visual contents. It extends IEEE Std 3333.1.1 (Standard for the Quality of Experience (QoE) and Visual-Comfort Assessments of Three-Dimensional (3D) Contents Based on Psychophysical Studies) and IEEE Std 3333.1.2 (Standard for the Perceptual Quality Assessment of Three-Dimensional (3D) and Ultra High Definition (UHD) Contents). The scope covers the following:
* Deep learning models for QoE assessment (multilayer perceptrons, convolutional neural networks, deep generative models)
* Deep metrics of visual experience from High Definition (HD), UHD, 3D, High Dynamic Range (HDR), Virtual Reality (VR), and Mixed Reality (MR) contents
* Deep analysis of clinical (electroencephalogram (EEG), electrocardiogram (ECG), electrooculography (EOG), and so on) and psychophysical (subjective tests and simulator sickness questionnaire (SSQ)) data for QoE assessment
* Deep personalized preference assessment of visual contents
* Building image and video databases for performance benchmarking purposes, if necessary
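To make the first scope item concrete, the following is a minimal illustrative sketch (not part of the standard) of a multilayer perceptron that maps a vector of content features to a scalar QoE score. The feature names, layer sizes, and sigmoid output range are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class QoEMLP:
    """Illustrative two-layer perceptron: content features -> hidden ReLU
    layer -> sigmoid QoE score in (0, 1). Weights are randomly initialized;
    a real model would be trained on subjective-test data."""

    def __init__(self, n_features, n_hidden=8):
        self.W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
        self.b2 = np.zeros(1)

    def predict(self, x):
        h = np.maximum(0.0, x @ self.W1 + self.b1)  # ReLU hidden layer
        return sigmoid(h @ self.W2 + self.b2)       # score squashed to (0, 1)

# Hypothetical feature vector, e.g. blur, blockiness, disparity statistics.
model = QoEMLP(n_features=4)
score = model.predict(np.array([0.2, 0.5, 0.1, 0.8]))
```

In practice the standard's scope also covers convolutional and generative architectures that operate on image or video inputs directly rather than on hand-crafted features.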
P3333.1.1 - Standard for Quality of Experience (QoE) and Visual-Comfort Assessments of Three-Dimensional (3D) Contents Based on Psychophysical Studies
This standard establishes various traditional and deep learning-based methods for visual saliency prediction, visual content analysis, and subjective assessment for quantifying the visual discomfort and quality of experience (QoE) of 3D images and videos.
P3333.1.2 - Standard for the Perceptual Quality Assessment of Three Dimensional (3D), Ultra High Definition (UHD) and High Dynamic Range (HDR) Contents
This standard establishes methods for quality assessment of 3D, UHD, and HDR contents based on physiological mechanisms such as perceptual quality and visual attention. This standard identifies and quantifies the following:
-- Causes of perceptual quality degradation, and the resulting visual attention, for 3D, UHD, and HDR image and video contents:
-- Compression distortion, such as multi-view image and video compression
-- Interpolation distortion from intermediate-view rendering, such as 3D, UHD, and HDR warping and view synthesis
-- Structural distortion, such as bit errors from wireless/wired transmission
-- Visual attention according to the quality degradation
-- Deep learning-based models for saliency detection and QoE assessment
Key items are needed to characterize the 3D, UHD, and HDR databases in terms of the human visual system. These key factors are constructed in conjunction with the visual factors that govern perceptual quality and visual attention.
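One way attention can enter a quality metric, sketched below as an assumption rather than the standard's actual method, is to weight per-pixel errors by a normalized saliency map so that degradation in attended regions contributes more to the score. The function name and the PSNR formulation are illustrative choices.

```python
import numpy as np

def saliency_weighted_psnr(ref, dist, saliency, peak=255.0):
    """Illustrative PSNR variant: per-pixel squared errors are weighted by a
    saliency map normalized to sum to 1, so errors in attended regions
    dominate the weighted MSE."""
    w = saliency / saliency.sum()          # normalize attention weights
    wmse = np.sum(w * (ref - dist) ** 2)   # saliency-weighted MSE
    return 10.0 * np.log10(peak ** 2 / wmse)

# Toy example: a flat 4x4 reference, a uniform distortion of 2 gray levels,
# and a uniform (uninformative) saliency map.
ref = np.full((4, 4), 128.0)
dist = ref + 2.0
sal = np.ones((4, 4))
psnr = saliency_weighted_psnr(ref, dist, sal)
```

With a uniform saliency map the result reduces to ordinary PSNR; a saliency map concentrated on a distorted region would lower the score for the same mean error.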
3333.1.1-2015 - IEEE Standard for Quality of Experience (QoE) and Visual-Comfort Assessments of Three-Dimensional (3D) Contents Based on Psychophysical Studies
As the demand for and supply of 3D technologies grow, accurate quality-assessment techniques are needed to develop the 3D display device and signal-processing engine industries. This standard describes the underlying principles and statistical characteristics of 3D contents based on the human visual system (HVS). In addition, it introduces a reliable 3D subjective assessment methodology that covers the characteristics of human perception, display mechanisms, and the viewing environment.