I have a question regarding the color space conversion of the HDR function. #2582
xujunjie12345 started this conversation in General
Answer:
By specification, H.264, HEVC, and AV1 use the standard sRGB gamma curve (which is clamped near black). The conversion from sRGB back to linear RGB is done manually because MediaCodec does not perform this conversion, for whatever reason. The Apple Vision Pro, by contrast, can use the decoded sRGB images without any extra conversion.
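For reference, here is a minimal Rust sketch of the piecewise sRGB transfer functions being discussed. The function names are illustrative, not ALVR's actual identifiers; the real conversions live in the shaders (FrameRender.fx on the server, stream.wgsl on the client):

```rust
// Illustrative piecewise sRGB transfer functions (IEC 61966-2-1).
// Names are hypothetical; the real conversions run in shader code.

/// sRGB-encoded value -> linear value (the client-side direction,
/// needed because MediaCodec returns sRGB-encoded pixels as-is).
fn srgb_to_linear(c: f32) -> f32 {
    if c <= 0.04045 {
        c / 12.92 // linear segment near black
    } else {
        ((c + 0.055) / 1.055).powf(2.4)
    }
}

/// Linear value -> sRGB-encoded value (the server-side direction,
/// applied before the frame is handed to the encoder).
fn linear_to_srgb(c: f32) -> f32 {
    if c <= 0.0031308 {
        c * 12.92
    } else {
        1.055 * c.powf(1.0 / 2.4) - 0.055
    }
}
```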
Original question:
In the latest 21-dev version, with the HDR function enabled, linear-space RGBA16F is used on both the server side and the client side.
On the server side, the linear values are explicitly converted to sRGB values for storage, in FrameRender.fx.
On the client side, the sRGB values are converted back to linear values, in stream.wgsl.
Are these two conversions redundant, and do they waste performance?
I have tested disabling both conversions, and there seems to be no visible change with HDR turned on.
Is it because the precision loss of sRGB-encoded color values is smaller during GPU encoding?
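On the precision question: the likely reason the conversions matter is that the video codec quantizes each channel to 8 or 10 bits between them, and quantizing sRGB-encoded values loses far less shadow detail than quantizing linear values. A minimal sketch, assuming an 8-bit quantizer as a stand-in for the codec and reusing the hypothetical `srgb_to_linear`/`linear_to_srgb` helpers sketched above:

```rust
// Round a channel to the nearest 8-bit code, as a stand-in for the
// quantization the video codec applies between the two conversions.
fn quantize_8bit(v: f32) -> f32 {
    (v * 255.0).round() / 255.0
}

fn main() {
    let dark_linear = 0.002_f32; // a deep-shadow value in linear light

    // Path A: no conversions; the codec quantizes the linear value.
    let direct = quantize_8bit(dark_linear);

    // Path B: what the server/client shader pair effectively does:
    // encode to sRGB, quantize, decode back to linear.
    let via_srgb = srgb_to_linear(quantize_8bit(linear_to_srgb(dark_linear)));

    println!("original:       {dark_linear}"); // 0.002
    println!("linear 8-bit:   {direct}");      // 1/255 ~= 0.0039, ~2x too bright
    println!("via sRGB 8-bit: {via_srgb}");    // ~0.0021, off by only ~6%
}
```

So skipping both conversions may look identical in bright scenes, but it would effectively throw away shadow precision once the frame is squeezed through the codec's integer format: the render targets are RGBA16F, but the encoded stream is not.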