Writing texture data to the depth buffer

Updated: 2023-11-07 23:50:34

The depth buffer is more obscured than you think in OpenGL ES; not only is glDrawPixels absent but gl_FragDepth has been removed from GLSL. So you can't write a custom fragment shader to spool values to the depth buffer as you might push colours.

The most obvious solution is to pack your depth information into a texture and to use a custom fragment shader that does a depth comparison between the fragment it generates and one looked up from a texture you supply. Only if the generated fragment is closer is it allowed to proceed. The normal depth buffer will catch other cases of occlusion and — in principle — you could use a framebuffer object to create the depth texture in the first place, giving you a complete on-GPU round trip, though it isn't directly relevant to your problem.

Disadvantages are that drawing will cost you an extra texture unit and textures use integer components.
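To illustrate the integer-component limitation, here is a hypothetical sketch (not part of the original answer): with a single 8-bit channel, depth is quantised to just 256 levels, so nearby surfaces can collapse into the same depth bucket and become indistinguishable to the comparison.

```python
# Sketch: packing depth into one 8-bit texture channel, to show how coarse
# the resulting depth buffer is. Depths are assumed normalised to [0, 1].

def pack_depth_8bit(depth):
    """Quantise a [0, 1] depth to a single byte, as the red-channel example does."""
    return round(depth * 255)

def unpack_depth_8bit(value):
    return value / 255.0

# two surfaces separated in depth still land in the same bucket
near, far = 0.500, 0.503
print(pack_depth_8bit(near) == pack_depth_8bit(far))  # True: the comparison can't tell them apart
```

This is why the red-channel example below is described as "really low precision"; the EDIT2 link at the end shows how to spread the value across all four channels instead.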

EDIT: for the purposes of keeping the example simple, suppose you were packing all of your depth information into the red channel of a texture. That'd give you a really low precision depth buffer, but just to keep things clear, you could write a quick fragment shader like:

void main()
{
    // write a value to the depth map
    gl_FragColor = vec4(gl_FragCoord.w, 0.0, 0.0, 1.0);
}

To store depth in the red channel. So you've partially recreated the old depth texture extension — you'll have an image that has a brighter red in pixels that are closer, a darker red in pixels that are further away. I think that in your question, you'd actually load this image from disk.

To then use the texture in a future fragment shader, you'd do something like:

uniform sampler2D depthMap;

// hypothetical uniform: the viewport size in pixels, needed because
// gl_FragCoord.xy is in window coordinates, not [0, 1] texture space
uniform mediump vec2 viewportSize;

void main()
{
    // read a value from the depth map
    lowp vec4 colourFromDepthMap = texture2D(depthMap, gl_FragCoord.xy / viewportSize);

    // discard the current fragment if it is less close than the stored value
    if(colourFromDepthMap.r > gl_FragCoord.w) discard;

    // ... set gl_FragColor appropriately otherwise ...
}

EDIT2: you can see a much smarter mapping from depth to an RGBA value here. To tie in directly to that document, OES_depth_texture definitely isn't supported on the iPad or on the third generation iPhone. I've not run a complete test elsewhere.
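For reference, one common variant of that depth-to-RGBA packing can be sketched as follows (an assumption-laden illustration, not the linked document's exact code; the constants differ between sources). The value is spread across the four channels as successively finer base-255 "digits", so the full precision survives 8-bit-per-channel storage:

```python
# Sketch: pack a depth value in [0, 1) across four 8-bit-friendly channels,
# and recover it. Mirrors the frac()/dot() trick commonly used in GLSL.

FACTORS = (1.0, 255.0, 255.0 ** 2, 255.0 ** 3)

def pack_depth_rgba(depth):
    """Return four channel values in roughly [0, 1), one per RGBA component."""
    enc = [(depth * f) % 1.0 for f in FACTORS]
    # subtract each finer digit's contribution from the coarser channel,
    # so the channels sum back to the original value on decode
    return [e - n / 255.0 for e, n in zip(enc, enc[1:])] + [enc[3]]

def unpack_depth_rgba(rgba):
    return sum(channel / f for channel, f in zip(rgba, FACTORS))

depth = 0.731234
assert abs(unpack_depth_rgba(pack_depth_rgba(depth)) - depth) < 1e-9
```

In a shader the pack would be written with `fract()` and a component-wise multiply, and the unpack as a single `dot()` against the reciprocal factors.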