I am new to DX12 and I am currently trying to get the depth buffer to work. I have done pretty much everything identically to the blog posts I've seen online. I can also see that the depth buffer texture is there in RenderDoc; it's just completely white.
The depth target is also in the D3D12_RESOURCE_STATE_DEPTH_WRITE state. I've configured my pipeline so it supports depth testing, matched the formats, and so on.
Pipeline setup:
psoDesc.RasterizerState = CD3DX12_RASTERIZER_DESC(D3D12_DEFAULT);
psoDesc.RasterizerState.CullMode = D3D12_CULL_MODE_NONE;
psoDesc.BlendState = CD3DX12_BLEND_DESC(D3D12_DEFAULT);
psoDesc.DepthStencilState = CD3DX12_DEPTH_STENCIL_DESC(D3D12_DEFAULT);
psoDesc.DSVFormat = DXGI_FORMAT_D32_FLOAT;
psoDesc.SampleMask = UINT_MAX;
psoDesc.PrimitiveTopologyType = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
psoDesc.NumRenderTargets = 1;
psoDesc.RTVFormats[0] = DXGI_FORMAT_R8G8B8A8_UNORM;
psoDesc.SampleDesc.Count = 1;
ThrowIfFailed(_device->CreateGraphicsPipelineState(&psoDesc, IID_PPV_ARGS(&_pipelineState)));
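For reference, CD3DX12_DEPTH_STENCIL_DESC(D3D12_DEFAULT) should expand to roughly the following explicit settings (this is my understanding of the D3D12 defaults, not values I've overridden), so depth testing ought to be enabled with a LESS comparison:
// Roughly the D3D12 default depth-stencil state (as I understand it):
D3D12_DEPTH_STENCIL_DESC defaultDS = {};
defaultDS.DepthEnable = TRUE;                           // depth test on
defaultDS.DepthWriteMask = D3D12_DEPTH_WRITE_MASK_ALL;  // depth writes on
defaultDS.DepthFunc = D3D12_COMPARISON_FUNC_LESS;       // pass if closer than the 1.0f clear
defaultDS.StencilEnable = FALSE;                        // stencil off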
Depth buffer resource:
// Create the DSV Heap
D3D12_DESCRIPTOR_HEAP_DESC dsvHeapDesc = {};
dsvHeapDesc.NumDescriptors = 1;
dsvHeapDesc.Type = D3D12_DESCRIPTOR_HEAP_TYPE_DSV;
dsvHeapDesc.Flags = D3D12_DESCRIPTOR_HEAP_FLAG_NONE;
ThrowIfFailed(_device->CreateDescriptorHeap(&dsvHeapDesc, IID_PPV_ARGS(&_dsvHeap)));
// Heap properties for creating the texture (GPU read/write)
D3D12_HEAP_PROPERTIES heapProps = {};
heapProps.Type = D3D12_HEAP_TYPE_DEFAULT;
heapProps.CPUPageProperty = D3D12_CPU_PAGE_PROPERTY_UNKNOWN;
heapProps.MemoryPoolPreference = D3D12_MEMORY_POOL_UNKNOWN;
heapProps.CreationNodeMask = 1;
heapProps.VisibleNodeMask = 1;
// Create Depth-Stencil Resource (Texture2D)
D3D12_RESOURCE_DESC depthResourceDesc = {};
depthResourceDesc.Dimension = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
depthResourceDesc.Alignment = 0;
depthResourceDesc.Width = _width; // Width of the texture
depthResourceDesc.Height = _height; // Height of the texture
depthResourceDesc.DepthOrArraySize = 1;
depthResourceDesc.MipLevels = 1;
depthResourceDesc.Format = DXGI_FORMAT_D32_FLOAT; // Depth format (could also be D24_UNORM_S8_UINT)
depthResourceDesc.SampleDesc.Count = 1;
depthResourceDesc.SampleDesc.Quality = 0;
depthResourceDesc.Layout = D3D12_TEXTURE_LAYOUT_UNKNOWN;
depthResourceDesc.Flags |= D3D12_RESOURCE_FLAG_ALLOW_DEPTH_STENCIL;
D3D12_CLEAR_VALUE depthOptimizedClearValue = {};
depthOptimizedClearValue.Format = DXGI_FORMAT_D32_FLOAT;
depthOptimizedClearValue.DepthStencil.Depth = 1.0f; // Default clear depth
depthOptimizedClearValue.DepthStencil.Stencil = 0; // Default clear stencil
ThrowIfFailed(_device->CreateCommittedResource(
&heapProps,
D3D12_HEAP_FLAG_NONE,
&depthResourceDesc,
D3D12_RESOURCE_STATE_DEPTH_WRITE,
&depthOptimizedClearValue,
IID_PPV_ARGS(&_depthStencilBuffer)
));
D3D12_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
dsvDesc.Format = DXGI_FORMAT_D32_FLOAT;
dsvDesc.ViewDimension = D3D12_DSV_DIMENSION_TEXTURE2D;
dsvDesc.Flags = D3D12_DSV_FLAG_NONE;
// Create the DSV for the depth-stencil buffer
_device->CreateDepthStencilView(_depthStencilBuffer.Get(), &dsvDesc, _dsvHeap->GetCPUDescriptorHandleForHeapStart());
And lastly binding:
D3D12_CPU_DESCRIPTOR_HANDLE rtvHandle = _rtvHeap->GetCPUDescriptorHandleForHeapStart();
rtvHandle.ptr += (_frameIndex * _rtvDescriptorSize);
D3D12_CPU_DESCRIPTOR_HANDLE dsvHandle = _dsvHeap->GetCPUDescriptorHandleForHeapStart();
_commandList->OMSetRenderTargets(1, &rtvHandle, FALSE, &dsvHandle);
_commandList->ClearDepthStencilView(dsvHandle, D3D12_CLEAR_FLAG_DEPTH, 1.0f, 0, 0, nullptr);
I don't understand why it's not working — help would be very much appreciated. I don't want to spam this thread with code (since it's quite a lot of boilerplate) so if you want to look at the complete code you can check it out here: https://github.com/eliasfuericht/artisDX/blob/main/src/Application.cpp
- If you press the magic wand icon in the range part of the toolbar, does that show something other than white? Depth buffers often have a weird range of values that don't show up well as colours with the default 0-1 range. – Adam, Feb 1 at 17:34
- Okay, if I press that the range changes from 1.0-1.0 and the texture is completely black. – eli3D, Feb 1 at 19:25
2 Answers
I finally found the solution to my problem in this ComputerGraphics.SE question. The MinDepth and MaxDepth of my viewport were wrong; they have to be:
_viewport.MinDepth = 0.0f;
_viewport.MaxDepth = 1.0f;
Or simply use these default settings:
_viewport = CD3DX12_VIEWPORT{ 0.0f, 0.0f, static_cast<float>(width), static_cast<float>(height) };
Previously, I had set them to 0.1f and 1000.0f because I had been following this tutorial. Please, be careful!
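The near/far values (0.1 and 1000 in my case) belong in the projection matrix, not in the viewport; the viewport's MinDepth/MaxDepth only remap post-projection z into the depth buffer and should normally stay at 0 and 1. A minimal sketch with DirectXMath (variable names are just for illustration):
// using namespace DirectX; from <DirectXMath.h>, CD3DX12_VIEWPORT from d3dx12.h
// Near/far go into the (left-handed, 0..1 depth) projection matrix:
XMMATRIX proj = XMMatrixPerspectiveFovLH(XM_PIDIV4, aspectRatio, 0.1f, 1000.0f);
// The viewport only maps NDC z into the depth-buffer range:
_viewport = CD3DX12_VIEWPORT{ 0.0f, 0.0f, static_cast<float>(width), static_cast<float>(height), 0.0f, 1.0f };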
Here are a few things you could do to debug this:
- Make sure the debug layer is enabled (see the sketch after this list), and fix any problems that it reports: https://learn.microsoft.com/en-us/windows/win32/direct3d12/using-d3d12-debug-layer-gpu-based-validation
- In the Overlay drop-down in RenderDoc, you can show the result of the depth test. That will show pass in green and fail in red. You may find some of the other overlays interesting too, e.g. to double-check which pixels your geometry covers.
- In RenderDoc, get the pixel history for any of the pixels where it is rendering stuff. The pixel history window will show, for each triangle in each draw call, what the outputs were, or why it failed to output values (e.g. because it failed the depth test, or was back-facing).
- Look at the pipeline state tab in RenderDoc to see what depth operations you're doing, and make sure they are sensible. For example, since you're clearing to 1.0f, your comparison function should probably be D3D12_COMPARISON_FUNC_LESS_EQUAL or D3D12_COMPARISON_FUNC_LESS.
- Double-check your transform matrix and vertex shader: maybe you're not getting appropriate z or w values out of the vertex shader?
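For the first point, a minimal sketch of enabling the debug layer (and, optionally, GPU-based validation) before device creation; the structure is illustrative, not taken from your code:
// Enable the D3D12 debug layer (must happen before creating the device):
ComPtr<ID3D12Debug> debugController;
if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(&debugController))))
{
    debugController->EnableDebugLayer();
    // Optionally turn on GPU-based validation for more thorough checks:
    ComPtr<ID3D12Debug1> debugController1;
    if (SUCCEEDED(debugController.As(&debugController1)))
        debugController1->SetEnableGPUBasedValidation(TRUE);
}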
- Thanks for the suggestions. I looked at them in RenderDoc and found that all pixels failed the depth test, reverting them from the colour they should be to just black. I am pretty sure the values from the vertex shader are correct, since if I render without the depth test the models look correct, just with planes weirdly intersecting, obviously. I wish I could post some screenshots for more context... – eli3D, Feb 1 at 19:23
- One thing to add: I just looked at another pixel from the depth test and its D value was 9999.58, so I imagine that is where the problem stems from. Do you have any idea why it's that high? If I render without the depth test there is no clipping or anything. – eli3D, Feb 1 at 19:33
- The values that get written into the depth buffer should normally be in the 0-1 range. It sounds like your vertex shader or transform might be incorrect. – Adam, Feb 1 at 21:55
- From a quick look at your code, I can see that you're using glm for your matrix maths. D3D and OpenGL have some different conventions for how their Normalized Device Coordinates work. One of those differences is in the Z range for the viewport: D3D goes from 0 to 1 and OpenGL goes from -1 to 1. You may also need to transpose the matrices, which can be done by specifying them as row_major or column_major in the HLSL code. See stackoverflow.com/questions/13318431/… and stackoverflow.com/questions/57168772/… (a sketch of both points follows after these comments). – Adam, Feb 1 at 22:07
- Thanks for the resources. I've now checked all my matrices and changed from glm to DirectXMath. I still have the same issue; the D value is at 999... – eli3D, Feb 2 at 13:07
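A minimal sketch of the two points from the comment above; the GLM_FORCE_DEPTH_ZERO_TO_ONE define and the surrounding variable names are illustrative assumptions, not code from the linked repo:
// Assumption: glm 0.9.9 or newer; the define must appear before any glm include.
#define GLM_FORCE_DEPTH_ZERO_TO_ONE   // clip-space z in [0, 1], matching D3D
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
// Near/far live here, not in the viewport:
glm::mat4 proj = glm::perspective(glm::radians(45.0f), aspectRatio, 0.1f, 1000.0f);
// Depending on whether the HLSL side uses mul(v, M) or mul(M, v) and how the
// cbuffer matrix is declared (row_major / column_major), the matrix may need a
// glm::transpose before being copied into the constant buffer.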