
rendering – Does it make sense to use a compute shader with Dispatch(1,1,1) and Numthreads[1,1,1] to draw a cone?


There may occasionally be reasons to use a compute shader with Dispatch(1, 1, 1) and Numthreads[1, 1, 1], but drawing a cone is not the use case for that.

By limiting your compute shader to a single thread, you're giving up the parallel computation advantages you could be getting from the GPU. We could potentially draw every pixel of the cone in parallel, rather than looping over the cone one step at a time serially. This parallelism minimizes the total latency from when we start drawing the cone to when we're finished, and scales much better, using the hardware's many cores to handle whatever number of pixels needs to be drawn.
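For contrast, here is roughly what a parallel compute-shader version would look like, dispatching one thread per pixel instead of a single thread doing all the work. This is only a sketch – the output texture, kernel name, and 8×8 group size are my own assumptions, not something from the question:

RWTexture2D<float4> _Output; // hypothetical output texture

// One thread per pixel, 8x8 threads per group,
// dispatched with Dispatch(ceil(width/8), ceil(height/8), 1).
[numthreads(8, 8, 1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    // Each thread handles exactly one pixel; there is no serial loop
    // stepping along the cone.
    float4 color = float4(0, 0, 0, 0);
    // ... per-pixel cone coverage test would go here ...
    _Output[id.xy] = color;
}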

Myself, I'd skip using a compute shader entirely. You can achieve this effect with a plain vanilla vertex-fragment shader and no loops at all (and the cone is fully solid, without the moiré stripes of skipped pixels seen in the question):

We'll start by building a mesh that represents a bounding volume for your cone – something that will cover every fragment of the screen you want your cone to occupy, though it's OK if it covers a bit more. We'll draw this bounding mesh with a shader that draws the cone contained inside, and clips out any excess area outside.

For my example, I'm using a standard 1-unit-wide cube mesh, with vertices at (±0.5, ±0.5, ±0.5). To be more efficient, your bounds could be a low-poly cone, so you don't need to clip as much empty space.

We'll render this mesh flipped inside-out, so we're drawing its far faces instead of the near ones. This means that even when the camera moves up to/inside the bounding volume, it will still see the far face, so our shader will still run there (rather than seeing a hole in our cone where the bounding volume clips the near plane).
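In Unity's ShaderLab, this inside-out rendering is just a culling state on the pass. A minimal sketch, assuming the built-in pipeline style that matches the CG-flavoured code below:

Pass
{
    // Draw the back faces of the bounding mesh instead of the front faces,
    // so the shader still runs when the camera is inside the bounds.
    Cull Front

    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    // ... vertex and fragment programs shown below ...
    ENDCG
}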

In the vertex shader, we'll save the position of "this" vertex and the eye vector (pointing from the vertex to the camera), both in object-local space. These will be interpolated and passed to the fragment shader, to become the initial position and direction of a ray we'll "trace" through the bounding volume, to find where it intersects the cone.

In my case, I'll shift the z coordinate in the shader so it runs 0…1, putting the point of the cone (z=0) at one face of the bounding cube.

struct v2f {
    float4 vertex : SV_POSITION; // clip-space position
    float3 pos : TEXCOORD0;      // object-local position of this vertex
    float depth: TEXCOORD1;      // eye depth of this vertex
    float3 dir : TEXCOORD2;      // object-local vector from vertex to camera
};

v2f vert (appdata v) {
    v2f o;

    // Project vertex to clip space, as usual.
    o.vertex = mul(MVP, v.vertex);

    // Save depth separately to use in the fragment shader.
    o.depth = o.vertex.w;
    
    // Save native place of vertex to interpolate.
    o.pos = v.vertex.xyz;
    o.pos.z += 0.5f; // HACK: remapping the cube to 0...1 on the Z
    // (Because I was too lazy to make a custom model)

    // Get vector from vertex to camera, also in local space.
    o.dir = mul(WorldToObject, float4(CameraPos, 1)).xyz - v.vertex.xyz;

    return o;
}
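(The appdata input struct isn't shown here; for a plain cube mesh it only needs the object-space position, so an assumed definition would be something like:)

struct appdata {
    float4 vertex : POSITION; // object-space vertex position
};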

Then in the fragment shader, we can analytically compute where this ray intersects the cone (without raymarching one step at a time), and return the "outside" face that should be visible to the camera.
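For reference, here is where the quadratic coefficients in the code come from. In local space (after the z remap) the cone has its apex at z = 0 and radius z/2, so it reaches radius 0.5 at z = 1, matching the unit cube. Substituting the ray $P(t) = p + t\,d$ into the surface equation $x^2 + y^2 - (z/2)^2 = 0$ and collecting powers of $t$ gives a quadratic $a t^2 + b t + c = 0$ with

$$
a = d_{xy}\cdot d_{xy} - \tfrac{1}{4}d_z^2,\qquad
b = 2\,p_{xy}\cdot d_{xy} - \tfrac{1}{2}p_z d_z,\qquad
c = p_{xy}\cdot p_{xy} - \tfrac{1}{4}p_z^2,
$$

which are exactly the a, b and c computed at the top of frag below; the two roots are the two points where the ray crosses that (double-)cone surface.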

struct fragout {
    fixed4 color : SV_Target;
    float depth : SV_Depth;
};

float eye_to_nonlinear_depth(float depth) {
    // Using the Unity convention:
    // _ZBufferParams.z = (1 - far/near) / far
    //                    (or -1 times that if using a reversed Z buffer)
    // _ZBufferParams.w = 1 / near
    //                    (or 1/far if using a reversed Z buffer)

    return (1.0 - (depth * _ZBufferParams.w)) 
           / (depth * _ZBufferParams.z);
}

fragout frag (v2f i) {
    fragout o;

    // Coefficients for the quadratic formula:
    float a = dot(i.dir.xy, i.dir.xy) - 0.25f * i.dir.z * i.dir.z;
    float b = 2 * dot(i.pos.xy, i.dir.xy) - 0.5f * i.pos.z * i.dir.z;
    float c = dot(i.pos.xy, i.pos.xy) - 0.25f * i.pos.z * i.pos.z;

    // Discriminant - if less than zero, we missed the cone entirely.
    float disc = b*b - 4 * a * c;
    clip(disc);
    float step = sqrt(disc);

    // Parameter value t where we cross the cone.
    float t = (-b + step)/(2*a);

    // Flip which side we're seeing if outside the bounds.
    if (t < 0 || i.pos.z + i.dir.z * t > 1) {
        t -= step/a;
    }
    
    // Reconstruct the local position on the cone surface.
    float3 intersect = i.pos + t * i.dir;

    // If it's outside of bounds, clip it out.
    clip(intersect.z * (1-intersect.z));

    // For debug purposes, display the local intersection point as a color.
    o.color = fixed4(intersect, 1.0f);
    o.color.rg *= 2.0f;

    // t = 0 means we're at the original depth interpolated from the vertices.
    // t = 1 means we've travelled the full eye vector and hit the camera.
    // So we can simply scale the original depth by 1-t and remap it:
    o.depth = eye_to_nonlinear_depth(i.depth * (1.0f - t));

    return o;
}

Because we're using the object's (inverse) model matrix, we can rotate and scale the cone just by orienting and scaling the model:

Oriented cone

And because we calculate the eye depth of the intersection, it Z-sorts correctly with other geometry:

Cone intersecting with cube

This version draws the cone as if it's hollow, but if you want to cap it off, or draw it translucent so you can see both faces, or even vary its opacity by how much of the cone's solid volume the view ray cuts through, any of those are possible too with a few tweaks to the math.
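As one example of such a tweak (my own sketch, not part of the original answer), capping the cone amounts to also intersecting the view ray with the base plane z = 1 and checking whether that hit lies inside the base circle of radius 0.5:

// Sketch of a cap test that could be folded into frag() above.
// Intersect the view ray with the base plane z = 1...
float tCap = (1.0f - i.pos.z) / i.dir.z;          // assumes i.dir.z != 0
float3 capPos = i.pos + tCap * i.dir;
// ...and check whether that point lies within the base circle (radius 0.5).
bool hitCap = dot(capPos.xy, capPos.xy) <= 0.25f;
// If it does, compare this hit with the side-surface hit and keep whichever
// surface the camera actually sees, writing its color and depth instead.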

Once you know what part of the cone your view ray is hitting, you can shade it any way you like – you're not limited to the debug colors I've used here.
