Thursday, November 11, 2010

Milestone Update: Let There Be Post Processing

Cool! So I have a lot of updates from just a couple of days ago...

Light Pre-Pass
All the lighting issues I was having before are resolved! That weird thing with the normals was due to recalculating the normal incorrectly.

So, what I was doing was saying Z was -(1 - abs(X) - abs(Y)), since I know that the magnitude of the normal should be 1. Given that all normals should face the camera and that DirectX is a left-handed system, you may safely assume that all Z components are negative. This is almost correct, except I was being silly and needed to use the Pythagorean theorem. Therefore, Z should be -sqrt(1 - X*X - Y*Y).

Further, I was packing the X and Y of the normal incorrectly in the GBuffer: if they were negative, they would be clamped to 0. I finally realized I needed to pack them as (X+1)/2 and (Y+1)/2 and unpack them as X*2 - 1 and Y*2 - 1. This finally lets me recalculate Z reliably and frees up the Z component of the GBuffer to be combined with the W component. That means I can pack the depth across two channels, giving me 16-bit depth in my lighting calculation!
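To double-check the round trip, here's the same packing math as a CPU-side sketch in Python. The helper names are my own, and the two-channel depth split shown is one common 16-bit fixed-point scheme, not necessarily the exact shader code:

```python
# Sketch of the GBuffer packing math, written as plain Python for clarity.
# The normal formulas match the post: (X+1)/2 to pack, X*2 - 1 to unpack.

def pack_normal_component(n):
    """Pack a normal component from [-1, 1] into a [0, 1] channel."""
    return (n + 1.0) / 2.0

def unpack_normal_component(p):
    """Unpack a [0, 1] channel back into a [-1, 1] normal component."""
    return p * 2.0 - 1.0

def pack_depth_16(depth):
    """Split a [0, 1) depth into two 8-bit channels (hi, lo), each stored as [0, 1]."""
    d = int(depth * 65535.0)
    hi = (d >> 8) / 255.0
    lo = (d & 0xFF) / 255.0
    return hi, lo

def unpack_depth_16(hi, lo):
    """Recombine the two 8-bit channels into a single 16-bit depth value."""
    d = (int(round(hi * 255.0)) << 8) | int(round(lo * 255.0))
    return d / 65535.0
```

Round-tripping a depth through the two channels stays within one 16-bit step (1/65535) of the original, which is the whole point of spending the second channel.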

//Normal Z recalculation:

//XY of the normal are stored as [0, 1]; convert back to [-1, 1]
float3 lNormal;
lNormal.xy = 2.f*lGBufferSample.xy - float2(1,1);

//Z is guaranteed to be negative and must make the magnitude of the normal 1.
//saturate() guards against taking sqrt of a slightly negative value from precision error.
lNormal.z = -sqrt(saturate(1.f - lNormal.x*lNormal.x - lNormal.y*lNormal.y));
NOTE: I'm using ARGB8 for all my render targets for maximum compatibility with target machines.

The GBuffer, where only the XY of the normal is stored and ZW contains the packed depth data.

Radius-Based Attenuation
I really wanted to be able to have radius-based point lights; however, given the standard attenuation model of 1/(C + L*d + Q*d*d), where C, L, and Q are the constant, linear, and quadratic coefficients, reasonably lighting a scene would require light hulls with massive radii. This completely defeats the purpose of light hulls: reducing the number of pixels being lit, since lighting is a fillrate-intensive operation.

I realized what I really wanted was to be able to manually describe what the falloff looked like. What I did in the interim is use an Ease Function: a function that takes normalized input and gives normalized output, used to "ease" something "in" and "out" of a state (1 and 0, respectively). I defined my Ease Function as a simple quadratic and feed it the distance of a given fragment, normalized with respect to the light hull's radius. The output is my attenuation. As simple as that, and with no more computation than the typical attenuation model.
Exaggerated quadratic falloff of the Ease Function.
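A minimal sketch of the idea in Python (function names are mine, and this is just one plausible quadratic ease, not necessarily the exact one in the tool):

```python
def ease_quadratic(t):
    """Quadratic ease: 1 at t = 0, falling to 0 at t = 1. Input is clamped."""
    t = min(max(t, 0.0), 1.0)
    return (1.0 - t) ** 2

def attenuation(distance, radius):
    """Normalize the fragment's distance by the light hull's radius, then ease."""
    return ease_quadratic(distance / radius)
```

A fragment at the light's center gets full intensity, one at the hull's edge gets exactly zero, so the falloff always fits the hull no matter its radius.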
This opens up a whole new world of lighting models for me. I'm going to give my tool a control that defines the Ease Function as a quadratic Bezier curve. This would allow an artist or designer to describe exactly the falloff look they want.

Here's an example of what I plan for the control:
The normalized distance is passed in on the "X" axis and results in an attenuation on the "Y" axis.
If the user really does want a more realistic attenuation model, the standard 1/r^2 can even be approximated by pulling the control point into a 1/r^2-looking curve. Splines are super quick to evaluate, as much of the math can be precomputed beforehand; it'll boil down to a few multiplications and some additions when I'm done. The only difficult part will be turning a distance into a t-value, as Bezier splines are defined in parametric form. I'll probably have to make a few assumptions and shortcuts about negatives, but I think it'll yield very intuitive and useful results for content developers.
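Here's a sketch of how that control could evaluate (Python, names mine). The curve's endpoints are pinned at (0, 1) and (1, 0), the middle control point is the artist's handle, and the Y polynomial is pre-expanded into coefficients so evaluation is just two multiply-adds. One possible shortcut: use the normalized distance directly as the parameter t, which is exact only when x(t) happens to be linear.

```python
def bezier_attenuation(control, t):
    """Quadratic Bezier falloff with endpoints pinned at (0, 1) and (1, 0).

    'control' is the artist's middle control point (cx, cy). Since the
    normalized distance is used directly as the parameter t, only cy affects
    the result; cx would matter if we solved x(t) = distance exactly.
    """
    cy = control[1]
    # Precomputable polynomial form: y(t) = a + b*t + c*t*t,
    # expanded from (1-t)^2*p0y + 2(1-t)t*cy + t^2*p2y with p0y=1, p2y=0.
    a = 1.0
    b = 2.0 * (cy - 1.0)
    c = 1.0 - 2.0 * cy
    t = min(max(t, 0.0), 1.0)
    return a + t * (b + t * c)
```

With the control point at (0.5, 0.5) the curve degenerates to the straight line 1 - t; pulling it toward (0, 0) bends it into the steep, 1/r^2-looking falloff described above.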

Further, even if spline evaluation becomes too heavy, the curve could easily be baked down to an Ease Function when the light info is saved out. Most lights won't change their attenuation, and for those that do, the overhead of dynamic attenuation will likely be acceptable.

Material Editing
Now that I've completed my post processing framework, I can finally start on my material editor. This will allow content developers to specify normal maps, specular maps, BRDF maps and the coefficients that control them.

Here's a quick run-through of my (limited) stages. The model in these images is the Alchemist from the Torchlight series; the Torchlight assets were published for modders by Runic Games. Thanks to the roughly 210 MB of assets, I've found many edge cases in my model converter, which makes the tool much more robust and as useful as possible to the artists and designers who would use it.
The GBuffer with 2-channel packed depth. The Alchemist didn't have a normal map, so I'm just applying a cool junk normal map to him.

Lighting data constructed from the GBuffer.

Albedo combined with the light information.

A simple Bloom Lighting effect applied.

After AntTweakBar and debug UI are applied.
