Friday, 13 July 2012

Crepuscular (God) Rays and Web UI Sample

OK, this is two samples rolled into one. The first part covers the Crepuscular (God) Rays post-processing effect I have created in XNA, based on the GPU Gems 3 article Volumetric Light Scattering as a Post-Process. The second part covers the Web UI used in the sample. I initially intended to give a short talk on it at the September 2011 XNA-UK meeting, but we had a great talk from the guys over at IndieCity.com and so I never got around to it.
Crepuscular Rays Effect Overview
So, after reading the GPU Gems article I thought it should be easy to get the effect working in my post-processing framework (if you missed it, I posted the source for that a while ago; I have made a few changes in my latest version, but that framework should still fly). I also thought I could reuse my existing sun post process, which, again if you missed that, you can find on stg conker, and incorporate it into the effect. The steps used to create this effect are:
1. Render the sun to a render target.
2. Black out any occluded pixels in that image by testing them against the depth buffer (created when you rendered the scene).
3. Pass this masked image to the GPU Gems god ray pixel shader.
4. Use a bright pass pixel shader (taken from my Bloom shader) to brighten the rays.
5. Render this final texture and blend it back with the scene image.
All in all it’s a five-pass effect; it could be reduced by folding the occlusion test into the original sun/light source pass and dropping the bright pass. So, let’s get into the shaders. I won’t post the sample’s C# here as you have the March 2011 talk and code I gave to fall back on. One change you will notice is that I have moved away from using SpriteBatch to render the render targets, as that restricted the post-processing framework to shader model 2.
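To make the flow concrete, here is a minimal sketch (this is not the framework code from the sample; the render targets, effects, fullscreen quad and field names are all made up for illustration) of how those five passes might be chained with plain XNA 4.0 calls:

void DrawPass(Texture2D source, RenderTarget2D destination, Effect effect)
{
    GraphicsDevice.SetRenderTarget(destination);         // null = back buffer
    GraphicsDevice.Clear(Color.Transparent);
    effect.Parameters["halfPixel"].SetValue(halfPixel);  // every shader below takes this
    effect.CurrentTechnique.Passes[0].Apply();
    fullScreenQuad.Draw(source);                         // a quad rather than SpriteBatch, so SM3 shaders work
}

void DrawGodRays(Texture2D sceneTexture, Texture2D depthTexture)
{
    DrawPass(flareTexture,  lightSourceRT, lightSourceMask); // 1. render the sun/light source
    DrawPass(lightSourceRT, maskRT,        lightSceneMask);  // 2. black out occluded pixels (uses depthTexture)
    DrawPass(maskRT,        raysRT,        lightRays);       // 3. GPU Gems ray marching pass
    DrawPass(raysRT,        brightRT,      brightPass);      // 4. bright pass
    DrawPass(brightRT,      null,          sceneBlend);      // 5. blend the result back with sceneTexture
}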
LightSourceMask.fx (or the old Sun shader tidied up a bit)
#include "PPVertexShader.fxh"
float3 lightPosition;
float4x4 matVP;
float2 halfPixel;
float SunSize = 1500;
texture flare;
sampler Flare = sampler_state
{
    Texture = (flare);
    AddressU = CLAMP;
    AddressV = CLAMP;
};
float4 LightSourceMaskPS(float2 texCoord : TEXCOORD0 ) : COLOR0
{
    texCoord -= halfPixel;
    // Get the scene
    float4 col = 0;
    // Find the sun's position in the world and map it to screen space.
    float4 ScreenPosition = mul(lightPosition,matVP);
    float scale = ScreenPosition.z;
    ScreenPosition.xyz /= ScreenPosition.w;
    ScreenPosition.x = ScreenPosition.x/2.0f+0.5f;
    ScreenPosition.y = (-ScreenPosition.y/2.0f+0.5f);
    // Are we looking in the direction of the sun?
    if(ScreenPosition.w > 0)
    {      
        float2 coord;
        float size = SunSize / scale;
        float2 center = ScreenPosition.xy;
        coord = .5 - (texCoord - center) / size * .5;
        col += (pow(tex2D(Flare,coord),2) * 1) * 2;                      
    }
    return col;  
}
technique LightSourceMask
{
    pass p0
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 LightSourceMaskPS();
    }
}
You can see at the top there is a reference to PPVertexShader.fxh; this is just a header file containing the vertex shader, so I don’t have to repeat it in every effect that shares the same vertex shader.
So, as in the sun shader, we project the light source’s world position into screen space and render the flare texture there.
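From the game side this shader only needs the light’s world position, the combined view-projection matrix, the half-pixel offset and the flare texture. Something along these lines would feed it (a sketch only; camera, viewport and flareTexture are hypothetical names, the real plumbing lives in the post-processing framework):

// Hypothetical setup for LightSourceMask.fx; the parameter names match the shader.
lightSourceMask.Parameters["lightPosition"].SetValue(sunPosition);               // Vector3
lightSourceMask.Parameters["matVP"].SetValue(camera.View * camera.Projection);
lightSourceMask.Parameters["halfPixel"].SetValue(
    new Vector2(0.5f / viewport.Width, 0.5f / viewport.Height));
lightSourceMask.Parameters["SunSize"].SetValue(1500f);
lightSourceMask.Parameters["flare"].SetValue(flareTexture);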
So we end up with an image like this:


LightSceneMask.fx
#include "PPVertexShader.fxh"
float3 lightPosition;
float4x4 matVP;
float4x4 matInvVP;
float2 halfPixel;
sampler2D Scene: register(s0){
    AddressU = Mirror;
    AddressV = Mirror;
};
texture depthMap;
sampler2D DepthMap = sampler_state
{
    Texture = <depthMap>;
    MinFilter = Point;
    MagFilter = Point;
    MipFilter = None;
};
float4 LightSourceSceneMaskPS(float2 texCoord : TEXCOORD0) : COLOR0
{
    float depthVal = 1 - (tex2D(DepthMap, texCoord).r);
    float4 scene = tex2D(Scene,texCoord);
    float4 position;
    position.x = texCoord.x * 2.0f - 1.0f;
    position.y = -(texCoord.y * 2.0f - 1.0f);
    position.z = depthVal;
    position.w = 1.0f;
    // Pixel pos in the world
    float4 worldPos = mul(position, matInvVP);
    worldPos /= worldPos.w;
    // Find light pixel position
    float4 ScreenPosition = mul(lightPosition, matVP);
    ScreenPosition.xyz /= ScreenPosition.w;
    ScreenPosition.x = ScreenPosition.x/2.0f+0.5f;
    ScreenPosition.y = (-ScreenPosition.y/2.0f+0.5f);
    // If the pixel is in front of the light source, blank it out.
    if(depthVal < ScreenPosition.z - .00025)
        scene = 0;
    return scene;
}
technique LightSourceSceneMask
{
    pass p0
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 LightSourceSceneMaskPS();
    }
}
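A quick note on the depthMap this shader samples: it comes from the scene render itself, where each object writes to two render targets at once, a colour map and a depth map (this is also described in the comments at the end of the post). A rough sketch of that setup, with illustrative names and formats:

// Assumed multiple-render-target setup: COLOR0 receives the normal scene colour,
// COLOR1 receives the depth each object's pixel shader writes out.
RenderTarget2D sceneRT = new RenderTarget2D(GraphicsDevice, width, height, false,
    SurfaceFormat.Color,  DepthFormat.Depth24);
RenderTarget2D depthRT = new RenderTarget2D(GraphicsDevice, width, height, false,
    SurfaceFormat.Single, DepthFormat.Depth24);

GraphicsDevice.SetRenderTargets(sceneRT, depthRT);
GraphicsDevice.Clear(Color.White);   // assumed clear value; it depends on how your shaders encode depth
DrawScene();                         // every object outputs both colour and depth
GraphicsDevice.SetRenderTarget(null);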
In this shader we take the rendered light source image and black out the pixels that are occluded by objects in the scene, which gives us an image like this:


LightRays.fx
#include "PPVertexShader.fxh"
#define NUM_SAMPLES 128
float3 lightPosition;
float4x4 matVP;
float2 halfPixel;
float Density = .5f;
float Decay = .95f;
float Weight = 1.0f;
float Exposure = .15f;
sampler2D Scene: register(s0){
    AddressU = Clamp;
    AddressV = Clamp;
};
float4 lightRayPS( float2 texCoord : TEXCOORD0 ) : COLOR0
{
    // Find light pixel position
    float4 ScreenPosition = mul(lightPosition, matVP);
    ScreenPosition.xyz /= ScreenPosition.w;
    ScreenPosition.x = ScreenPosition.x/2.0f+0.5f;
    ScreenPosition.y = (-ScreenPosition.y/2.0f+0.5f);
    float2 TexCoord = texCoord - halfPixel;
    float2 DeltaTexCoord = (TexCoord - ScreenPosition.xy);
    DeltaTexCoord *= (1.0f / NUM_SAMPLES * Density);
    DeltaTexCoord = DeltaTexCoord * clamp(ScreenPosition.w * ScreenPosition.z,0,.5f);
    float3 col = tex2D(Scene,TexCoord);
    float IlluminationDecay = 1.0;
    float3 Sample;
    for( int i = 0; i < NUM_SAMPLES; ++i )
    {
        TexCoord -= DeltaTexCoord;
        Sample = tex2D(Scene, TexCoord);
        Sample *= IlluminationDecay * Weight;
        col += Sample;
        IlluminationDecay *= Decay;          
    }
    return float4(col * Exposure, 1);

    // Unreachable in the original shader; left here (commented out) as an alternative
    // that only shows the rays when the light source is in front of the camera,
    // scaled by its distance.
    //if(ScreenPosition.w > 0)
    //    return float4(col * Exposure, 1) * (ScreenPosition.w * .0025);
    //else
    //    return 0;
}
technique LightRayFX
{
    pass p0
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 lightRayPS();
    }
}
As you can see, this is pretty much the same shader as the one in the GPU Gems article, except that we calculate the on-screen light source position in the shader itself. This gives an image like this:


Pretty eh :D
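The look of the rays is driven almost entirely by the four constants at the top of the shader. In the sample they are exposed as properties on the GodRays post process (these are the values pushed to the Web UI later in this post); the defaults below simply mirror the values declared in LightRays.fx:

GodRays.Density  = 0.5f;   // how far towards the light the 128 samples march
GodRays.Decay    = 0.95f;  // per-sample falloff of the accumulated light
GodRays.Weight   = 1.0f;   // contribution of each individual sample
GodRays.Exposure = 0.15f;  // overall brightness of the result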
BrightPass.fx
#include "PPVertexShader.fxh"
uniform extern float BloomThreshold;
float2 halfPixel;
sampler TextureSampler : register(s0);
float4 BrightPassPS(float2 texCoord : TEXCOORD0) : COLOR0
{
    texCoord -= halfPixel;
    // Look up the original image color.
    float4 c = tex2D(TextureSampler, texCoord);
    // Adjust it to keep only values brighter than the specified threshold.
    return saturate((c - BloomThreshold) / (1 - BloomThreshold));
}
technique BloomExtract
{
    pass P0
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 BrightPassPS();
    }
}
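The only tunable value here is the threshold. In the sample it is exposed as GodRays.BrightThreshold (you can see it being pushed to the Web UI later in this post), which I assume feeds the shader's BloomThreshold constant, e.g.:

// Illustrative: feed the exposed threshold into the bright pass shader.
brightPass.Parameters["BloomThreshold"].SetValue(GodRays.BrightThreshold);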
This shader just takes the incoming image and, based on a threshold, keeps only the values brighter than it, boosting what remains, like this:


SceneBlend.fx
#include "PPVertexShader.fxh"
float2 halfPixel;
sampler2D Scene: register(s0){
    AddressU = Mirror;
    AddressV = Mirror;
};
texture OrgScene;
sampler2D orgScene = sampler_state
{
    Texture = <OrgScene>;
    AddressU = CLAMP;
    AddressV = CLAMP;
};
float4 BlendPS(float2 texCoord : TEXCOORD0 ) : COLOR0
{
    texCoord -= halfPixel;
    float4 col = tex2D(orgScene,texCoord) * tex2D(Scene,texCoord);
    return col;
}
float4 AditivePS(float2 texCoord : TEXCOORD0 ) : COLOR0
{
    texCoord -= halfPixel;
    float4 col = tex2D(orgScene,texCoord) + tex2D(Scene,texCoord);
    return col;
}
technique Blend
{
    pass p0
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 BlendPS();
    }
}
technique Aditive
{
    pass p0
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 AditivePS();
    }
}
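Driving that final composite from the game side might look like this (reusing the DrawPass helper sketched earlier; note the technique name matches the shader's spelling):

// Pick the additive composite, hand the original scene to the shader and
// draw the rays over it on the back buffer.
sceneBlend.CurrentTechnique = sceneBlend.Techniques["Aditive"];
sceneBlend.Parameters["OrgScene"].SetValue(sceneTexture);
DrawPass(brightRT, null, sceneBlend);   // null destination = back buffer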
And finally we blend this with the original scene; the Additive technique is used in this sample, giving a final image like this:


So there you have the god ray post process.
Web UI
OK, so now onto the UI. I am using a third-party library called Awesomium, and it is indeed awesome, well I think so. It is basically a web renderer: you give it a URL, it renders the page and spits out a texture, and we can then render that texture. Now, if it just did that it would not be much use; thankfully we can wire up callbacks to it and pass mouse and keyboard events to it, which means we can interact with the web page from our game. It also means you can create all your UIs in HTML using great stuff like jQuery and any other web tech you can pile into your game. This sample keeps all of its web UI local, but you could serve the entire game UI from your site.
I first came across this tool while working on ST:Excalibur, where we use it to drive the UI, and was really impressed with it, so I thought I would do a version for XNA. In order to use it you need to download the Awesomium source and compile the AwesomiumSharp project; once you have that, there are a number of assemblies from that build that you will need to add to your project. All the details on how to do this can be found in the ppt that comes with this sample. Once you have all that in place you can create a DrawableGameComponent like this one to handle your Web UI:
    public class AwesomiumUIManager : DrawableGameComponent
    {
        public int thisWidth;
        public int thisHeight;
        protected Effect webEffect;
        public WebView webView;
        public Texture2D webRender;
        protected int[] webData;
        public bool TransparentBackground = true;
        protected SpriteBatch spriteBatch
        {
            get { return (SpriteBatch)Game.Services.GetService(typeof(SpriteBatch)); }
        }
        public string URL;
        public AwesomiumUIManager(Game game, string baseUrl)
            : base(game)
        {
            URL = baseUrl;
            DrawOrder = int.MaxValue;
        }
        protected override void LoadContent()
        {
            WebCore.Config config = new WebCore.Config();
            config.enableJavascript = true;
            config.enablePlugins = true;
            WebCore.Initialize(config);
            thisWidth = Game.GraphicsDevice.PresentationParameters.BackBufferWidth;
            thisHeight = Game.GraphicsDevice.PresentationParameters.BackBufferHeight;
            webView = WebCore.CreateWebview(thisWidth, thisHeight);
            webRender = new Texture2D(GraphicsDevice, thisWidth, thisHeight, false, SurfaceFormat.Color);
            webData = new int[thisWidth * thisHeight];
            webEffect = Game.Content.Load<Effect>("Shaders/webEffect");
            ReLoad();
        }
        public virtual void LoadFile(string file)
        {
            LoadURL(string.Format("file:///{0}\\{1}", Directory.GetCurrentDirectory(), file).Replace("\\", "/"));
        }
        public virtual void LoadURL(string url)
        {
            URL = url;
            webView.LoadURL(url);
            webView.SetTransparent(TransparentBackground);
            webView.Focus();
        }
        public virtual void ReLoad()
        {
            if (URL.Contains("http://") || URL.Contains("file:///"))
                LoadURL(URL);
            else
                LoadFile(URL);
        }
        public virtual void CreateObject(string name)
        {
            webView.CreateObject(name);
        }
        public virtual void CreateObject(string name, string method, WebView.JSCallback callback)
        {
            CreateObject(name);
            webView.SetObjectCallback(name, method, callback);
        }
        public virtual void PushData(string name, string method, params JSValue[] args)
        {
            webView.CallJavascriptFunction(name, method, args);
        }
        public void LeftButtonDown()
        {
            webView.InjectMouseDown(MouseButton.Left);
        }
        public void LeftButtonUp()
        {
            webView.InjectMouseUp(MouseButton.Left);
        }
        public void MouseMoved(int X, int Y)
        {
            webView.InjectMouseMove(X, Y);
        }
        public void ScrollWheel(int delta)
        {
            webView.InjectMouseWheel(delta);
        }
        public void KeyPressed(Keys key)
        {
            WebKeyboardEvent keyEvent = new WebKeyboardEvent();
            keyEvent.type = WebKeyType.Char;
            keyEvent.text = new ushort[] { (ushort)key, 0, 0, 0 };
            webView.InjectKeyboardEvent(keyEvent);
        }
        public override void Update(GameTime gameTime)
        {
            WebCore.Update();
            if (webView.IsDirty())
            {
                Marshal.Copy(webView.Render().GetBuffer(), webData, 0, webData.Length);
                webRender.SetData(webData);
            }
            base.Update(gameTime);
        }
        public override void Draw(GameTime gameTime)
        {
            if (webRender != null)
            {
                spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend, SamplerState.PointClamp, DepthStencilState.Default, RasterizerState.CullCounterClockwise);
                webEffect.CurrentTechnique.Passes[0].Apply();
                spriteBatch.Draw(webRender, new Rectangle(0, 0, Game.GraphicsDevice.Viewport.Width, Game.GraphicsDevice.Viewport.Height), Color.White);
                spriteBatch.End();
                Game.GraphicsDevice.Textures[0] = null;
            }
        }
        protected void SaveTarget()
        {
            FileStream s = new FileStream("UI.jpg", FileMode.Create);
            webRender.SaveAsJpeg(s, webRender.Width, webRender.Height);
            s.Close();
        }
    }
So, this class enables you to use the Awesomium WebView object, create objects that reside on the UI that can call back into our C# code, and call functions on the UI from our C# code. Again, the setup of these elements is described in the ppt and the accompanying solution.
Effectively, I have created an HTML page in the Content project and made sure all its elements are not compiled but are copied to the output if newer. I can then tell my AwesomiumUIManager to go and get that HTML page in the constructor like this:
            HUD = new AwesomiumUIManager(this, "Content\\UI\\MyUI.html");
In this sample I am not adding it to the Components list as I don’t want it included in the post processing, so I have to initialize, update and draw it myself in the respective Game methods, roughly as sketched below.
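Something like this (a sketch rather than the exact sample code):

protected override void Initialize()
{
    base.Initialize();
    HUD.Initialize();          // triggers LoadContent once the graphics device exists
}

protected override void Update(GameTime gameTime)
{
    // ... normal game update plus the input forwarding shown below ...
    HUD.Update(gameTime);
    base.Update(gameTime);
}

protected override void Draw(GameTime gameTime)
{
    // ... draw the scene and run the post processing first ...
    HUD.Draw(gameTime);        // draw the UI last so it sits on top of everything
    base.Draw(gameTime);
}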
In the Game.LoadContent method I register two callbacks on a script object in the HUD; these can then be called from the HTML to pass data back to my C# code, one for click events and one for the slider events.
            HUD.CreateObject("UIEventmanager", "click", webEventManager);
            HUD.CreateObject("UIEventmanager", "slide", webEventManager);
In my Game.Update I can push data back to the UI; each PushData call invokes the named JavaScript function, passing the parameters to it:
            HUD.PushData("", "ShowSunPosition", new JSValue(sunPosition.X), new JSValue(sunPosition.Y), new JSValue(sunPosition.Z));
            HUD.PushData("", "SetVars", new JSValue(GodRays.BrightThreshold), new JSValue(GodRays.Decay), new JSValue(GodRays.Density), new JSValue(GodRays.Exposure), new JSValue(GodRays.Weight));
Also in the Game.Update method I have to make sure the WebView is receiving the mouse and keyboard events:
            // Manage the mouse and keyboard for the UI
            if (thisMouseState.LeftButton == ButtonState.Pressed)
                HUD.LeftButtonDown();
            if (thisMouseState.LeftButton == ButtonState.Released && lastMouseState.LeftButton == ButtonState.Pressed)
                HUD.LeftButtonUp();
            HUD.MouseMoved(thisMouseState.X, thisMouseState.Y);
            HUD.ScrollWheel(thisMouseState.ScrollWheelValue - lastMouseState.ScrollWheelValue);
            if (thisKBState.GetPressedKeys().Length > 0)
                HUD.KeyPressed(thisKBState.GetPressedKeys()[0]);
You may also notice that in the Draw call of the AwesomiumUIManager I am using a shader. This is because the texture returned from the WebView is in BGRA format, so I have to swap the channels around in a shader like this:
uniform extern texture sceneMap;
sampler screen = sampler_state
{
    texture = <sceneMap>;  
};
struct PS_INPUT
{
    float2 TexCoord    : TEXCOORD0;
};
float4 Render(PS_INPUT Input) : COLOR0
{
    float4 col = tex2D(screen, Input.TexCoord).bgra;  
    return col;
}
technique PostInvert
{
    pass P0
    {
        PixelShader = compile ps_2_0 Render();
    }
}
So, that’s the end of this mammoth post. I hope you find the content useful; as ever, let me know if you have any questions or issues. Oh, and before anyone else points it out, I know my web skills ain’t all that :P
The code samples for both this post and the talk I was going to give can be found here.
22 comments:

  1. I see that much of your source is very similar to your 2D Crepuscular (God) Rays example but with a 3D environment.

    Aside from the obvious float2 lightScreenPosition to float3 lightPosition variable in LightXXXXMask.fx files, how do you implement the depthMap instead of the flare in the 3D environment?

    1. Hi Jordan,

      I actually did the 3D code before the 2D one, once I did the 3D one, it made sense that this effect would also work great in 2D.

      As to your question, I am not sure what you are asking here. If you are asking how the depth map is created, then all the info is in the post above, but what is happening in the sample is that each object in the scene is writing to 2 render targets: the first is the color map, as would normally occur with a forward render, and the second is the depth map, which holds the position of the objects' pixels in relation to the camera. With this information we can then create the mask and use that in the effect.

      Hope this helps.

  2. Thanks Charles,

    I may have pulled the trigger a bit early with the post as I had not discovered your XBLIGUK2011 Talk code yet. I'm going to attempt to apply your creps shader to that source on the sun with the lens flare.

    Funny part was, somehow I managed to do the opposite and use your 2D creps shader in a 3D environment (minus proper calculation of the lightscreenposition), hence my misunderstanding of trying to port it from 2D over to 3D.

    So far, your posts have been the only sources to 'shed some light' :P on how to successfully implement creps shaders with limited HLSL experience.

  3. Ahhh, OK, well hope you get it working how you want it.

    Glad you are finding my posts useful :)

  4. Hi Again Charles,

    I have hit one snag that I don't quite understand - I can only 'see' the effect displayed properly when ScreenPosition is calculated below:
    float4 ScreenPosition = mul(lightPosition - cameraPosition, matVP);
    ...and not when it's not the difference between light source and camera.

    Do you know what I'm overlooking?

    1. Hi Jordan, not got the code in front of me, might not get chance to have a look at the week end. Check the difference between my sample and yours for when you are passing the lightPosition to the camera..

      Still not 100% what your issue is though...

  5. I tracked it down to a difference between Sun.fx and LightSourceMask.fx :

    // Find the suns position in the world and map it to the screen space.
    float4 ScreenPosition = mul(lightPosition,matVP); //Above post
    float4 ScreenPosition = mul(lightPosition - cameraPosition,VP); //march 2011 talk (working for me)

    I don't understand why the lightPosition calculation for the old shader is working with the new 'tidied up' one ;)

  6. Hi Charles,

    I was finally able to get it to work! The code looks like frankenstein with elements from both the March code and the 2dGodRays. Made use of the SaveTexture(...) function and saw that the scene wasn't occluding the pixels from the depth buffer. After playing with the LightSceneMask effect, my final depthVal value check was adjusted and now everything has fallen into place.

    float4 LightSourceSceneMaskPS(float2 texCoord : TEXCOORD0) : COLOR0
    {
    /* get depth buffer pixel */
    float depthVal = tex2D(DepthMap, texCoord) - 1; //white bkgrnd

    /* get the scene */
    float4 scene = tex2D(Scene,texCoord);

    /* if stuff in front, black out pixel from scene */
    if(depthVal < 0) scene = 0;

    /* return the scene */
    return scene;
    }

  7. I'd like to share my work (based on your work) - may I post the youtube link here or PM you privately. I've also made a planet shader based on yours since it jump started my learning process regarding shaders and HLSL.

    1. Hi Jordan, Sure, post your youtube clip and I can PM you from there I think :D

  8. Here goes nothin'

    www.youtube.com/watch?v=WqqvStEX3FE

    Thanks!

    1. Just commented on it :) Nice work, I love Star Control! :D

  9. New Update!

    www.youtube.com/watch?v=5hQqDAtOtlY
    Some of your March 2011 talk code is used in the UI "blur" and depth of field for distant objects. Was able to extend my own "static" effect from rendermonkey to give the UI some sort of randomized glitching effect. I had a bad implementation of your planet shader and finally fixed my errors & bad understanding how the culling works to draw the inner glow.

  10. Hey Charles,

    Do you still have the code samples for this post ? The links seems to be dead... :(

    Cheers,

    1. Hi Zeniac,

      Ill check it out and let you know once I have fixed the link. I should still have the zip for it :)

    2. Hey man,
      Have you been able to recover the zip ?

    3. ahhhhh sorry, I have totally forgot, Ill have a look tonight.

    4. Try the URL now, hope I have pointed it at the right zip :)

    5. Uhuuuu !!! All working ! Thanks man !

    6. Cool, hope you find it useful :)
