
Post-processing effects test (SSAO/HDR/Bloom)



I know it may seem quite a dumb question, but how does it work? How do you control the intensity of the normals, the influence of the specularity... these things? I have very limited coding knowledge, I've only done some really basic actor coding so far, but I'm really excited about this. Will it be controlled like this suggests? http://trac.wildfire...com/ticket/1493

Not a dumb question at all.

I suggest you download the patch, open demo/art/materials/player_trans_ao_parallax_spec.xml and read the comments. Look at demo/art/actors/structures/romans/temple_mars2.xml for an example of an actor using that material.
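
For readers without the patch handy, a material definition of this kind might look roughly like the sketch below. The element names, defines and texture names here are illustrative guesses, not copied from the actual patch; read the real player_trans_ao_parallax_spec.xml for the authoritative version.

```xml
<!-- Hypothetical material sketch: each optional texture enables a
     corresponding preprocessor define in the shaders. -->
<material>
  <shader effect="model_parallax_spec"/>
  <required_texture name="baseTex"/>
  <required_texture name="normTex" define="USE_NORMAL_MAP"/>
  <required_texture name="specTex" define="USE_SPECULAR_MAP"/>
  <required_texture name="aoTex"   define="USE_AO"/>
</material>
```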

Does that answer your question? If there are specific features you need (such as a way to control the intensity of the normals from the materials), just tell me and I'll add them.


Hmm... After taking a quick look at the code, it seems pretty much straightforward. :) And yeah! A way to control the intensity of the normals/specularity within the actor file would be awesome.

The only thing I find rather confusing is the AO. So it's not dynamically generated? You must "bake" it into a new UV set and have it multiplied?

Also, does the colour of the spec map change the specularity colour? Or is it there just for self-illumination? The self-illumination could be a different texture... It could make things easier. (I just don't see it being used very often :P)

Thanks again!


Hmm... After taking a quick look at the code, it seems pretty much straightforward. :) And yeah! A way to control the intensity of the normals/specularity within the actor file would be awesome.

Got it.

The only thing I find rather confusing is the AO. So it's not dynamically generated? You must "bake" it into a new UV set and have it multiplied?

It is not dynamically generated. Ykkrosh wanted it to be dynamically generated, and these patches make it easier to implement that algorithm (which is actually not too complex). While that is definitely something I want to add in the future (perhaps in 0ad part 2), right now I already have my hands full, as you can see.

Also, does the colour of the spec map change the specularity colour? Or is it there just for self-illumination? The self-illumination could be a different texture... It could make things easier. (I just don't see it being used very often :P)

The colour of the spec map sets the specular colour, while the brightness (value) of that colour controls the specular intensity. The "specularPower" entry in the material XML controls the "shininess" factor.

The self-illumination is controlled by the alpha channel of the spec map, which you can ignore completely if you don't enable the USE_SELF_LIGHT effect in the material.
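
To make that concrete, a material fragment wiring up those knobs might look something like the sketch below. This is hypothetical: the element names are assumptions, and only "specularPower" and USE_SELF_LIGHT are taken from the description above.

```xml
<!-- Hypothetical fragment: the RGB of the spec map sets the specular
     colour, its brightness the specular intensity; the alpha channel
     is read only because USE_SELF_LIGHT is enabled. -->
<material>
  <define name="USE_SPECULAR_MAP" value="1"/>
  <define name="USE_SELF_LIGHT" value="1"/>   <!-- omit to ignore alpha -->
  <uniform name="specularPower" value="32.0"/> <!-- shininess exponent -->
</material>
```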

Edited by myconid

I'm on a fairly old machine with an integrated graphics card listed as: "Intel Corporation 82G33/G31 Express Integrated Graphics Controller (rev 10)"

When the patch is applied and preferglsl and gentangents are set to true, I get this:

[screenshot: eyNZ6l.png]

When preferglsl and gentangents are set to false, the game looks normal.
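
For anyone else testing, these are the two options being toggled; they go in the user config file (on Linux typically ~/.config/0ad/config/local.cfg, though the exact path may differ per platform):

```ini
; Enable the GLSL render path and tangent generation for the patch
preferglsl = true
gentangents = true
```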


I'm on a fairly old machine with an integrated graphics card listed as: "Intel Corporation 82G33/G31 Express Integrated Graphics Controller (rev 10)"

When the patch is applied and preferglsl and gentangents are set to true, I get this:

When preferglsl and gentangents are set to false, the game looks normal.

Looks like your old Intel graphics card may not be up to the task. If you try the unchanged version of the game with preferglsl=true, does it work?

Edited by myconid

When adding one patch at a time, 0AD didn't start crashing until I copied the glsl shaders into "binaries/data/mods/public/shaders/glsl", at step 8 of your instructions. Setting "gentangents = false" and "preferglsl = false" also stopped the crashing.

Can you please check if the attached files help at all? (I'm working completely blind, so chances are they won't.) If this doesn't help, I'll need to modify the source code again... :crash:

modelmap-nvidia.zip


Can you please check if the attached files help at all? (I'm working completely blind, so chances are they won't.) If this doesn't help, I'll need to modify the source code again... :crash:

Actually, your changes seem to have worked. 0AD is no longer crashing when loading any scenario maps, and mars_temple2 is displaying as expected in Atlas with no problems ^^


Some of the effects I want to add (trees, smooth LOS...) need to have game-time information. I realise the renderer has no way of keeping track of it, and I was wondering if there's a good reason for that, or if it's just an omission.

I'm thinking I could add something like a TimeManager class in the renderer that basically keeps track of:

  1. the smoothed time since the last frame, and
  2. global time

so those can be used for displaying graphical effects that are independent of the game simulation. Maybe even a simple interface/callback mechanism where classes can register themselves to receive an "update" call every frame, to replace the hacky bits in Gameview.cpp.
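
A minimal sketch of what such a class might look like, assuming exponential smoothing of the frame delta (the class name, method names and smoothing factor are all assumptions, not the actual patch):

```cpp
// Hypothetical renderer-side time tracker: keeps a smoothed
// per-frame delta plus accumulated global time, fed once per
// frame from the main loop so renderer effects never have to
// touch simulation code directly.
class TimeManager
{
public:
    void Update(double frameTime)
    {
        // Exponentially smooth the delta so effects driven by it
        // (waves, swaying trees, ...) don't jitter on frame spikes.
        const double alpha = 0.1; // smoothing factor (assumed value)
        if (m_FirstFrame)
        {
            m_SmoothedDelta = frameTime;
            m_FirstFrame = false;
        }
        else
            m_SmoothedDelta += alpha * (frameTime - m_SmoothedDelta);

        m_GlobalTime += frameTime;
    }

    double GetSmoothedDelta() const { return m_SmoothedDelta; }
    double GetGlobalTime() const { return m_GlobalTime; }

private:
    double m_SmoothedDelta = 0.0;
    double m_GlobalTime = 0.0;
    bool m_FirstFrame = true;
};
```

Registered classes could then receive these values through a per-frame callback, replacing the hacky per-frame updates mentioned above.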

Again I need to ask, is there any reason why I shouldn't do this given the current architecture of the renderer?

Edited by myconid

I think the simulation interpolation code should deal with this kind of thing. I haven't looked at it, though, so I don't know if it would help you at all.

I can get the interpolated time by directly modifying ps/Game.cpp or even main.cpp, though either of those options requires me to add "rendering-aware" code where it doesn't belong. What I'm suggesting is that, every frame, we pass the global time info into a specialised class inside the renderer... and hope there's no non-obvious threading going on anywhere that could break this.

Edited by myconid

Looks like your old Intel graphics card may not be up to the task. If you try the unchanged version of the game with preferglsl=true, does it work?

No, it's still bad, though the geometry doesn't seem to be messed up:

[screenshot: BUejos.png]

By the way, any chance you could push your work to Github or a similar site as you go along, possibly as a branch off this? Makes it a lot easier for testers to juggle between the various versions.

Edited by zoot

No, it's still bad, though the geometry doesn't seem to be messed up:

According to the spec, the Intel G33 chipset only supports OpenGL 1.4, while the game uses GLSL 1.2, which requires OpenGL 2.1. I'm surprised it runs at all (maybe we should check and automatically fall back to ARB/non-GLSL)!

The differences you see between the old and new shaders are probably caused by your card's 96-instruction shader limit: the old vertex shader fits in that limit, while the new one doesn't (hence messed up geometry in the new one), and the fragment shaders don't fit in either case (hence messed up texturing).

By the way, any chance you could push your work to Github or a similar site as you go along, possibly as a branch off this? Makes it a lot easier for testers to juggle between the various versions.

I'll see what I can do.

Edited by myconid

I think I've found a nice little project for the weekend: material-based terrains!

The normal/specular terrain demos I did earlier sort of forced what the shaders should be and were quite inflexible, so I'm thinking the same materials system being used for models would work great for the terrains as well...


Yep, sure would - great idea.

As an aside: for those who have been around a while (like me), there is a certain irony in the fact that a Windows .exe of your patch hasn't been readily available for testing purposes. Six years ago, we were struggling to find people who could build the engine on Linux or on a Mac. A sign of the times, I suppose, and a sign of being part of the OS community :)


Now testing on my netbook with an 'Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 07)' graphics adapter.

Tree animations work with no issue ( :woot: ).

Parallax mapping works nicely in Atlas:

[screenshot: hSNPBs.jpg]

It also works in-game, though the lighting of the terrain flickers back and forth several times a second (normal, darker, normal, darker, etc.). If I disable GLSL, there is no flickering. If I enable GLSL with the unpatched master branch, I also experience flickering. (Conclusion: the flickering is not due to your changes, but to a problem in the GLSL support, hardware or software.)

Edited by zoot
