myconid

WFG Retired
Everything posted by myconid

  1. Really easy. The shadowmap needs to be passed into the water shader, and the water shader can then draw shadows on the water surface. historic_bruno already gave you the answer to that. I'd recommend changing the default depth bias to something like 0.005 when the GPU supports 32-bit depth buffers. It's a pretty banner, though I'm just in it because I like the hacking.
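The depth-bias tweak can be illustrated with a minimal sketch (plain Python standing in for the shader's depth comparison; the 0.005 value is the one suggested in the post, while the function name and sample depths are purely illustrative):

```python
def shadow_test(fragment_depth, shadowmap_depth, bias=0.005):
    """Return True if the fragment is lit.

    Without a bias, a lit surface compares against its own depth as
    stored (with limited precision) in the shadowmap, so it randomly
    self-shadows ("shadow acne"). Subtracting a small bias gives the
    stored depth some slack; the right bias value depends on the
    depth buffer's precision.
    """
    return fragment_depth - bias <= shadowmap_depth
```

For example, a fragment rendered at depth 0.5003 whose shadowmap sample was quantised to 0.5000 would self-shadow under the raw comparison, but stays lit with the bias, while a genuinely occluded fragment (say at depth 0.7 behind an occluder at 0.5) is still shadowed.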
  2. I think shadow volumes could become the industry's favourite in the upcoming generations of GPUs (geometry shaders!), though for the moment shadowmaps are a safer bet for us. Besides, it'll be much easier to get decent results out of what we've already got than if we start from scratch. It's true that the shadowmap implementation we have needs some debugging, especially the GLSL version (it's on my list, but the list keeps getting longer!). If we can stop the flickering during camera rotations, we're good for 0ad part one. For part two, if there'll be moving lights, seasons and stuff like that, there'll need to be a way to fade smoothly between light states to reduce the flickering... so that can wait.
  3. Shadows are already done on the GPU. The flickering/pixellation is a limitation of what current shadowmapping technology can do (you'll notice similar artifacts in AAA games as well). There are some alternative techniques to mitigate the problem, such as cascaded shadowmaps, though in my view their results aren't good enough for their memory/performance cost. There may also be some techniques to prevent the flickering under some circumstances, such as during rotations. I don't know much about them offhand, but they should be worth looking into.
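One well-known family of techniques for the rotation flicker mentioned above is to snap the shadow frustum's origin to whole shadowmap texels, so camera movement shifts the shadowmap by exact texel multiples and the rasterised shadow edges don't shimmer. A minimal sketch of the idea (Python for illustration; all names and values are hypothetical, this is not engine code):

```python
def snap_to_texel(origin_x, origin_y, shadowmap_size, frustum_width):
    """Quantise the shadow frustum origin (in light space) to whole
    shadowmap texels. Nearby camera positions then produce the exact
    same shadowmap rasterisation, so edges stop shimmering."""
    texel = frustum_width / shadowmap_size   # world units per texel
    return (round(origin_x / texel) * texel,
            round(origin_y / texel) * texel)
```

With a 1024-texel map covering 1024 world units, two slightly different frustum origins such as (3.4, 7.6) and (3.2, 7.9) snap to the same (3.0, 8.0), which is exactly the stability property that kills the shimmer.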
  4. Check the config for the "GPU skinning" option. Apparently it's disabled because it's slower, though I haven't tested it.
  5. Just having a bit of fun. After the video I thought of using the object coordinates as an offset of the phase, to make "waves" of wind pass through the terrain. It didn't quite work out the way I wanted, but it did make it look random! Right now I'm using three cosines per vertex... maybe kind of inefficient, so I'll need to find a way to approximate it... Yeah, I came across it while changing the tree actors. It's possible. The particle emitter could share the wind information with the tree shader, though the two are different effects. That's a job for the animators, I think. Adding a physics engine would take a lot longer than 25 minutes!
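The three-cosine idea with a position-based phase offset could be mocked up like this (Python for illustration; the frequencies and weights are made up, and only the structure, three cosines whose phase is offset by the object's position along the wind direction, comes from the post):

```python
import math

def wind_offset(time, pos_x, pos_z, wind_dir=(1.0, 0.0)):
    """Sum of three cosines at different frequencies, with the phase
    offset by the object's position projected onto the wind direction,
    so gusts appear to travel across the map instead of every tree
    swaying in lockstep."""
    phase = pos_x * wind_dir[0] + pos_z * wind_dir[1]
    return (math.cos(1.0 * time + 0.1 * phase)
            + 0.5 * math.cos(2.3 * time + 0.2 * phase)
            + 0.25 * math.cos(4.1 * time + 0.4 * phase))
```

The incommensurate frequencies keep the sum from repeating visibly, and the position-dependent phase is what produces the travelling "waves" effect described above.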
  6. The correct answer is 25 minutes.
  7. If the question is directed at me, go right ahead. 512x512. About 100k in png. Simpler models can do with less. Oh, I should mention that the png loading code doesn't like greyscale images, which would make that about 15k.
  8. Probably not too long. Yeah, I figured it would be a lot simpler than I thought initially!
  9. Whoa, that's really clever and looks amazing! It seems like it's using the low-frequency components of a frequency-space texture to transform the trunk and branches and the high-frequency components to transform the leaves... I only skimmed the paper, but I suppose it needs a super-fast Fourier transform library to be done efficiently?
  10. True, if you try to rig the models with skeletons and animate them manually (in that case the vertex transforms are done on the CPU, and that has a severe performance penalty). What I'm suggesting is to send some generic parameters to the GPU (time and wind direction/intensity) and let it do what it does best. That's a different effect entirely (particles), though I suppose they could share the same wind parameters.
  11. I'll try to. The second image is from the unmodified game. The third image is the pure "ambient occlusion" texture, as it was rendered offline with raytracing in Blender. The first image is that texture blended with the model's usual lighting.

Ambient occlusion is basically a measure of how much each part of the model is illuminated by indirect sunlight. That is, parts that are harder for ambient sunlight to reach, such as the areas inside the arches, are made darker, and that makes the model look much more natural. Another way to think about it is that AO determines how much each bit of the model's surface is exposed to the sky (Google for more explanation).

At the beginning of this thread, I showed how AO can be approximated at runtime using SSAO ("Screen Space AO"), a post-processing effect applied after the scene is rendered. Unfortunately, SSAO is a very crude approximation that can introduce unwanted "haloing" around objects, and it is not very good for RTS games, where you want to show as much detail as possible. Ykkrosh then suggested precomputing the AO, though he wanted to approximate the AO textures at loading time instead of raytracing them in 3d software.

The problem with bringing either kind of precomputed lighting to 0ad was that the texture mapping created by the artists reused parts of the same textures on different models and on different parts of the same models. This is a good thing, as it allows us to have better quality textures, but we can't use the same texture coordinates to implement something that varies from surface to surface and from model to model, such as lighting. That required the introduction of some code to allow additional sets of texture coordinates in the model files, so we can have a texture wrapping where each individual surface has its own unique bit of texture space. The textures we use to modify the model's lighting, such as the AO texture, are called lightmaps.

The two patches above let us use lightmaps by:
  • allowing the engine to associate any number of textures with an object (by putting them in the object's definition or by calling a function at runtime), and exposing them in the GLSL shaders
  • allowing the engine to load any number of texture coordinate sets from the models, and exposing them in the GLSL shaders
By creating a new kind of object material that tells the shaders to expect an extra texture and an extra texcoord set, the shaders know to use a new bit of code that combines those with the model to create any lighting effect we like. The example pic above uses a raytraced AO texture created in Blender, though Ykkrosh's idea is now also possible without other major modifications to the engine (though that's not a priority for me, as there are lots of other improvements I can work on). So there you go. I hope all that made sense!
  12. Here's a more recent game that has nice trees: Here's how I think it can be done (without even modifying the tree models):
  • there's a global wind direction/intensity shared by all trees
  • each tree vertex knows its height from the ground
  • vertices higher up sway more (in the wind direction)
  • swaying is done by adding a sin or cos wave to the vertex positions
  • maybe there's some randomness to make different trees look unique
All this can be done in the vertex shader, so the performance cost is probably minor. The same effect can be generalised to also do things like moving grass, cloth etc.
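The steps above can be sketched as a CPU-side mock-up of what the vertex shader would do (Python for illustration; all names and constants, and the single-sine simplification, are assumptions rather than engine code):

```python
import math

def sway_vertex(pos, base_height, time, wind_dir=(1.0, 0.0),
                wind_strength=0.3, tree_seed=0.0):
    """Displace one tree vertex in the wind direction.

    pos is (x, y, z) with y up; base_height is the y of the trunk base.
    Vertices higher above the ground sway more, and a per-tree seed
    offsets the phase so different trees don't move in unison.
    """
    x, y, z = pos
    height = max(0.0, y - base_height)   # higher up sways more
    sway = wind_strength * height * math.sin(time + tree_seed)
    return (x + sway * wind_dir[0], y, z + sway * wind_dir[1])
```

A vertex at the trunk base never moves, while a vertex two units up is displaced along the wind direction by up to wind_strength * 2 as the sine oscillates, which matches the "vertices higher up sway more" rule in the list.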
  13. Ah, I see. I'm sort of forcing the AO onto all the model's lighting (diffuse + ambient), which isn't quite correct (always makes it too dark). It should really just affect the ambient light! If I do that, you can achieve what you're describing by fiddling with the ambient/sun colour settings in Atlas.
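The correction described here, applying AO only to the ambient term rather than to all lighting, can be shown numerically (a minimal Python sketch; the function names and sample values are made up):

```python
def shade_naive(diffuse, sun, ambient, ao):
    """AO forced onto all lighting (diffuse + ambient): always too dark."""
    return (diffuse * sun + ambient) * ao

def shade_correct(diffuse, sun, ambient, ao):
    """AO only attenuates the ambient (indirect) term; direct sunlight
    is already handled by the shadowmap, so it passes through."""
    return diffuse * sun + ambient * ao
```

With 50% occlusion (ao = 0.5), the naive version halves the direct sunlight as well, so a surface in full sun comes out far too dark; the two only agree when the surface is completely unoccluded (ao = 1).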
  14. It's possible, but it's the same thing as changing the contrast in the map itself, which is the right way to do it.
  15. The lightmaps/AO (the two patches in the last post) are ready for review, as they are independent of the earlier stuff. In fact, these two patches will be dependencies for the cleaned-up version of the model normal/parallax/etc. mapping. I'll first clean up model mapping, then terrain mapping, then smooth LOS, then screen-space effects. I'll make each a separate patch, and hopefully some/all of them will make it into Alpha 11 (end of July?). plumo and Wijitmaker, will something like this do? http://imgur.com/a/8ezQj
  16. I had some free time earlier today (finally!), so quite a few things got done.
  • I made the changes that historic_bruno and Mythos_Ruler wanted for the texture loading: http://trac.wildfire...com/ticket/1493
  • I added support for multiple UV sets per model, so lightmaps are now possible: http://trac.wildfire...com/ticket/1497
  • I wrote a simple test that depends on both of the above, which shows off precomputed AO.
Here's what the last one looks like, if you're interested. The necessary files are attached below, but recall that the two patches linked above are also needed for it to work. offlineAO.zip
  17. The xml parser needs to know the element and attribute names at compile time, but not the values. (OK, this is not universally true, but it's how it's set up in the engine.) If someone writes a GLSL shader we haven't thought of, they might require textures whose names we didn't anticipate during development, so they'll need to modify the engine to add them or be forced to use inappropriate texture names in their shader.
  18. If those can do the conversion and things work out smoothly, I don't care at all about the deprecation... I'm that lazy, hehe!
  19. Well, the difference between deprecating and removing is about 5 lines of code, so it's not like there's code duplication or anything. What I'm concerned about is that if we remove it completely we'll have to modify all the actors before we can do anything else. There are a lot of actors...! Unfortunately that format isn't flexible enough. I can easily change it to <Textures><Texture name="diffuse" file="x.png"/></Textures> format if you guys think it's more readable.
  20. My thought was to make it like a "procedure call", meaning the material defines an interface and the actor provides the arguments. On second thought that's a really stupid idea. How about a system similar to the one you're suggesting, where the absence of the attribute is assumed to mean "base"? That way the actors still don't need to be updated, but the semantics are clear enough.
Edit: Or, I can create an entirely new element type that doesn't conflict with the existing <texture> element. <texture> can be kept unchanged but deprecated and replaced in time.
<samplers>
  <sampler name="baseTex" file="base.png"/>
  <sampler name="normalTex" file="normal.png"/>
</samplers>
  21. Thanks for testing! A quick update about what I'm thinking about right now: how do we implement more advanced texturing? The key is to let the xml files define how many textures need to be loaded for each actor, instead of hardwiring that into the code. For instance, the engine normally supports a single texture per object, while the normal/specular maps bumped that up to 3. What I'm considering is to have <sampler> entries in the material xml files that tell the engine how many textures are needed and what their names are in the shaders. e.g.
<sampler name="baseTex"/>
<sampler name="normTex"/>
<sampler name="AOTex"/>
tells the engine that the material/attached shader needs 3 textures: the diffuse texture, the normal map and the precomputed AO texture. The actor files can then define the actual texture files, and that will not require changes to the existing actor files if the <texture> element is used for this task. e.g.
<texture>base.png</texture>
<texture>norm.png</texture>
<texture>ao.png</texture>
The correspondence to the material entries is given by the order in which they appear.
  22. I might have done that backwards. Instead of that previous change, use: glDisable(GL_BLEND); There have also been some fixes in the shaders since the last patch and they might be related. If this doesn't fix the problem, then the mistake is/was in the shaders.