RedFox

[Development Decision] Should we drop old ARB shaders?

  

39 members have voted

  1. Should we drop the old ARB shaders and stick to GLSL?

    • Yes, remove ARB code and only use GLSL
      26
    • No, we should work hard and support both ARB and GLSL
      3
    • Try to keep both, but make GLSL the default
      10
  2. Do you have an OpenGL 2.0 capable graphics card? (almost every card since 2004)

    • Yes
      37
    • No
      2



Looking at the code right now, we have two separate sets of shader code:

1) Legacy ARB assembly shaders from the fixed-function OpenGL 1.5 era.

2) OpenGL 2.0 GLSL shaders.

It's been 10 years since OpenGL 2.0 was introduced, and since we've already decided to move away from fixed-function code, it would only make sense to drop the ARB shaders.

For those not familiar with ARB / GLSL: these are small programs run by your graphics card to render pretty pixels on the screen. Right now the old ARB support adds quite an unnecessary amount of code to our shader pipeline.

Removing it would make our code smaller, easier to manage and also faster, since we would no longer need a preprocessor wrapper. Dropping that wrapper would also shrink a lot of the shader compilation code.
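To illustrate the wrapper overhead (all names here are invented for the sketch, not 0 A.D.'s actual API): with two backends, every shader source has to pass through an extra preprocessing stage before compilation, while with GLSL only, the source could go to the driver untouched.

```cpp
#include <string>

// Hypothetical sketch of the dual-backend pipeline; names are invented.
enum class ShaderBackend { ARB, GLSL };

// With both backends supported, every shader source passes through a
// preprocessor wrapper that emits backend-specific output. Dropping ARB
// would let GLSL sources go to the driver as-is.
std::string PreprocessShader(const std::string& source, ShaderBackend backend)
{
    if (backend == ShaderBackend::ARB)
        return "!!ARBfp1.0\n" + source; // ARB fragment programs start with this header
    return source; // GLSL needs no translation step in this sketch
}
```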

Furthermore, as long as we keep supporting the ancient ARB shaders, every change to the shader code also has to be mirrored and tested in the ARB shaders. We're already low on developers as it is, so I don't see anything that justifies that overhead.

Even more important: new shader development is done in modern GLSL - nobody writes ARB assembly anymore. Nobody would keep the ARB shaders up to date anyways, and that would hamper any shader development.

Now is the time to vote on this matter and seriously consider moving to GLSL permanently.


I think this is a legitimate issue that can only be truly solved with hard data: we need to know who in the 0 A.D. player base doesn't "fully" support GLSL. We're currently only requiring GLSL 1.10. Apparently some semi-old Linux machines have trouble with it. If that's not fixable by updating the drivers and the number of people running such machines is high enough (~5-10% of the total), I think we'd have a good reason to keep ARB, despite whatever pain it might be.

Secondly, I think this won't necessarily allow us to do much fancier stuff in the GLSL shaders, because fancier stuff requires newer versions of GLSL and OpenGL. Though it's probably much less of a hassle to have different GLSL scripts for different versions than different types of shaders altogether.

Finally, I stress that ARB is indeed old and deprecated, and that it's already not up to date with the GLSL shaders in 0 A.D.'s current codebase. I also agree it adds a lot of code on the C++ side; I've run into it once in a while and it's not too nice.



I don't have a strong opinion either way, but I do think it is a good idea to have ARB as a fallback. It may be "old", but it is also very well-established and standardized, while every vendor seems to cook up their own semantics for GLSL. On my previous machine, for instance, everything would flicker oddly when using the GLSL renderpath, while the standard ARB one would just work and look exactly as it does for everyone else.


Before removing the ARB shaders, it would be wise to enable GLSL by default for at least a couple of releases to spot any problems with it.


ARB is too old, but some people use Linux to support old hardware. What graphics do other 3D RTS games use, especially Age of Mythology? If a machine can run Age of Mythology, it can run 0 A.D.; both are very similar in graphics.


I think it is a good suggestion to make GLSL the default for the next release, as Fabio says. Perhaps many people have not enabled it manually (as there is no settings panel to do it). Would it be possible to look at recent player/computer profile data to see how many people would be affected by dropping ARB, i.e. going GLSL-only? I suppose for most people in the "GLSL is troublesome" category it would mean a slight performance drop if ARB is no longer available.

It would be best to see who will be affected, and to what extent.

Some quantitative questions to answer that will help the decision:

* What percentage of recent players can use OpenGL 2.0 (GLSL 1.10)?

* What percentage of recent players can use OpenGL 3.0 (GLSL 1.30)?

* What percentage of recent players actually have GLSL enabled?

* What would be the performance bonus of dropping the preprocessor wrapper(s)?

Maybe it's good to focus on getting enough fresh data from the next Alpha 14 release to answer some of these questions. Can the current profiling data answer them? See also this post with some links to data. Also, the GraphicsCompatibility page on Trac lists 14% of users as unable to handle GLSL (data from 2011, so the numbers may be lower now). I think it would be good to redo this analysis and then make a decision.

Edited by dvangennip


I don't even know if the integrated graphics on my Gateway E4500D can handle GLSL; the main menu background flickers, but everything else seems to work as of Alpha 13. Alpha 11 was the last Alpha that worked on my parents' older Gateway with an Nvidia GeForce4 Ti 4200. I was using a GeForce 6200, but it almost seems inferior to the integrated graphics of the E4500D.


Intel GMA graphics is quite rubbish - it supports OpenGL 1.4 or so and usually doesn't run anything decently. Last time I tried running 0 A.D. on my old laptop with Intel GMA graphics, the whole game was blue and it ran at around 2 FPS.

In this sense, it doesn't really matter - those computers are so ancient they can't even run the game anyways :(

I think the most sensible move would be to enable "Prefer GLSL" in this release and then iron out all the GLSL bugs - I'm fairly sure most of the issues exist because the GLSL version hasn't been properly debugged on multiple platforms. Once the bugs are ironed out, we can drop ARB shader support.

How does that sound?


I'd be 100% fine with that. The GLSL support is indeed most likely not perfect and the renderer really could use a bit of work anyhow.

Though still, I'd rather see you working on some other things in your to-do list first.


Enabling "Prefer GLSL" in this release, then iron out all the GLSL bugs sounds very reasonable.

I'd also flag the cross-platform issues. Does 0 A.D. support Apple's newer OpenGL 3.2 Core profile yet? If not, there could be issues when dropping the older ARB shaders on Mac hardware that would otherwise run 0 A.D. fine.

The OpenGL 3.2 Core profile was presented as a clean break from the past for Apple devs, but it must be explicitly set up and requested by the application -- there's no automatic benefit from it.


The GLSL shader mode should probably be enabled by default when OpenGL >= 2.0 is supported, maybe in SVN just after the next release. Currently I suppose few players change from the default fixed-function render path.
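The version gate suggested here could look roughly like this (a hypothetical sketch with invented names, not the actual 0 A.D. config code):

```cpp
#include <string>

// Hypothetical default-renderpath selection gated on the reported GL version.
struct GLVersion { int major; int minor; };

std::string DefaultRenderPath(const GLVersion& v)
{
    // Prefer the GLSL shader path on OpenGL >= 2.0; otherwise keep the
    // fixed-function/ARB fallback.
    return (v.major >= 2) ? "glsl" : "fixed";
}
```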


Have you guys reviewed WFG's data?

http://www.wildfireg...showtopic=13784

http://www.wildfireg...showtopic=15089

Making reports like this:

http://feedback.wild.../report/opengl/

http://feedback.wild...ic/cpucaps.html

http://zaynar.co.uk/0ad-pub/performance-20100306.png

http://zaynar.co.uk/0ad-pub/ram-20110401.png

Like wraitii suggested - hard data is the way to go. It will help you make the call one way or the other and you'll have the data to back up the decision.

Also worth a read:

http://www.wildfiregames.com/forum/index.php?showtopic=14419&pid=217338&st=0entry217338

Edited by Wijitmaker


According to the statistics you provided, Wijitmaker, we can actually get the necessary information from /report/opengl/.

GLSL 1.10, which ships with OpenGL 2.0, requires the extensions GL_ARB_vertex_shader and GL_ARB_fragment_shader:

92% (s) GL_ARB_vertex_shader (~GL2.0)

91% (s) GL_ARB_fragment_shader (~GL2.0)

At first glance it looks like ~92% support GL 2.0. But if we take a closer look, the failing cases are either Windows users with ancient cards, or Linux+Radeon users with older drivers (which should be fixable by simply updating the drivers). You can clearly see that some cards that failed succeeded with newer drivers on Linux.
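The capability test above can be sketched like this (a toy version: a real implementation would query the driver via glGetString(GL_EXTENSIONS); the helper here is invented for illustration):

```cpp
#include <sstream>
#include <string>
#include <unordered_set>

// Toy capability check: GLSL 1.10 needs both GL_ARB_vertex_shader and
// GL_ARB_fragment_shader. 'extensions' stands in for the space-separated
// string a real glGetString(GL_EXTENSIONS) call would return.
bool SupportsGLSL(const std::string& extensions)
{
    std::unordered_set<std::string> ext;
    std::istringstream tokens(extensions);
    for (std::string name; tokens >> name; )
        ext.insert(name);
    return ext.count("GL_ARB_vertex_shader") != 0
        && ext.count("GL_ARB_fragment_shader") != 0;
}
```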

There is also the case of Intel GMA graphics, which seems to be the main failure case. But to be honest, I've only tried to run 0 A.D. on Intel GMA graphics once, and it ran as a glitchy blue mess at 2 fps.

So in that case, people who have those ancient cards can't run 0 A.D. anyway: even if the game somehow manages to start up and display something meaningful on screen, it's going to be unplayable.


I wouldn't over-interpret that data, though. A card can report that it supports GL 2.0 while only doing so poorly. The real test is whether the result actually looks good for the player sitting in front of the screen.


We need to take those reports with a grain of salt, though; the current ones haven't been updated in quite a while. To make a smart decision, Philip will need to replace those reports with the "1,436,964" reports* from "4,723" unique installations* sitting in his SQL database.

*See the IRC logs around 22:30


There is also the case of Intel GMA graphics, which seems to be the main failure case. But to be honest, I've only tried to run 0 A.D. on Intel GMA graphics once, and it ran as a glitchy blue mess at 2 fps.

I have a refurbished ThinkPad T400 with GMA 4500MHD graphics; 0 A.D. plays on it just fine.


I have a refurbished ThinkPad T400 with GMA 4500MHD graphics; 0 A.D. plays on it just fine.

I should have specified the Intel GMA 900 series. My bad :) The first (GMA 900) series wasn't that good, and it's the failure case since it doesn't support OpenGL 2.0 ;)

Anything above GMA X3100 seems to support OpenGL 2.0. You can read more about it on Wikipedia.


When myconid did some work on the GLSL shaders, I remember people reporting a variety of glitches with Linux OpenGL drivers. Linux is currently a significant target, especially for developers, so we should be very careful about dropping support.


I should have specified the Intel GMA 900 series. My bad :) The first (GMA 900) series wasn't that good, and it's the failure case since it doesn't support OpenGL 2.0 ;)

Thought it was something like that. I don't see any need to worry about those; I doubt anyone with GMA 900 graphics expects to play 3D games on it. I was pleasantly surprised by how well it works on mine, actually. :thumbup:

When myconid did some work on the GLSL shaders, I remember people reporting a variety of glitches with Linux OpenGL drivers. Linux is currently a significant target, especially for developers, so we should be very careful about dropping support.

Absolutely, don't want to mess up Linux support.


When myconid did some work on the GLSL shaders, I remember people reporting a variety of glitches with Linux OpenGL drivers. Linux is currently a significant target, especially for developers, so we should be very careful about dropping support.

I reported some issues that were fixed both in 0 A.D. and in the graphics drivers. The only remaining issue with GLSL on current SVN is what is shown in this screenshot. IIRC myconid confirmed it was a 0 A.D. shader issue on some older ATI graphics chips that he planned to investigate and fix.

(screenshot attachment: post-8891-0-48548700-1374745540_thumb.pn)


I reported some issues that were fixed both in 0 A.D. and in the graphics drivers. The only remaining issue with GLSL on current SVN is what is shown in this screenshot. IIRC myconid confirmed it was a 0 A.D. shader issue on some older ATI graphics chips that he planned to investigate and fix.

Having taken a closer look at TerrainRenderer now, it looks like the TerrainBlends stage doesn't have a texture bound, which has to be a logic bug in the GLSL path of the code.

Perhaps the older card has a limit on texture units? The TerrainBlends stage uses at least two textures (the base texture and the blends texture), if not more.


Thank you for the changes. With an integrated Intel graphics card it is hard to play the game, but the performance came back even on the Deep Forest map. I haven't tested on my other machines. My MacBook is probably the best at running the game, and I can use the SVN version on it.


This is the report of my card:

http://feedback.wild...n%20ATI%20RV530

IIRC I got the same problem on both Linux and Windows.

I may be able to do some tests if needed, starting tomorrow.

Looks like you have more than enough texture units on your GPU, so it has to be an issue with the shader texture binding stage. Perhaps the diffuse texture is the same as the blend texture (invalid sampler?).

This is most likely, since it would be a programming bug in the Blend generation code. With a good debugger it would be pretty easy to figure it out. If the diffuse and blend textures match, it's a bug in the code.

The texture binding is done in PatchRData.cpp at line 796. Hope you know how to use a debugger :)
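A debug check for that hypothesis might look like this (the names are illustrative, not the actual members in PatchRData.cpp):

```cpp
#include <cstdint>

// Illustrative sanity check for the suspected bug: the blend pass needs the
// base (diffuse) texture and the blend texture bound as two distinct,
// non-null texture objects. Identical ids would mean the blend stage samples
// the wrong image; 0 is OpenGL's "no texture bound".
using TextureId = std::uint32_t;

bool BlendBindingLooksValid(TextureId baseTex, TextureId blendTex)
{
    return baseTex != 0 && blendTex != 0 && baseTex != blendTex;
}
```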


Looks like you have more than enough texture units on your GPU, so it has to be an issue with the shader texture binding stage. Perhaps the diffuse texture is the same as the blend texture (invalid sampler?).

This is most likely, since it would be a programming bug in the Blend generation code. With a good debugger it would be pretty easy to figure it out. If the diffuse and blend textures match, it's a bug in the code.

The texture binding is done in PatchRData.cpp at line 796. Hope you know how to use a debugger :)

What do you mean by diffuse and blend? I see baseTex, normTex and specTex, and samp.Sampler always seems to differ, at least for the first samples.

However, I have to say that I cannot see the problem anymore. I'm doing some more tests now...

