fabio Posted February 9, 2013

Is this still being worked on? Can you give us an update on the current status? Is there a rough estimated date for the finished work, or a first code drop? Thanks!
ummonk Posted April 21, 2013

My brother and I just downloaded 0 A.D. and have started playing it. He uses a Mac with an Intel GMA 950. Since you're talking about pre-2003 hardware, I assume you're going to use ARB shaders, not GLSL. IIRC, Mac OS X drivers have software emulation for ARB shaders (and a quick Googling suggested that the GMA 950 does pixel shading, with the vertex shading being handled by the CPU). So go ahead and make the switch!
tuan kuranes Posted May 24, 2013

If everyone agrees, I would propose creating small, simple steps/tickets so that people know where things stand and can start contributing without colliding. Here's a modest proposal of tickets that could be created, in order and with dependencies:

1. Wipe all non-shader and ARB code.

The following could be done somewhat in parallel with each other ("somewhat" because using SVN instead of git is a pain for any merging):

2. Get rid of all deprecated OpenGL immediate-mode calls (deprecated, and removing them is mandatory for OpenGL ES support), turning them into VBO calls (yes, even drawing a texture using a quad; this should lead to faster rendering).

3. Remove the current SDL "OS window" handling and handle it directly. (This makes 0 A.D. able to select different OpenGL profiles.)

4. Get rid of fixed-function GL matrix calls (deprecated, and removing them is mandatory for OpenGL ES and OpenGL 4.0 support). We already compute most matrices anyway (in selection/pathfinding/etc.); it's just a matter of using uniforms for those matrices (worldMatrix, worldViewMatrix, modelMatrix, etc.). Note that discussing/defining a uniform naming scheme shared by all shaders would ease things here; see the next point.

5. Add GLSL high-level handling code: parsing GLSL using regexes to get 0 A.D. #defines, #pragmas, etc. (handle debug lines, handle #import/#include pragmas to concatenate files and make GLSL code much more DRY, change #precision and #version pragmas at runtime, add #defines at runtime, etc.), and add the ability to reload/validate shaders (for faster shader debugging/coding). The idea is to have a shared, reusable GLSL code library (easier to maintain, smaller codebase).

A very good tool for these steps is the gremedy OpenGL profiler/debugger, as its analyzer gives a nice, precise list of deprecated calls per frame or per run (and lots of other nice OpenGL helpers).

Once 1 and 2 are done, a much easier next move would be:

6. Total simulation2/graphics separation using command buffers. In 0 A.D.'s case, this could be done at a higher level than raw OpenGL commands: something like taking advantage of "renderSubmit", where the list of submitted render entities would end up being the "command buffer" given to a graphics/render worker thread. (This gives faster rendering as soon as 2 cores are available, which is pretty standard nowadays.)

7. Add new renderers: different OpenGL profiles, OpenGL ES, a debug/test mock renderer, deferred, forward, etc. (The hidden cost here is defining a scheme to handle per-renderer materials/shaders in a nice way, since deferred and forward don't use the same shaders.)
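The simulation/graphics split sketched in step 6 can be illustrated with a tiny producer/consumer model. Everything below is an illustrative assumption, not 0 A.D.'s actual renderSubmit API: the `RenderCommand` type, the queue size, and the sentinel shutdown are made up. The point is simply that the simulation thread hands a whole frame's worth of submitted render entities (the "command buffer") to a render worker thread, so simulating frame N+1 can overlap rendering frame N.

```python
import queue
import threading
from dataclasses import dataclass

@dataclass
class RenderCommand:          # stand-in for one submitted render entity
    model: str
    transform: tuple          # stand-in for a world matrix

frames = queue.Queue(maxsize=2)   # back-pressure: sim runs at most 2 frames ahead
rendered = []

def render_worker():
    while True:
        frame = frames.get()      # blocks until the sim submits a frame
        if frame is None:         # sentinel: shut down
            break
        for cmd in frame:         # a real renderer would issue GL calls here
            rendered.append((cmd.model, cmd.transform))

worker = threading.Thread(target=render_worker)
worker.start()

# Simulation side: "renderSubmit" collects commands, then ships the frame.
for n in range(3):
    frame = [RenderCommand("hoplite", (n, 0.0)), RenderCommand("tree", (n, 1.0))]
    frames.put(frame)

frames.put(None)                  # tell the worker to stop
worker.join()
print(len(rendered))              # 3 frames * 2 commands = 6
```

The bounded queue is the design choice worth noting: it keeps the two threads loosely coupled while preventing the simulation from racing arbitrarily far ahead of the renderer.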
alpha123 Posted May 24, 2013

> Add GLSL high level handling code: parsing glsl using regex

This is, um, a really bad idea. GLSL is a Chomsky type-2 grammar, while regexes can only parse type-3 grammars. In other words, you can't do this correctly, for the same reason you can't parse HTML with regexes. You might, maybe, get something that looks like it works, sometimes, but it's just fundamentally a bad idea. I'm not necessarily averse to parsing GLSL with a real GLSL parser, but I don't think that would be worth the trouble.
tuan kuranes Posted May 24, 2013

> This is, um, a really bad idea. GLSL is a Chomsky type-2 grammar, while regexes can only parse type-3 grammars. [...]

"Parsing using regex" is a bad way to describe it, sorry. It's more a "preprocessor": you find #define, #pragma, and #include inside the GLSL and replace them with values, the contents of other GLSL files, etc. Here's an example of GLSL include support using Boost, along the lines of what I meant. I do agree that real parsing and compiling is not really useful at runtime, only for offline tools like aras_p's GLSL optimizer.
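The preprocessor described here is just line-oriented text splicing, not grammar-level parsing. A minimal sketch in Python (rather than the Boost/C++ of the linked example); the virtual-filesystem dict and the function names `resolve_includes`/`add_defines` are made up for illustration:

```python
import re

# Matches a whole line of the form:  #include "file"
INCLUDE_RE = re.compile(r'^\s*#include\s+"([^"]+)"\s*$', re.MULTILINE)

def resolve_includes(name, sources, seen=None):
    """Recursively splice '#include "file"' directives, guarding against cycles."""
    seen = set() if seen is None else seen
    if name in seen:
        return ""                       # already spliced once; avoid duplicates
    seen.add(name)
    def splice(match):
        return resolve_includes(match.group(1), sources, seen)
    return INCLUDE_RE.sub(splice, sources[name])

def add_defines(source, defines):
    """Inject '#define KEY VALUE' lines at runtime, after any #version line."""
    lines = source.splitlines()
    at = 1 if lines and lines[0].startswith("#version") else 0
    for key, value in defines.items():
        lines.insert(at, f"#define {key} {value}")
    return "\n".join(lines)

sources = {
    "common.glsl": "vec3 gamma(vec3 c) { return pow(c, vec3(1.0/2.2)); }",
    "model.fs": '#version 120\n#include "common.glsl"\nvoid main() {}',
}
shader = add_defines(resolve_includes("model.fs", sources), {"USE_SHADOW": 1})
print(shader)
```

This is exactly the "sort of get away with regular expressions" case alpha123 concedes below: the regex only has to recognise whole directive lines, never GLSL's expression grammar, which is why the approach is workable for includes and defines but nothing more.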
alpha123 Posted May 24, 2013

> It's more a "preprocessor": you find #define, #pragma, #include inside the GLSL and replace them with values, the contents of other GLSL files, etc.

For just that case I suppose you can sort of get away with regular expressions. It's still ugly and incorrect, though.
Flamadeck Posted June 16, 2013

Short version: the old fixed-function pipeline should have been removed already!

The oldest graphics card I use is an ATI Radeon X1600, which supports OpenGL 2.1 (and it will be replaced this year by a computer with an ATI Radeon HD 7770, supporting OpenGL 4.3). Very old hardware in this case means approaching 15 years old, and the percentage of people unable to play is very small. Dropping it will help create a good game and weed out the very old graphics cards: setups that would hold relatively computation-intensive features back.

The old fixed-function pipeline is deprecated already. Using it in new development is technically a bad idea in all situations, end of story! You need to make something that is great when released. It's important that 0 A.D. runs great on most hardware, not on every setup out there whatsoever, which is not going to happen anyway. Features in newer OpenGL versions, like instancing, can be really helpful for improving performance. SDL 2 is already in release-candidate status. I hope that 0 A.D. will have an OpenGL ES 3 renderer in the future. (Fun OpenGL ES 3 fact: you can have an OpenGL ES 3 context on systems with full OpenGL 4.3 or later.)
FeXoR Posted June 17, 2013

Flamadeck: For me that is no argument. It's like preferring "Hey, we could then have worms that glow in the dark! (OK, they stumble around like randomly walking protozoa... but they are shiny!)" over "Wow, our worms can now talk and socially interact with each other!". In fact, not even very basic functionality like line-of-sight is in the game. Using up all available computational resources for graphics seems a very bad idea to me. I admit that support for old graphics cards could be dropped if the computers that can hold those cards are not capable of running 0 A.D. anyway, due to the lack of other resources. However, ruling out computers because they have a weak graphics card, but would otherwise be able to run 0 A.D., just for the sake of graphics, is unacceptable to me.

I would accept arguments like maintenance effort, though. That doesn't include implementing features that only the newer of the supported graphics cards will be able to display, and then (because maintaining compatibility with older graphics cards gets more complicated due to this) dropping support for the older ones.
Flamadeck Posted June 17, 2013

@FeXoR: Please read my post a bit more carefully. By aiming for better GPUs we can push more calculations onto the GPU; this allows doing more interesting non-graphical things with our computational budget on the CPU (and on the GPU too, see instancing). When I said computation-intensive features I meant all kinds of things, including AI and physics. The whole idea of setting a minimum here is to be able to do more with the GPU, not necessarily fancier graphics; mostly to make sure we don't have to use the CPU for all kinds of things, which would make 0 A.D. unplayable.

Where you say it's okay to drop support, you say "other resources". What kind would that be? We're talking about a computer game here, where calculation power and memory are almost the only resources that matter. You assume that a weak graphics card can produce nice graphics, and that the cutoff point would be where the unnecessary fancy features begin. This is not true! A weak graphics card, a weak CPU, or too little RAM means you won't be able to run 0 A.D. even on low detail anyway.
FeXoR Posted June 17, 2013

Flamadeck: Then it seems I got that wrong. I apologize, and I'm glad to hear we're on the same track.
raymond Posted February 11, 2014

Any updates about this? Maybe boost to OpenGL 4.2, which was released in 2011 and is supported by graphics cards from 2010 and newer: http://en.wikipedia.org/wiki/OpenGL
niektb Posted February 11, 2014

> Any updates about this? Maybe boost to OpenGL 4.2, which is released in 2011 and supports graphic cards from 2010 and newer

No, that way you would exclude older Intel IGPs. Still, I think that upgrading to a newer OpenGL (3.1?) is a must.
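Whatever minimum version is picked, the engine ultimately has to decode the driver's version string and choose a code path. A hypothetical sketch of that decision (the names `parse_gl_version`/`pick_renderer` and the thresholds are illustrative; a real engine would feed in the result of `glGetString(GL_VERSION)`):

```python
import re

def parse_gl_version(version_string):
    """Extract the leading 'major.minor' from a GL_VERSION-style string."""
    m = re.match(r"(\d+)\.(\d+)", version_string)
    if not m:
        raise ValueError(f"unparseable GL_VERSION: {version_string!r}")
    return int(m.group(1)), int(m.group(2))

def pick_renderer(version_string):
    """Choose a renderer path based on the reported OpenGL version."""
    v = parse_gl_version(version_string)
    if v >= (3, 1):
        return "modern"        # core-profile path (instancing etc. available)
    if v >= (2, 1):
        return "glsl"          # programmable pipeline, GLSL shaders
    return "unsupported"       # below the proposed minimum

print(pick_renderer("3.1 Mesa 10.1.3"))         # modern
print(pick_renderer("2.1 INTEL-8.24.11"))       # glsl
print(pick_renderer("1.4 legacy driver"))       # unsupported
```

Tuple comparison keeps the version check correct across two-digit minors (e.g. a hypothetical 3.10 would not compare below 3.2 the way naive string comparison would).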
fabio Posted July 2, 2014

> Yes, though it would be nice if Philip could update the reports

He just did it: http://feedback.wildfiregames.com/report/opengl/
Stan` Posted July 2, 2014

Seriously, some people are really playing 0 A.D. on a GTX TITAN Black? X)
fabio Posted December 8, 2014

I just noticed this news: the Widelands game now requires OpenGL 2.1: https://wl.widelands.org/news/2014/12/2/20141202-time-flies-when-youre-having-fun/

There is also some other interesting news there: SDL2, C++11, ...
GunChleoc Posted December 8, 2014

I actually pointed the team to this thread when we were making our decision about software rendering... it's gone now, and the game is running much faster as a result.
fabio Posted December 17, 2014

Another open source game, SuperTuxKart, now requires OpenGL 3.1+. The screenshots are really impressive:
http://supertuxkart.blogspot.it/2014/12/merry-christmas-and-beta.html
http://supertuxkart.sourceforge.net/Antartica_engine:_Overview
http://supertuxkart.sourceforge.net/Antartica_engine:_Technical_Details
Echelon9 Posted December 21, 2014

> Another open source game, SuperTuxKart, now requires OpenGL 3.1+. ...

I'd check that further. My understanding from SuperTuxKart developer comments on the phoronix.com forums is that an OpenGL 3.1+ renderer was added, but that an earlier OpenGL renderer is still present to support older cards.
mifritscher Posted November 20, 2016

Just as a hint: even on an i7-640LM, the integrated GPU (which can handle OpenGL 2.1 and, unofficially, even OpenGL 3) isn't fast enough for the GLSL 2.0 shaders to be enabled; it's way too slow, even with the graphics quality set to low. Without them it works fine. So if all shaders etc. are converted to OpenGL 2.1 GLSL, there should be "extra low quality" versions available, also to ease porting to Android, running it on compute sticks, etc.

Regarding SuperTuxKart: yes, the old renderer is still available. The i7-640LM can be forced to claim OpenGL 3, and then it technically runs, but it is practically unplayable because of way too few FPS. The old renderer runs fine on the same machine.
tuk0z Posted November 22, 2017

The whole thread is much appreciated by me, a total 3D-programming noob more involved in general system builds and optimization. On the hardware side, it seems to me the GLSL you talked about in this thread requires a "GPU" supporting OpenGL >= 2.0? And going further, does it matter for playing 0 A.D. whether the GPU has separate pixel and vertex shaders (e.g. Radeon X1950 GT or GeForce 7600 GT) vs. the more recent unified shader "cores"?