Norvegia Posted July 23, 2013
Hi! I found it quite interesting to read the technical discussion and progress reports on this forum. I think it is incredible what you all have accomplished. It seems like backwards compatibility is an important goal. I believe performance is the best way to ensure broad compatibility. I am by no means an expert in this field, but it sometimes comes down to backwards compatibility vs. performance. When opting for backwards compatibility over performance, you make the game unplayable on many systems: systems with low single-threaded CPU performance, like AMD Bobcat, Jaguar and K8, and systems with low GPU performance, like Intel GMA, HD Graphics 2000, and other low-end graphics cards. The developer behind Banished saw a 25+% FPS boost in a scene when he changed the API from DirectX 9.0c to 11 on his NVIDIA 610M-based laptop. I understand that newer standards don't always mean better performance, but as I mentioned before, good performance is the best way to ensure that as many people as possible are able to play your fantastic game. -Norvegia
greenknight32 Posted July 24, 2013
If you require anything higher than DirectX 9.0c, you eliminate all the players who are still running Windows XP. That's a lot of people.
feneur Posted July 24, 2013
"If you require anything higher than DirectX 9.0c, you eliminate all the players who are still running Windows XP. That's a lot of people."
Well, since 0 A.D. uses OpenGL and not DirectX, this is not relevant to us. It was just an example he listed.
greenknight32 Posted July 25, 2013
"Well, since 0 A.D. uses OpenGL and not DirectX, this is not relevant to us. It was just an example he listed."
Of course, I was just pointing out the contradiction in citing that as an example of a "way to ensure as many as possible is able to play" - it eliminated support for millions of computers. There definitely can be performance penalties in supporting outdated hardware and OSes. There is a need to strike a balance between legacy support and optimum performance, and that balance is continually shifting. It will be a subject of ongoing discussion for as long as the game is being developed.
zoot Posted July 25, 2013
@Norvegia: The main performance hogs have already been identified a long time ago. Compatibility was not one of them, as far as I know.
Norvegia Posted July 25, 2013 (Author)
I was not implying that compatibility is the main performance hog, I was only giving my opinion in the compatibility discussion. The developer in the example I listed was amazed by the performance improvement and memory footprint reduction. It might be difficult to predict whether a new API/standard will improve or hurt performance, but I think the game will benefit if the developers are a little more cold-hearted when it comes to compatibility with old hardware. This is just my 2 cents, and again, I'm no software/game developer. I'm just concerned about the AMD Bobcats and Intel Atoms out there.
zoot Posted July 25, 2013
"It might be difficult to predict whether a new API/standard will improve or hurt performance, but I think the game will benefit if the developers are a little more cold-hearted when it comes to compatibility with old hardware."
There is nothing to suggest that this is the case. We can relatively easily "predict if a new api/standard will enhance or decrease performance" by using profilers and seeing where computation time is actually spent, rather than just guessing. The developer in your example could have done the same.
Norvegia Posted July 25, 2013 (Author)
"There is nothing to suggest that this is the case. We can relatively easily 'predict if a new api/standard will enhance or decrease performance' by using profilers and seeing where computation time is actually spent, rather than just guessing. The developer in your example could have done the same."
Doesn't this require a lot of work? Like moving to C++11 - doesn't a large part of the code need to be altered or rewritten before the performance delta can be known?
zoot Posted July 25, 2013
"Doesn't this require a lot of work? Like moving to C++11 - doesn't a large part of the code need to be altered or rewritten before the performance delta can be known?"
We don't need a delta to determine how big a fraction of each frame is spent in a given API. If profilers show that 0.0001% of the time is spent in a "backwards compatible" API, while 20% of the time is spent in the pathfinder, then the pathfinder is surely a more worthwhile optimization target.
Norvegia Posted July 25, 2013 (Author)
Of course. I had gotten the impression that the developers sometimes had to choose between compatibility and performance. If this is not the case, then disregard this thread. (The API change in the example I listed resulted in less GPU time per frame.)