Search the Community

Showing results for 'vegastrike'.

  1. Hi @DanW58, it's been a while. I'm now lead at the Vegastrike project; I see you're still joined at the hip to Eiffel. We have been cleaning the source tree so that it now compiles on current distros; still no luck on OS X or Windows yet, but a new group of coders is working on it. Enjoy the Choice
  2. Have you tried "vcpkg"? It provides source packages that Visual Studio can use to build dependencies' DLLs, though libpng may still be fun, as we at Vegastrike filed a bug against it several months ago now. Btw, you can build them for 64 or 32 bits. Enjoy the Choice
  3. You might find solving it numerically with the fourth-order Runge-Kutta method more accurate; it converges quickly too. We use it in Vegastrike every physics frame to calculate the positions of several hundred ships, as well as the planets in that solar system. The problem you describe reminds me of an article from the early 70s in the journal of the British Interplanetary Society. Enjoy the Choice
  4. We at the Vegastrike project use Gitter, which you can bridge to IRC; it also allows Matrix sign-ins, as they own Gitter. Gitter has nice integration with GitHub as well; you can set up several channels, including private ones, with email notification of posts. Enjoy the Choice
  5. Because I'm nice, or stupid, and I believe acts serve this project better than endless arguments:

Index: binaries/data/mods/public/simulation/templates/units/samnite_spearman.xml
===================================================================
--- binaries/data/mods/public/simulation/templates/units/samnite_spearman.xml (revision 25989)
+++ binaries/data/mods/public/simulation/templates/units/samnite_spearman.xml (working copy)
@@ -5,6 +5,9 @@
     <SpecificName>Samnite Spearman</SpecificName>
     <Icon>units/samnite_spearman.png</Icon>
   </Identity>
+  <UnitMotion>
+    <ScaleMultiplier>0.555</ScaleMultiplier>
+  </UnitMotion>
   <VisualActor>
     <Actor>units/carthaginians/infantry_spearman_c_samnite.xml</Actor>
   </VisualActor>
Index: source/simulation2/components/CCmpUnitMotion.h
===================================================================
--- source/simulation2/components/CCmpUnitMotion.h (revision 25989)
+++ source/simulation2/components/CCmpUnitMotion.h (working copy)
@@ -247,6 +247,7 @@
     "<a:example>"
         "<WalkSpeed>7.0</WalkSpeed>"
         "<PassabilityClass>default</PassabilityClass>"
+        "<ScaleMultiplier>1.8</ScaleMultiplier>"
     "</a:example>"
     "<element name='FormationController'>"
         "<data type='boolean'/>"
@@ -255,6 +256,11 @@
         "<ref name='positiveDecimal'/>"
     "</element>"
     "<optional>"
+        "<element name='ScaleMultiplier' a:help='Simple multiplier to allow to define relative speed in templates.'>"
+            "<ref name='positiveDecimal'/>"
+        "</element>"
+    "</optional>"
+    "<optional>"
         "<element name='RunMultiplier' a:help='How much faster the unit goes when running (as a multiple of walk speed).'>"
             "<ref name='positiveDecimal'/>"
         "</element>"
@@ -282,6 +288,11 @@
     m_FacePointAfterMove = true;
     m_WalkSpeed = m_TemplateWalkSpeed = m_Speed = paramNode.GetChild("WalkSpeed").ToFixed();
+
+    CParamNode scaleMultiplier = paramNode.GetChild("ScaleMultiplier");
+    if (scaleMultiplier.IsOk())
+        m_WalkSpeed = m_TemplateWalkSpeed = m_Speed = m_WalkSpeed.Multiply(scaleMultiplier.ToFixed());
+
     m_SpeedMultiplier = fixed::FromInt(1);
     m_LastTurnSpeed = m_CurrentSpeed = fixed::Zero();

That's the engine change you need to make it work. I designed it specifically not to break every single mod in existence, by making it optional, which is the reverse of your approach. I also tested it, and it works. Just for the record, and because I invested a long time in those discussions, here is what was said about Vegastrike on the forums: Showing results for 'vegastrike'. - Wildfire Games Community Forums. And we had the same kind of discussions about Ambient Occlusion being baked wrong and all art files needing to be changed. That seems incorrect to me; in fact, I started a sci-fi mod some time ago, with the goal of making it an alternative to StarCraft, and walking speeds were the least of my concerns. The ability to make mines sockets and to stop flying vehicles was more annoying than that. Of course, the huge number of new buildings to design and make was also worrisome. GitHub - 0ADMods/stella_artis
  6. As the producer for another FOSS project, Vegastrike (a space shooter/trading game), I'm very familiar with the scale issue, as we model solar systems and our capital ships are kilometres in length. We do model real physics, with a certain arcade-style simplification, and there is a need for our units to maintain their specifications in very specific units, otherwise the differential equations just don't work. All that is just background, though; the game must still be playable/fun, and we play fast and loose with time to make that possible. But the issue with 0 A.D. is that the game scale is not consistent; it's a composite compromise among several different scales, chosen to make displaying the game elements possible/understandable and to make the underlying simulation easier to code, so the actual units are just abstractions not connected to the real world. BTW, I've been coding since the sixties and with Vegastrike since 2005. Enjoy the Choice
  7. I'm exhausted working with this terrain_common.fs shader. I can't deal with the profusion of compile switches everywhere; this is insanity. You try to do one thing, this variable doesn't exist; try to do that, another variable doesn't exist. The normalmap, as I said, was only modulating diffuse, not specular. But it gets worse: the light on the bumps is computed separately, and added to the picture later. I did not want the light on the bumps; I wanted the normal to be modulated by the normal map, so that I can do my NdotL, EyedotN, HalfvecdotN, and use the modulated normal for all my calculations; but the code in this shader doesn't modulate the normal. So I tried to copy the code from model_common.fs for modulating the normal by the normal map; but the first part goes

    vec3 bitangent = vec3(v_normal.w, v_tangent.w, v_lighting.w);
    float sign = v_tangent.w;
    mat3 tbn = mat3(v_tangent.xyz, bitangent * -sign, v_normal.xyz);

and the v_normal and v_lighting inputs in this shader are declared as vec3s; there is no w component. And that's after commenting out pounds of bloody conditional-compilation leg-traps. I hate to give up, but I'm at that point; this is impossible to work with. I would understand a couple of switches having to do with graphics settings; but why should there be a switch for whether to use specular, another for whether to use a specular map, another for whether to use a normal map... Why would you NOT want to use these things? And then, if they only affected values, like

    #if USE_NORMALMAP
        normal = modulate_normal(v_normal, normalmap);
    #else
        normal = v_normal;
    #endif

Right? But let me have the variable normal; don't tell me I have to use v_normal in the code whenever the normalmap is not being used. Well, this is NOT the case with normal, but it is the case with tons of other variables that exist or not based on the bloody switches. It is insane! And why are the shaders so different from each other?
Why are v_normal and v_lighting vec4s in one shader but vec3s in another? And why don't all the shaders get all of the information? Why do some shaders have access to the skybox, while other shaders are denied it? What is the value of hiding that data? Is it such a big hit on performance to pass that information that it necessitates being miserly with variables everywhere? This is unbearable to work with. It happens a lot in open source that people secretly feel ambivalent about openness, so they follow open source rules superficially, but write code in a "protective" way anyway. There was a lot of that in Vegastrike, I remember very well... Of course they do want help; but then they have to set up a culture of initiation, where they are going to "teach" people how to be as protective as them, the style, the secrets of what means what. So the whole thing becomes a private club or religion. I'm not saying categorically that's what's happening here, but I'm starting to suspect it. Slightly more than just starting...
  8. @wowgetoffyourcellphone Thanks. For me it might as well be something between a sprint and a marathon, because I did it before. Not as well organized as it is getting now; it was a huge mess back then; but many of these ideas I tried before. I have to say, though, this is looking infinitely better. I'm in love with the texture groupings and channel packings. I never imagined something so good. My old experiments needed about 10 textures. Here we have 5. (I'm talking run-time textures, as opposed to art-level textures.) The one-texel textures, that's not my idea; we had a whole bunch of them in Vegastrike; very useful. Obviously what led to there being so many compile switches in the shaders is a misunderstanding, or bad assumptions about what increases or decreases GPU performance. Someone thought that stripping down a shader to the bare minimum code needed for each object was a good idea. That's the opposite of the truth; it implies that the same shader source file generates a different shader for every object, and you end up with hundreds of shader switches. Each shader switch has a cost roughly equivalent to rendering ten thousand polygons. What you want is to have as few shaders as possible, with NO compile switches, and even then sort your draw calls so that all the calls to each shader are together, so that for N shaders you only have N-1 switches. The only compilation conditionals that are justifiable are those related to user settings, which don't change frame to frame; but game assets should NOT have a way to tailor their shaders. Pick one by name, maybe; but not tailor them. Please don't be shy to ask questions. I try to be clear rather than mystical, but I'm probably taking a few concepts for granted. I know when I was getting started with shaders, one of my biggest items of disbelief was all of that code happening for every pixel on the screen. It took a while for my brain to accept that.
Back in the days of Doom, the holy grail for game graphics was to get the rasterizer to work while executing less than 6 assembler instructions per pixel... I had a book that chronicled someone's journey of code optimization, and how he got to 8 instructions per pixel and could not optimize any more... until someone whispered another trick in his ear at a convention, and then he got to six instructions. Here we are computing logs, powers, trigonometry, matrix and vector math, pages of it... per pixel. Per fragment, rather. @asterix Many thanks; I'll check that out. EDIT: I just listened to the first video; it's refreshing to hear Vladislav Belov speak; he knows his stuff. One thing he might consider in terms of sorting is using something like bubble sort with some tweaks. It is traditional to use bubble sort as the perfect example of an unoptimized algorithm, but few people realize that bubble sort is the fastest sorting algorithm for a presorted set, and that in fact quicksort is among the slowest. In the case of sorting objects by Z depth, from one frame to the next the ordering doesn't typically change much (unless the camera moves fast), so the set is usually pretty close to pre-sorted; much cheaper to re-sort with bubble sort than quicksort. And in fact, the engine knows when the camera moves and by how much, so it can set a movement trigger for a qsort call.
  9. OBJECT TEXTURES REVISITED (take 2)
Object textures are what define an object's appearance other than material, as mentioned before.

    Normal_U   8 bits   "Forms.png" (PNG sRGBA; no mipmaps)
    Normal_V   8 bits        "
    Normal_W   8 bits        "
    Height     8 bits        "
    Emit_red   5 bits   "Light.dds" (DXT5)
    Emit_grn   6 bits        "
    Emit_blu   5 bits        "
    Occlusion  8 bits        "
    Faction    5 bits   "Zones.dds" (DXT3)
    DetailMod  6 bits        "
    ????????   5 bits        "
    Alpha      4 bits        "

EUREKA! 3 Object Pack textures, with one channel to spare. What I particularly like about it is how the channels hang together: normals and height are interrelated; they are both form modifiers. The same goes for Emission and Occlusion, both of which have to do with light and are "baked" in similar ways. (I don't mean that lightbulbs and torches are baked; I mean that how they illuminate the object is something that can and should be baked. And this baking can be modulated by the CPU; thus, a flame sequence for a torch can have an intensity track, which in turn can be used to modulate the baked emissive. And if you worry about actual lights in the emissive texture that should not be modulated, worry not; I dealt with that problem a long time ago; I found a trick to decouple light sources from baked illumination.) Finally, the Zones texture has three zonings. Faction tells where to place faction color. DetailMod tells where and how to add detail noise (0.5 means NO detail; 0.0 is max smoothness detail; 1.0 is max AO detail. For a ground or a very rough surface, AO detail can be best. For a smoother surface, smoothness detail shines). And Alpha, which is also a zoning. The reason I chose DXT3 instead of DXT5 for Zones.dds is that the alpha channel does not need ANY precision; in fact, I would have been happy with a single-bit alpha for alpha testing; instead, it really needs to NOT be messed up by the correlation expectations of the DXT algorithm.
And all the stuff that doesn't mipmap well at all, but requires high precision and low artifacts, namely the normalmap and the height for parallax, those two are together, uncompressed and bilinearly filtered. I'm particularly happy about keeping the normalmap uncompressed, rather than trying the roundabout way of Age of Dragons, of using DXT5 RGB for U, then alpha for V, and computing W on the fly. Like I said, I came up with that VERY idea 20 years ago, EXACTLY the same idea, and implemented it, but the results were far less than satisfactory, and eventually we decided to go uncompressed and un-mip-mapped. No need to repeat that long learning experience. The way it is here, the normalmap format is entirely standard and unproblematic, and the alpha channel holds the height channel, which is the way it's been done for ages by engines using parallax. Why reinvent a wheel that works well? Those two things, normalmap and height, go well together. To summarize, I have two DXT5 textures for materials, and then one uncompressed, one DXT5 and one DXT3 texture for objects; 5 textures altogether, only one of them not compressed, and the channels they pack MAKE SENSE together. Yes! Compare this with the glTF abomination, which uses 3 UNCOMPRESSED textures, plus one compressed texture that mixes (object) AO with (material) attribute channels, making it useless to any engine that has the good sense to appropriately separate the concerns of objects and materials with separate UV mappings. But so, DXT3 and DXT5 being 4:1 compressed formats, this boils down to glTF being about TWICE THE SIZE of this packing (in video memory terms), and it doesn't even have such material channels as index of refraction, surface purity or passivized-layer thickness; and it doesn't have object texture channels such as height, faction, detail-texture modulator, or even a friggin alpha channel!!!
You get not even half the goodness, but have to pay TWICE the price, in video memory consumption, with that stupid glTF garbage... (And it pretentiously calls itself "physics based" but doesn't even have a channel for index of refraction. What a joke!) Here, even the names of the textures, and the names of the channels, make sense: albedo.dds, optics.dds, forms.png, light.dds and zones.dds. Where else in the world do you get so much clarity? "Smoothness" for specular power, "Gloss" for index of refraction, "Purity" for surface lack of diffuse impurities, "Thickness" for rust thickness... Any system you look into nowadays, they intentionally misname things to mystify and pretend. Nobody gives you clarity like this. NOBODY. A couple of closing notes: I worked for years with an engine that supported only 1 UV mapping, Vegastrike, and back then I thought it was good. But it didn't take me 5 minutes around here to realize what a great idea having 2 UVs is. Separating the Material and Object concerns is a Win-Win-WIN, with the added advantage that pixelation concerns are minimized when two uncorrelated mappings are blended. The ability to overlap islands in material space is invaluable, as it creates an appearance of great detail at minimal cost, and reduces artistic work enormously. Imagine if for every building that uses brick or wood you had to re-create, or at best copy and paste, the bricks or the wood. It is insane. Don't throw away the best thing your engine has to offer. Question, instead, the self-proclaimed authorities that came up with that glTF TRASH, --as it will be remembered (if at all) once all the marketing dust settles. Note that materials don't have Emit in this system; here, temperature is an object-related thing, not a material attribute; such that a lava flow would have to be an object, with emissive color from the object texture pack, and with (non-emissive) volcanic rock as the material.
Which may simply boil down to the terrain, as an object, having an emissive burgundy color painted on the lava flow, to make it glow. Regarding translucent plant leaves, fires, plasma drives and glass, all that will be served by another shader: the "fireglass" shader, as I called it 20 years ago (yes, I coded it back then; but I don't have the files; I will have to code it again). (Coming after this one...) Comments? Opinions? ((But please, keep it to technical merits; don't start telling me I should be a sheeple and follow what "the big guys" do; I would rather kill myself.)) If nobody has anything to add, subtract or point out, I will start working on the shader tomorrow. Regarding exporting materials from Blender, I can't see what problems there'd be. Blender has index of refraction, specular power, etc. built in; Cycles has them too. I could come up with a set of nodes for rendering to ensure they follow the exact same math as the shader, and therefore be sure renders look exactly like what objects will look like in-game. Easier to debug in the shader first, THEN transfer to Blender, though. As for texturing in Blender, we could simplify the workflow by using color codes for materials. So you select, say, 16 materials you will need, assign them color codes, then in Blender you paint using the color codes, but when you hit F12 the colors are replaced by actual materials for rendering. Just an idea, and it would probably need a lot of manual touch-up after export due to filtering artifacts. I'd like to write a tool (in C++) to convert PNG to DXT1/3/5 better than other tools out there do. I don't know yet that they DON'T do what I would do; but I'm guessing they don't. You know what dithering is?
It's basically a recursive algorithm that, given a quantization limitation in a texture, and given higher-precision values than are representable in the lesser texture, rather than throwing away data by rounding, tries to spread the errors a little bit among neighboring pixels, so that the overall effect is more precise. Well, I think the same algorithm could, and should, be applied to a texture before DXT compression, except targeting the peculiar 5,6,5 bits of DXT color representation. So you dither down to where the three lowest bits of the red and blue channels are zero, and the green channel's two least significant bits are zero, while maintaining texel-neighborhood color accuracy as much as possible, then DXT-compress. Furthermore, I think in some situations it may be better NOT to use the nearest points on the line to pick end colors, but to allow the end points to move if it helps the overall matching accuracy with the two-bit indexes. I'm sure NOBODY is doing these things... Furthermore, where the 16 points form a cloud that's very difficult to linearize, a good algorithm should skip the texel, continue with the others, and once the neighboring texels are worked out, perhaps choose the same line orientation as one of the neighbors, or average the orientations of the neighboring texels. EDIT (a day or two later): Zones texture rearranged:

    Faction    5 bits   "Zones.dds" (DXT3)
    Ageing     6 bits        "
    DetailMod  5 bits        "
    Alpha      4 bits        "

The reason for this change is that Microns (oxide-layer thickness for iridescent reflections) is actually a "zone" on an "object", rather than a material characteristic. On the other hand, it is but one type of rust metal can exhibit. Rusts can be black, red, orange, greenish for copper, or clear. If we use the albedo alpha channel to encode a rust color, then this "Ageing" rust-mask channel can tell where to apply it and how thick. For colored rusts, Ageing acts like an alpha-blend. For clear rust, it acts as the thickness of a dielectric film.
Perhaps it could also be used to add staining to non-metals. The idea is that when Ageing is fully on (1.0): if AgedColor is a color (below 0.75), it overwrites the albedo and sets MSpec and Purity to zero; but if AgedColor is clear (1.0), MSpec and Purity are maxed out. How this would look on non-metals I don't know, and at least for now I don't care.
  10. How? And which one? There are two types of AO baking: a) where all rays are counted equally, and b) where rays are multiplied by ray dot normal before accumulating. I think both are pretty scientific. The first represents exact unoccluded visibility. The second represents diffusely reflected omnidirectional light. The first is good for things like probabilistic occlusion hacks. The second looks the most real when applied to models. Do you mean it's a hack because of the finite samples? Or because it does not take all of the environment into account? Or because it isn't dynamic? I mean, you could say all of graphics is a hack, since photons move from light sources to eyes, but in graphics we project from the eye back to light sources... Ah, but that's not "transparency"; it's "translucency". But I get what you mean. If I grab a piece of tinted glass in front of my face and DON'T change the angle, I get a constant rate of transparency, and therefore something similar to Alpha. But in any case, what I meant is that Alpha is useless in 3D graphics unless you want to model a teleporter. No, it is just the fact that it's in a second UV channel. I'm considering only textures pertaining to material representation, for now. Isn't player color something the CPU holds? Oh, I know what you mean, a channel for where to show it, right? Touché. I'll make sure my glass shader will be able to do glass, leaves, plasma, fire and Alpha things. Sure, the emissive texture was my favorite texture work in my Vegastrike days. The real value of it is not just glowing things that can be produced without a texture, but self-illumination. Having parts of a ship receive light from other parts of the ship adds a TON of realism. That's covered already. Index of refraction, purity of surface and spec power all maxed out will give you the glossiest clear coating in the galaxy.
I don't understand that part. To me, all of that is necessary to describe a material; I don't see where there is a "choice" to make.
  11. Cool project; I asked the Vegastrike devs many times for something of the sort. Should take no time at all to come up with a patch... I'll be back...
  12. It's just right. If you want more light, make the sun brighter, or increase ambient. I don't have any hacks to alter the physics or the optics; it is what it is, according to the lights and the materials; no more, no less. It just follows physics and optics. Users have brightness AND contrast controls on their monitors; so do I; so do you. How do we know that our controls are all adjusted exactly the same? This is a subject we should explore, namely how bright or dark it should be shipped, so that it doesn't fall out of the range of users' adjustments. One thing I would hate, though, is in-game brightness controls, before anybody asks; because then you run into the trouble of "should I increase the game's brightness, or my monitor's, or my video card's?" Too many adjustments in a long chain. It's like if you are a musician and you've got 5 effects pedals, each having a gain pot, followed by the amplifier's gain; and you also have a volume pot on your guitar or bass; you end up with like 10 volume pots, and you don't know which needs adjustment. Let me ask you another thing, though: in the real world, would you go about complaining that it is too dark or too bright? I mean, today we would, because we have electric lighting we can control. But in the real world it often happens that you're near a building at dusk, and the building looks super bright with golden radiance, while the rest of the place looks rather dark. No? Why should it be different in-game? Obviously I'm not alone in feeling this way, as someone created an Acropolis at Night map, designed to be dark; --even challengingly dark. I played that map several times. As a gamer I enjoy realism far more than I do unrequested catering to my incorrectly presumed preferences. If you want to know how it all got darker than before, well, part of that is that a lot of the stuff has bright specular textures, most of which I'm suppressing.
The ambient occlusions were not being used in many cases, and where they were used, their gain was turned down to as low as 0.6 for no reason whatsoever (in those xml files); and then the shader was multiplying the AOs by 2.0, also for no reason whatsoever. Last but not least, some of the grounds had bright specular textures, causing the color of the ground to go to saturation. Once I turned off specular in the terrain shader, the diffuse textures, especially on the Acropolis map, were still too bright, so I worked at detecting grayscale color (in the terrain shader) and dimming it while increasing contrast, and lo and behold, I discovered for the first time that the ground in the Acropolis map had nice square tiles. I had never known that before. The same goes for some of the stone paths, like in Badlands and Belgian Bog; now they can really be appreciated in their full beauty; before, they were just whitish blurs. So yes, if you try to go brighter than monitors can display, you lose a lot of detail :-) You might care to sue Samsung rather than blame me. So right now everything is right, by physics and optics. The way to increase light is by cranking up the light, if that's what you want; but it looks good to my eyes. I want to see contrasts. I want to see bright sun reflections contrasting with darker surrounding objects. If you make everything as bright as to sit in the goldilocks zone of visual perception, you lose realism and beauty overall; it all becomes a cheap arcade. You might as well ask Goya why his paintings were so dark. Well, he was showing realities that other painters were ignoring. The question is, does it look real? Forget about passing judgement on how bright or dark it should be; does it look real? Having said all that, I do enjoy direct sunlight, in the VERY FEW maps that feature it. In the pictures I just posted, only the first one shows some hint of nice rays; the rest are all ambient-light based, which is boring as hell.
More maps should be like Belgian Bog and Oceanside. Beautiful suns... Ambient light should be reduced in most maps; it is so bright it doesn't let you appreciate the sunlight, and it reduces the visibility of shadows, which are important for 3D perception. EDIT: Speaking of brightness and contrast, I just remembered an issue we worked on at Vegastrike, namely "gamma awareness", which is a rather obscure subject... Monitors, for the longest time, have been very non-linear devices, and it's been customary to this day to compensate for the nonlinearity in the color values of assets and textures. The theory is long to explain, but in practical terms, what it boils down to is that when a shader reads any texture, it should immediately square the value, to convert from standard texture gamma to linear space. Once all the light computations are done, and the shader is about to output the final color for the pixel, it should take the square root first, then output, so as to go back to display-device gamma space. I'm going to try this tomorrow. I mean today... 4 AM !!!
  13. That's where all the GPU time is spent, then. I remember in the Vegastrike engine we had two shaders, basically, one for solids and one for transparencies, and we segregated their use, to have only a single switch. So, the transparent shader was running first, while painting back to front; and once all transparent objects were disposed of, we switched to the solid shader and proceeded with solid objects from front to back. I would suggest you try to do the same. I have no idea what all this profusion of shaders is for; I can't tell from their names; but I can tell you up-front they are NOT justified. I'm sure they implement simple functionalities that could easily be added to the main shaders, and be disposed of.
  14. Well, here's a solution I found decades ago, and haven't seen anyone else implement. Back then I was lead developer of a remake of a game of flying in space, trading and fighting off pirates, and it was running on the Vegastrike engine. I was the main programmer, the main shader programmer, AND the main artist (2D AND 3D); though I did get a bit of help now and then. And I HATED releasing, like everybody else; plus, it took a huge amount of bandwidth... My game was 900 megs, which in those days was huge, and we had a lot of players; and I was paying for my own dedicated server, and bandwidth was limited (luckily never exceeded it). But so I wondered: why not put the game, as it installs, on SVN? That way, all our gamers have to do to get the latest is svn up. Mac and Windows users had to do a few extra things, as the installation in SVN was the more typical Linux install. I got a lot of bitter complaining initially (they hated having to install Tortoise), and a lot of resistance from colleagues, but once people tasted the honey of being able to update their game every day, they loved it, and never complained again. So I had 3 SVN repos: The Art repository. The Code repository. and The Installation repository.
  15. Yeah, I'm a bottom-up guy; I like to establish what is needed before I worry about how to obtain it, and so I think of the best (most complete and efficient) texture stack at the graphics level, and how to persuade Blender to deliver it is an afterthought to me. But I've been there; I was working on getting Blender to look on screen the same as the engine I was using at the time, while using the same textures; and it wasn't easy. It should be much easier today with material nodes. The old pipeline was utterly inscrutable. Part of the trick of getting a good stack is to take the bit depths of the format into account. I remember there was one DDS format that had a nice 8-bit alpha channel, ideal for precision-sensitive things like specPower or AO. You don't have to worry about performance with my work; it is always my top concern together with realism. The shaders I was writing for Vegastrike were about 4 times as big as the current shaders here, and were running at full frame rate... 20 years ago. Those were the days of 150 nm transistors; we're at 7, going to 5 nm soon. Photorealistic rendering does not require raytracing. I had photorealistic shaders in real time that were in many ways superior to raytracers. In really MANY ways. Loki1950 will tell you. Not that raytracers couldn't be better, but there are people behind everything, and many of them... er, MOST of them are not perfectionists, and even hate perfectionists. Happens in every organization. They treat perfectionists like they are trash. So you get dozens of raytracers that look the same, make the same mistakes, or that add ridiculous amounts of noise for no good reason. My experience is that if you add a little bit more realism every day, or every other day, after a few months you've got an award-winning shader; and it doesn't have to be too big or slow; on most days, the work is not about adding code but merely improving the existing code.
But I think I know what you mean by "all other materials"; probably you refer to matte materials, such as cloth or carpeting, where there is not much specularity to speak of, and therefore it is moot whether the material is metal or non-metal. Totally agreed also about the risk of the shader ruining a good material. That's precisely why I started to work on a new "psychic" shader that would guarantee not to intervene where a material representation is valid. The current metal-detection shader does not offer such a guarantee; it cannot, as it is rather hackish. However, I do have to say that this metal-detection shader does not mess with materials that have zero specularity. It looks at the specular texture for hints of metallicity; if there is no specularity to analyze, it does nothing. And you might argue that specularity may come not from a texture but from an XML file; that would have been true an hour ago, but I just found that and disabled it: if specularity doesn't come from a texture, I now set it to zero. Why? Because uniform specular color is for the birds. And I don't mean the chickens in the game... So, indeed, all it is doing is converting some non-metals to metals and leaving the rest of the materials alone. Another important thing I can assure you about this shader is that it is targeted and deliberate. You can change the #if 1 near the bottom to #if 0 and set which float variable you want to watch in grey-scale. Set it to "is_metal" and you'll see a black world with a bunch of white spots; the white spots are what is detected as metal. You will very rarely see any shade of grey. Some metal detections may be incorrect, I admit; but you don't need to fear that everything in a scene is being slightly changed, unless you see large areas of grey, and you won't. I'm working on cleaning up this metal-detection shader.
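To make the "targeted and deliberate" claim concrete, here is a tiny CPU-side Python sketch of what such a binary metal test could look like. This is purely illustrative: it is NOT the actual detection logic of the shader discussed above, and the luminance thresholds are my own assumptions. The point is only the shape of the logic: zero specularity means hands off, and the output is a hard 0 or 1 rather than a shade of grey.

```python
def luminance(rgb):
    """Rec. 709 luma of a linear RGB triple."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def is_metal(diffuse, specular):
    """Illustrative metal test (hypothetical thresholds, not the
    shader's real heuristic). Returns 1.0 or 0.0, never grey."""
    spec = luminance(specular)
    if spec == 0.0:
        return 0.0          # no specularity to analyze: leave it alone
    diff = luminance(diffuse)
    # bright specular over a dark diffuse suggests bare metal
    return 1.0 if spec > 0.5 and diff < 0.2 else 0.0
```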
My portions of the code grew monolithic; I'm now trying to refactor it into subroutines so that it is easier to understand. I also intend to add a bit of Fresnel for non-metal objects, particularly human skin and plant leaves; I'll probably have it working within the weekend. I want to see them both sporting a natural shimmer. PS: It would be nice to have a per-map, or map-location-oriented, "GroundColor" uniform for specular reflections occluded by the ground, so that the shader can "reflect" this color instead of the sky-box. One way to approach this would be, for any object on a terrain, to copy the color of the ground at the instancing location into a variable and pass it on to the shader. I can probably code it myself; I'm just running it by you in case you have better ideas. Heck, I also need an index-of-refraction uniform; this would be per-material, in lieu of a texture channel, of course. One thing this game is lacking is flowers. I mean, the berries look like flowers, but what about wild flowers? Flowers offer a special opportunity for nifty shader hacks, as many flowers (red and yellow, primarily) can absorb blue light and re-radiate it at a longer wavelength. This excludes blue, violet, and white flowers; it is mostly red, orange, and yellow flowers that have this ability to *emit* light in proportion to the amount of blue light they receive. Modelling this should result in very "vivid" flowers...
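For the "bit of Fresnel for non-metal objects" mentioned above, one standard choice (my suggestion; the post does not say which formulation its author used) is Schlick's approximation, F = F0 + (1 − F0)(1 − cos θ)^5, with F0 ≈ 0.04 for typical dielectrics like skin and leaves. A CPU-side Python sketch of the same math a shader would run per fragment:

```python
def schlick_fresnel(cos_theta, f0=0.04):
    """Schlick's approximation to Fresnel reflectance.
    cos_theta: dot(normal, view direction), in [0, 1].
    f0: reflectance at normal incidence (~0.04 for dielectrics)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

At grazing angles (cos θ → 0) reflectance climbs toward 1 regardless of F0, which is exactly the subtle rim shimmer wanted on skin and leaves.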
  16. @hyperion or anyone who knows: could it be that many models don't have a specular texture, and that the engine defaults specular to white in its absence? I just can't understand why there's so much specularity in everything. The default color for specular, if any, should be black, NOT white! (The Vegastrike engine used to do exactly that at one time, and any spaceship without a specular texture looked like a floating mirror from hell.) EDIT: Never mind; I think I get it: some units have textures; the others have a setting for specular color. I overrode it with black in the shader. Still, most things seem to have ridiculously high specularity coming from the textures...
  17. I'm afraid Cycles would not work. Several problems: 1) Cycles's devs are devotees of randomness; they solve every problem by adding noise, and then give you decapitating filters to get rid of it. 2) A node network will never be as fast as hand-written code. 3) Neither the Cycles people nor the Blender people have the slightest clue about floating-point precision issues; when you are adding 300 or 1000 rays, the accumulator had better be double precision, or you get huge aliasing problems. My buddy Klauss and I, at Vegastrike, once built a relationship with a Spanish guy who had an AO tool called xNormal. We helped him improve quality, such as by using a double for the accumulator, and helped him get rid of a lot of bugs. He had the option of light-dot-normal modulation or plain AO. Not sure if it still exists, though; that was 20 years ago. EDIT: (still there, but still Windows-only) https://xnormal.net/?lang=en
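The accumulator point is easy to demonstrate. Below is a small Python experiment (my own illustration, not code from xNormal) that emulates a single-precision accumulator with ctypes.c_float and compares it against a double-precision one while summing a million equal samples, the way an AO baker sums ray contributions:

```python
import ctypes

def to_f32(x):
    """Round a Python float (a double) to single precision."""
    return ctypes.c_float(x).value

N, sample = 1_000_000, 0.1
s32 = to_f32(sample)

acc32 = 0.0
for _ in range(N):
    acc32 = to_f32(acc32 + s32)   # float accumulator: rounds every add

acc64 = 0.0
for _ in range(N):
    acc64 += sample               # double accumulator

err32 = abs(acc32 - N * sample)
err64 = abs(acc64 - N * sample)
# The single-precision sum drifts by hundreds; the double stays
# accurate to far below anything visible in an 8-bit channel.
```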
  18. I had been out of this line of work for too long; this is the first time I've heard of glTF and SPIR-V, and I'm reading about both right now. Not sure yet what the art impact would be. Generally, though, I think PBR should go with PBA, and it should not offend artists to receive some physics-based guidelines. I posted a "mini-course" of very basic rules, such as not having diffuse and specular colors that add up to more than white, and when to desaturate diffuse in favor of specular and when not to; and I didn't get any dislikes. When I was first an artist (and I WAS clueless back then), I wanted physics-based direction; I hated anybody telling me "You are the artist; whatever you decide." As for converting to new formats, it is a matter of obtaining or coding tools to convert existing assets. There was a time when Vegastrike did not have cube-maps. When I pushed for them, I got resistance based on the amount of work it would take to convert from sphere-maps. So one day my buddy Klauss took on the programming, and I took on the maps themselves; in two days I had converted all the sphere-maps to cube-maps, and Klauss had the code ready. As someone said, there comes a time in the history of every project when you have to shoot the engineers and start production. If switching to glTF is the way to go, I would advocate NOT discussing it too much... ;-) EDIT: One change I would propose, which would help separate what's art from what isn't, is to separate weather from maps. Just as you have a choice of map and civilization, I'd have a choice of season, time of day (or night), and weather, defaulted to random, of course. This would entirely remove the expectation that ambient light is an artist's choice or responsibility. And it would increase replay value 100-fold.
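The "mini-course" rule above, that diffuse plus specular must not exceed white, can be expressed as a trivial asset-validation check. A hypothetical sketch (the function name and tolerance are mine, not part of any existing tool):

```python
def energy_conserving(diffuse, specular, tol=1e-6):
    """Rule of thumb: per color channel, diffuse + specular <= 1.0;
    otherwise the material reflects more light than it receives."""
    return all(d + s <= 1.0 + tol for d, s in zip(diffuse, specular))
```

A batch validator could run this over every material's diffuse/specular colors and flag violations before an artist ever has to hunt them down visually.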
  19. (Nice to hear Vegastrike is still alive.)
  20. No, I do not forget how I got 0ad into it the first time, more than two years ago now; but I did for Vegastrike. It was simple, as I just used CMake to generate the workspace. Enjoy the Choice
  21. I LOVE your code, guys. Never seen anything so clean and thorough and well organized (since last time I looked at MY code, of course ;-)). Also reading the Timing Pitfalls and Solutions document. I hit that wall before; just didn't know how thick it was. PRIVATE NOTE: The engine I worked with, before, was Vegastrike. If you've never looked at it, thank your favorite god and perform the appropriate sacrifices.
  22. Besides the interesting.html file, the cmd file is useful to reproduce the error, @balduin. Though I'm a GitHub user, I'm not part of the mod community; I'm working on a few other projects, i.e. Vegastrike and MakeHuman. Enjoy the Choice
  23. Was not really a guess; I've just seen similar behavior with the Vegastrike engine, where I am senior forum admin. The nouveau driver is good for most older OpenGL apps, but its support for newer OpenGL calls is problematic at best. Enjoy the Choice
  24. I think that is the default behavior, fcxSanya, as it is for most forum software. The Vegastrike forum I admin uses phpBB 3, and that is what I have seen for about 8 years now. Enjoy the Choice
  25. Just got a forum post on the Vegastrike game forums, where I'm senior admin, from a user who had done the upgrade to Win 10; he reported that our current binaries did not work because there was no OpenGL support. It may be that Win 10 ships with crippled GPU drivers, or that we need to compile with a newer version of Visual Studio, so I am just passing the news on. So lots of Windows Blender heads will not be happy either. Enjoy the Choice