Author Topic: OpenGL ES2 and Linux and pi - now with patches!  (Read 35478 times)


OpenGL ES2 and Linux and pi - now with patches!
I noticed some previous discussion about OpenGL ES, centred around Android and iOS.  And yes, I saw the conclusions of those discussions: that the input methods normally available on those platforms are not suitable for a space fighter sim.

But what if I start talking about a proper Linux platform, where one can assume a proper sized screen and proper desktop input devices, yet where the graphics acceleration is provided by OpenGL ES (or ES2) instead of "classic" OpenGL?  It's less powerful than today's desktop PCs, of course, but it compares very well to what was available at retail FS2's release.

I am of course talking about Raspberry Pi.

So I made sure that I could compile and run FSO with retail VPs on a more conventional Linux machine - which was surprisingly easy - and then started looking at what would be needed to convert it to GLES2.

Oh boy.  The ancient glBegin/End paradigm - obsolete as of GL 1.1 already - is all over the place.  Where did you guys learn your GL coding skills?  Seriously.   :nono:

So that's the first stage, and it can be done without breaking compatibility with anything whatsoever, and probably improving performance a bit while we're at it.  I see that vertex arrays are already used in some places, but since GLES has completely dropped the Begin/End paradigm (and for good reason), they have to be used *everywhere*.  Yes, even for single points or fullscreen quads - those are the easiest to convert, in fact, and I already refreshed my skills on the MVE cutscene player (which has the virtue of being the very first thing on screen, so ridiculously easy to test).  Ideally we should also be using VBOs where practical, but that's not a necessity.
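
To make the first stage concrete, here's roughly what the conversion looks like for a fullscreen quad - a sketch from memory, not the actual MVE code (GL_QUADS is also gone from ES, hence the triangle strip):

Code:
/* Before: the obsolete way, one API call per vertex attribute. */
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();

/* After: the same quad as client-side vertex arrays, drawn in one call. */
static const GLfloat pos[] = { -1,-1,   1,-1,   -1,1,   1,1 };
static const GLfloat tex[] = {  0, 0,   1, 0,    0,1,   1,1 };

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, pos);
glTexCoordPointer(2, GL_FLOAT, 0, tex);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);

Note that the "after" form is still fixed-function - that's the point of doing this stage first, it changes nothing else.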

I wonder if we can get rid of some of the unnecessary and inefficient fixed-point arithmetic while we're at it.  This game never (intentionally, anyway) ran on anything less than a Pentium Classic, which had a pretty good FPU already.

The next stage, assuming the target is GLES2, is to convert literally everything to shaders.  GLES1 doesn't have shaders at all, which is a bit limiting, and all worthwhile hardware supports GLES2, so that's the obvious target.  This is, however, the point where backwards compatibility with old desktop hardware (e.g. the original Radeon) that was perfectly capable of running retail FS2 starts to become a problem.  Any thoughts on this would be interesting to hear.  It should be possible to leave the fixed-functionality code in parallel, but then you have two code paths, one of which is likely to decay over time unless it is actively tested.

As part of converting everything to shaders, it's time to drop vertex arrays...  what, didn't we just finish putting those in five minutes ago?  Let me finish.  Drop vertex arrays, normal arrays, colour arrays, texcoord arrays, all of them - and specify them as vertex attribute arrays instead.  Vertex attributes are the correct way to do a fully shaded graphics pipeline, which avoids having to shoehorn everything into and out of the original fixed-function pipeline state.  You guessed it, as part of the major API cleanup, GLES2 dropped that state, so vertex attributes are all there is.  This requires a minor rewrite of all the shaders, BTW, but there at least we don't have to worry about backwards compatibility - desktop GL2 has vertex attributes, and anything not supporting GL2 doesn't run GL shaders at all.  Also fortunately, replacing vertex arrays et al with vertex attribute arrays is a like-for-like replacement job, so very easy.
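
To show just how like-for-like it is, here is the quad from my earlier sketch again, redone with generic attributes (the locations and names are illustrative - in practice they come from whatever the linked shader program declares):

Code:
/* Same data, same draw call - only the way it's specified changes. */
GLint a_pos = 0, a_tex = 1;   /* illustrative; bind or query per program */

glEnableVertexAttribArray(a_pos);
glEnableVertexAttribArray(a_tex);
glVertexAttribPointer(a_pos, 2, GL_FLOAT, GL_FALSE, 0, pos);
glVertexAttribPointer(a_tex, 2, GL_FLOAT, GL_FALSE, 0, tex);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glDisableVertexAttribArray(a_tex);
glDisableVertexAttribArray(a_pos);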

All of the above can be done on desktop GL on any already-supported platform.  All of it is a good refactoring and cleanup operation *anyway*.  Some of it might directly result in performance improvements due to coming up to date and being cleaner.

Once all of that is done, it should be reasonably straightforward to do the actual port to GLES2.  It mostly involves hunting down uses of missing features and sorting out the smaller API's quirks (such as precision specifiers).  It also means arranging for new context creation, and possibly new input handling, which might involve work on SDL rather than on FSO itself.
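
Since I mentioned precision specifiers: every GLES2 fragment shader must declare a default float precision, while desktop GLSL doesn't accept the keyword at all.  The ES compiler predefines GL_ES, so one source can serve both - a sketch, with illustrative names:

Code:
/* Minimal fragment shader that compiles under both desktop GLSL and
 * GLES2; the precision line is only seen by the ES compiler. */
static const char *frag_src =
    "#ifdef GL_ES\n"
    "precision mediump float;\n"
    "#endif\n"
    "uniform sampler2D tex;\n"
    "varying vec2 v_texcoord;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(tex, v_texcoord);\n"
    "}\n";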

All in favour?   :yes:

All against?   :wtf:
« Last Edit: March 11, 2012, 07:16:37 pm by Kromaatikse »

 

Offline chief1983

  • Still lacks a custom title
  • Moderator
  • 212
  • ⬇️⬆️⬅️⬅️🅰➡️⬇️
    • Minecraft
    • Skype
    • Steam
    • Twitter
    • Fate of the Galaxy
Re: OpenGL ES2 and Linux (and pi)
 :yes:

As far as:

Quote
Where did you guys learn your GL coding skills?  Seriously.

Mom's basement.  Dunno bout everybody else.
Fate of the Galaxy - Now Hiring!  Apply within | Diaspora | SCP Home | Collada Importer for PCS2
Karajorma's 'How to report bugs' | Mantis
#freespace | #scp-swc | #diaspora | #SCP | #hard-light on EsperNet

"You may not sell or otherwise commercially exploit the source or things you created based on the source." -- Excerpt from FSO license, for reference

Nuclear1:  Jesus Christ zack you're a little too hamyurger for HLP right now...
iamzack:  i dont have hamynerge i just want ptatoc hips D:
redsniper:  Platonic hips?!
iamzack:  lays

 

Offline Eli2

  • 26
Re: OpenGL ES2 and Linux (and pi)
Quote
Where did you guys learn your GL coding skills?  Seriously.   :nono:

I have no GL coding skills, but I will learn them to review your patch!

 

Offline The E

  • He's Ebeneezer Goode
  • Moderator
  • 213
  • Nothing personal, just tech support.
    • Steam
    • Twitter
Re: OpenGL ES2 and Linux (and pi)
wat.

I'm sorry if we've offended your sensibilities regarding the OpenGL code, but bear in mind the following:

1. FSO runs on a lot of hardware, with a lot of people still using OpenGL 2-level stuff (looking at you, intelgrated people)
2. A move toward a forward-compatible OpenGL3 implementation is planned, but won't be started until after 3.6.14 makes it out the door
3. Until quite recently, we did not have people on the team with the necessary skills to make such an effort
4. OpenGL 3, not ES, is our focus. While "portable" installations would be a nice thing to have, most of our users are on desktop or laptop PCs.
5. While we plan to remove the shader support for GL2-level hardware, the original fixed-function render path should stay intact (see above re: Intel users). We do not want to alienate people without pressing need.
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns

 
Re: OpenGL ES2 and Linux (and pi)
Well, now that I've got your attention...   ;)

Nearly all of what I posted above also applies to converting to GL3 Core Profile.  You still need to eliminate Begin/End, convert everything to shaders and vertex attributes, and remove all other references to obsolete functionality.  Only the final step - downconverting to the GLES2 feature set - goes away.

Porting from GL3 Core to GLES2 should be relatively easy, since essentially the same design decisions and API culling occurred in each design - not surprising since they were done by the same group of people.  IIRC, Khronos took over OpenGL responsibility once GLES2 was already established and becoming popular, and that's when GL3 began to take shape.  If the GL3 conversion is done carefully, it should even be possible for GL3 and GLES2 support to coexist in the same code path without too many #ifdefs.
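
The usual trick for that is a per-target preamble prepended at compile time, with the shader body written in the shared subset.  A sketch - the build flag is hypothetical:

Code:
/* Vertex shader preamble per target; glShaderSource() happily takes
 * the preamble and the common body as two separate strings. */
#ifdef HAVE_GLES2                      /* hypothetical build flag */
static const char *vert_preamble = "#version 100\n";
#else
static const char *vert_preamble =
    "#version 130\n"
    "#define attribute in\n"          /* paper over the GL3 renames */
    "#define varying out\n";
#endif

GLuint shader = glCreateShader(GL_VERTEX_SHADER);
const char *srcs[2] = { vert_preamble, vert_body };  /* vert_body = common GLSL */
glShaderSource(shader, 2, srcs, NULL);
glCompileShader(shader);

Fragment shaders need a similar preamble (varying becomes "in" there, plus the precision line from my earlier sketch).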

Not to mention that GL2.x supports almost everything that GL3 Core does.  The problem with, ah, "Intelgrated" graphics is mostly due to performance, bugs and a few relatively minor missing features which are mandatory in GL3.

Maintaining separate code paths for old (FF) hardware and modern (fully shader based) hardware is fine by me, if that's what you are planning anyway.  The main concern is avoiding bitrot in one or the other.  I suggest doing the conversion away from Begin/End before splitting, though, since that loses you nothing and is particularly important on slower hardware.

And, in fact, I have a fairly good idea where the ubiquity of Begin/End comes from: NeHe.  Once I learned enough to understand how graphics hardware really works, and looked back to see when vertex arrays were introduced - hint, it was long before PC 3D accelerators arrived - I seriously wondered why anyone bothered to teach Begin/End any more.  Yet all the tutorials I could find started with that, and introduced vertex arrays as an "advanced technique" for performance optimisation.  Only when GLES arrived and made vertex arrays mandatory did tutorials - and only those specifically for GLES - start to use them from the start.

And yet, if your head isn't already cluttered up with the Begin/End way of doing things, it's much easier to understand the GL as a whole and how vertex arrays fit into it.  This is the beauty of GLES2, as it reveals the inherent conceptual simplicity of a modern graphics pipeline.  Once you realise that models usually have a fixed number of vertices and thus memory management is a lot easier than you first thought, and you write a few helper functions to take care of some of the nastier boilerplate, and you write a few simple shaders to cover your basic rendering needs, you forget Begin/End ... until you run across it in someone else's code, and promptly start :banghead:.

It's like BASIC with GOSUB all over again, twenty or so years later.  The only way to pass data into or out of a GOSUB routine is via global variables.  I'm sure you all know what a crapshoot that is.

The truth is probably that the tutorial writers learned Begin/End first, for whatever reason, and didn't feel confident enough with C memory management to recommend vertex arrays as a starting point for anyone else.  The fact is, Begin/End forces the GL to do the conversion and the memory management for you, only much less efficiently, because the hardware only works with arrays and the driver can't make any optimisations based on data probably remaining the same between frames.  Specifying one vertex at a time only ever made sense with software rendering.
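
To make the contrast explicit: with the data in a buffer object, the usage hint lets the driver park it in video memory once, instead of re-marshalling it every frame.  A sketch (verts, a_pos and vert_count being whatever the model loader provides):

Code:
/* At model load time: upload once.  GL_STATIC_DRAW promises the
 * driver the data won't change, so it can live in video memory. */
GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

/* Per frame: bind and draw - no per-vertex API traffic at all. */
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableVertexAttribArray(a_pos);
glVertexAttribPointer(a_pos, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);
glDrawArrays(GL_TRIANGLES, 0, vert_count);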

Finally, I agree that starting a major upheaval like this is best done after a release, rather than just before it.  Perhaps some preliminary work can be done in a branch, as a proof of concept if nothing else.

 

Offline Eli2

  • 26
Re: OpenGL ES2 and Linux (and pi)
How long do you think it would take to finish this endeavor?

 

Offline karajorma

  • King Louie - Jungle VIP
  • Administrator
  • 214
    • Karajorma's Freespace FAQ
Re: OpenGL ES2 and Linux (and pi)
More importantly, how much of that work are you willing to do yourself? I'm always happy to welcome aboard new coders who are willing to code for us. Not so happy with the "You're doing it all wrong, do it my way. It's better. I'll tell you all how to do it" kind.

I'm assuming you're the former but I've seen plenty of the latter before so it's worth asking before wasting any time. :)
Karajorma's Freespace FAQ. It's almost like asking me yourself.

[ Diaspora ] - [ Seeds Of Rebellion ] - [ Mind Games ]

 
Re: OpenGL ES2 and Linux (and pi)
Both good points.  I'm still finding my way around the code, but at least it looks like nearly all the GL stuff is grouped together in one place, which definitely helps.  There's currently a state manager and a standard vertex format for the FF paradigm and the textures, but no obvious equivalent management for the vertex buffers that already exist.  Centralising that logic early on will definitely simplify matters.

So I don't know how long it will take, but it should take less than the average point release time of FSO in recent years.  :lol:

I mentioned http://www.raspberrypi.org/ before, as that's a project I'm fairly involved in already - I've dug up a lot of information, written test programs to tease out problems with SD cards for it, and even whipped up a basic Linux distro in a few days in order to encourage hardfloat versions of the graphics libraries (a big efficiency improvement) to be produced.  There was a discussion about what games would be nice-to-haves on it, and FS2 came up very quickly indeed.  And so here I am, essentially asking about the best direction to proceed to make it happen.

I could port it all myself, but the changes required are so extensive that the end result would not be mergeable with the main codebase afterwards.  Knowing that work in a technically very similar direction is planned is thus very helpful, since we might be able to work together rather than at cross purposes.

Of course, since I have a fairly demanding (and semi-related) day job, I don't have unlimited amounts of spare time to dedicate to the cause, but I can certainly chip in.

 

Offline karajorma

  • King Louie - Jungle VIP
  • Administrator
  • 214
    • Karajorma's Freespace FAQ
Re: OpenGL ES2 and Linux (and pi)
I'm not expecting you to give up bathing for FS2_Open. Just wanted to know you had a good reason to stick around. :)

 

Offline Fury

  • The Curmudgeon
  • 213
Re: OpenGL ES2 and Linux (and pi)
Kromaatikse, please stay around and help SCP to implement OpenGL3. And ES2 while at it. Your efforts would be very much appreciated, at least if you're willing to get your own hands dirty. :)

 
Re: OpenGL ES2 and Linux (and pi)
All right, let's see what we can come up with.

So far, as mentioned, I've built SVN HEAD on Linux and proved that it works with retail data. I could also try it with some of the mediavps, although the machine I'm using for this is a bit weak - Atom+ION2. What else should I do to really get started?

 

Offline Echelon9

  • 210
Re: OpenGL ES2 and Linux (and pi)
This is really interesting, and beneficial.

Another open source game project I follow, 0AD, has recently done a rewrite to support GLSL and ES.

While it did allow cross-compiling and basic running on Android - which is explicitly still a hobby, and not a platform that 0AD will target releases for yet - the code improvements have also helped performance on standard OpenGL.

So it's a big win to seek out compatibility and refactoring for ES.

 
Re: OpenGL ES2 and Linux (and pi)
Interestingly, cross-compiling won't be necessary for this port, even though I have an ARM-based machine in mind. Because it's meant to be a full Linux computer, it is a simple matter to install and use all of the necessary build tools directly on it (in the extreme case by simply installing Gentoo). It will just take a bit longer to build than on a standard desktop machine.

 

Offline chief1983

  • Still lacks a custom title
  • Moderator
  • 212
  • ⬇️⬆️⬅️⬅️🅰➡️⬇️
    • Minecraft
    • Skype
    • Steam
    • Twitter
    • Fate of the Galaxy
Re: OpenGL ES2 and Linux (and pi)
You could set up cross-compiling distcc to speed things up maybe.

 
Re: OpenGL ES2 and Linux (and pi)
I'll cross that bridge when I come to it - my Raspberry Pi hardware hasn't arrived yet, and there is a lot of work to do on the desktop side first, for which the Atom is sufficient.  In any case I have a second ARM-based machine which is more powerful (dual Cortex-A9 versus single ARM11), so plain distcc would still work fine.

It strikes me that an early need will be for an upgraded state manager that, as well as textures and FF state, can keep track of VBOs and shaders, and be capable of storing entire state vectors so that setup code doesn't need to be duplicated everywhere.  This, I think, would also help to hide and ease the differences between legacy FF-style state and modern vertex attributes.

Such a manager could provide what's known as a "stateless rendering API".  This means that all the information needed to render anything is provided explicitly with the draw call, using state objects that are set up by the caller and can be kept between frames; the (stateful) GL is then manipulated behind the scenes, to the minimum extent necessary, to set up the correct state for that draw event.  This is a useful idea mainly because GL API calls are pretty expensive - I recently measured a GLES2 implementation that could manage maybe 10,000 API calls per second, versus maybe ten million vertices per second.
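
To make that concrete, the caller-facing side might look something like this - a strawman, all names invented:

Code:
/* A self-contained state object: filled in by the caller, reusable
 * across frames, passed explicitly with every draw. */
struct render_state {
    GLuint program;        /* shader program to use */
    GLuint vbo;            /* vertex buffer with the model's attributes */
    GLuint textures[4];    /* one per unit, 0 = unused */
    bool   depth_test;
    bool   blending;
};

void stateless_draw(const render_state &rs, GLint first, GLsizei count)
{
    /* The manager diffs rs against the GL state it set last time and
     * issues only the glXxx calls that actually change something -
     * the calls themselves being the expensive part. */
    apply_state_delta(rs);   /* hypothetical internal function */
    glDrawArrays(GL_TRIANGLES, first, count);
}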

The alternative is that we do local, straightforward conversions.  This might mean that we still have to find somewhere to allocate and fill a vertex array or VBO beforehand.  Things might get even hairier once we need to match vertex attribute arrays up with a given shader's inputs - which, BTW, we should do using enumerated constant lookups into a cache, not string comparisons every time.  That latter part alone is a really good argument for proper state management code, even if only for that specific subproblem.
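
By "enumerated constant lookups" I mean something like this (again just a sketch, with illustrative names):

Code:
/* A fixed set of attribute slots the engine knows about... */
enum vertex_attrib { ATTR_POSITION, ATTR_NORMAL, ATTR_TEXCOORD, ATTR_MAX };

/* ...resolved against each shader once, at link time, not per draw. */
struct shader_info {
    GLuint program;
    GLint  attrib_loc[ATTR_MAX];   /* -1 if the shader doesn't use one */
};

void cache_attrib_locations(shader_info *si)
{
    static const char *names[ATTR_MAX] =
        { "a_position", "a_normal", "a_texcoord" };   /* illustrative */
    for (int i = 0; i < ATTR_MAX; i++)
        si->attrib_loc[i] = glGetAttribLocation(si->program, names[i]);
}

/* Per draw it's then si->attrib_loc[ATTR_POSITION] - an array index,
 * no string comparisons anywhere near the hot path. */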

Or maybe I have too many ideas and not enough reading of the code to make them concrete yet.  :nervous:

 

Offline Eli2

  • 26
Re: OpenGL ES2 and Linux (and pi)
Quote
I wonder if we can get rid of some of the unnecessary and inefficient fixed-point arithmetic while we're at it.

Can you be more precise about that?
Which code parts are you specifically referring to?

 
Re: OpenGL ES2 and Linux (and pi)
While playing with the cutscene code, I noticed that it was using glVertex2i and a nasty conversion macro to get there from numbers in some sensible coordinate system. It's entirely possible that that is already an exception to normal practice; if so I'll be quite happy. The cutscene code is not performance critical and so might have escaped cleanup applied elsewhere.

I really need to sit down and digest this code properly.

 
Re: OpenGL ES2 and Linux (and pi)
Okay, it's not as bad as I feared, since integer vertices only seem to get used for 2D UI and cutscene stuff. But there's still a lot of fl2i and i2fl use (they are just casts, but casts of this type are expensive), which is probably excessive.

As rules of thumb:

• Use integers for things that are fundamentally discrete, such as handles or indices.

• Use small integers for compact storage of static bulk data such as textures.

• Use floats for everything else.  They are not expensive when you have an FPU, which every supported platform has had all the way back to the retail era.

• Use ceil, floor et al. instead of casting to int whenever possible - this can be a big win.  Only cast to int when it's inevitable, e.g. when an array index will result, and try to keep such casts rare (see the sketch below).

• OpenGL uses floats internally for almost everything (even colours, with modern shaders), so avoid making it perform extra conversions by giving it floats whenever you conveniently can.  This doesn't mean you have to unpack textures.
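
The floor/ceil rule in practice - stay in float as long as possible, round with explicit semantics, and cast exactly once when the index is finally needed (a made-up example):

Code:
#include <math.h>

int texel_index(float coord, int tex_width)
{
    float u = coord * (float)tex_width;  /* stay in float... */
    return (int)floorf(u);               /* ...one inevitable cast */
}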

 
Re: OpenGL ES2 and Linux (and pi)
Okay, now that it's technically the weekend, I've had a chance to sit down and really investigate - starting with a profile run using Callgrind (part of Valgrind).  This is not the most efficient profiler ever - on my Atom box it reminded me of playing Elite on original hardware - but it doesn't miss a thing, and is excellent for getting accurate call counts.  The noble sacrifice of the crew of the Aquitaine shall not be in vain.

Those call counts tell me that gr_opengl_render_buffer() is called about 240 times per frame (for the mission I tested) on average.  This is about 20 times per model, and there is often more than one model per ship (I'm guessing for moving turrets).  And this is on retail data, where the models and the rendering pipeline must be much simpler than on the enhanced stuff.  This in itself is not too shocking - provided it is a reasonably efficient function.

But gr_opengl_render_buffer() always calls one of two massive catch-all rendering functions, both of which set up a lot of state, draw one vertex buffer, then carefully tear it all down again.  This is a hilariously inefficient way of doing things - the only ways to make it worse would be to go back to Begin/End, or to upload the textures anew every time, or something equally ridiculous.  It even looks up the shader attributes using string comparisons.  :shaking:

This is what a state manager is supposed to take care of for you.  A state manager exists in the code, but it doesn't really do anything useful except for major state like textures - it's just another layer of abstraction.  :rolleyes:

Accordingly, the second thing on my todo list is to implement a proper state manager that is actually useful.  The first thing on the list is still the basic Begin/End to vertex array conversions, probably picking out a few unnecessary integer conversions in the process.  Once those two things are complete, it should be fairly clear how to proceed further.

 

Offline chief1983

  • Still lacks a custom title
  • Moderator
  • 212
  • ⬇️⬆️⬅️⬅️🅰➡️⬇️
    • Minecraft
    • Skype
    • Steam
    • Twitter
    • Fate of the Galaxy
Re: OpenGL ES2 and Linux (and pi)
Looking forward to it.