Hard Light Productions Forums
Modding, Mission Design, and Coding => FS2 Open Coding - The Source Code Project (SCP) => Cross-Platform Development => Topic started by: swarmster on July 24, 2011, 04:04:00 pm
-
Edit:
I'm currently trying to compile some information on whether these bugs are somehow card-specific, which I know is a hypothesis that was suggested in the past. Here is the current score, please feel free to post with your own:
problem
----------
nVidia 8600M GT
nVidia 9600M GT
nVidia 9800 GT
nVidia 320M
no problem
----------
nVidia 9400M (?)
ATI Radeon HD 2400 (maybe too old for shaders)
-------------------------------
Hi all,
Firstly, let me say, as a new member, I'm extremely impressed by everything this community has done for Freespace more than a decade after its release. I've never seen a more organized and productive game community, and Freespace is certainly deserving of it.
Now, I finally had a chance yesterday to put together an install for this on my own system. It was really easy (and beautiful), but I soon noticed a bug that I guess has been pointed out a number of times on these forums in the past: "clipping" involving ships passing through their warps, and ship models multiplying mid-explosion.
Please see these threads if you're unfamiliar:
http://www.hard-light.net/forums/index.php?topic=60216.0
http://www.hard-light.net/forums/index.php?topic=72327.0 (reply #18+)
http://www.hard-light.net/forums/index.php?topic=72586.0
This appears to be unique to OS X and is resolved by using the -no_glsl or -disable_glsl_model flags, which also disable a lot of nice graphical effects. It is still present in nightly build 7355.
What's interesting is that the newly released OS X Lion now supports OpenGL 3.2 and GLSL 1.5 (vs. OpenGL 2.x and GLSL 1.2 in Snow Leopard). This brings it significantly more on par with other operating systems. Is it possible this older version of GLSL was to blame?
The only problem, from reading around the internet, is that programs don't just automatically get the new OpenGL in Lion: they have to specifically request it, or else they fall back to a "compatibility" 2.x mode.
I can't say that I have any real experience with OpenGL programming, so I don't know how hard it would be to update FSO for Lion, but I'll continue to read on the subject. I thought it was worth bringing up here, however.
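Edit: one concrete tidbit from my reading: you can at least confirm at runtime which version a context actually got, since a context created the old way should still report GL 2.1 / GLSL 1.20 even on Lion. A minimal sketch (the function name is mine, and it assumes an OpenGL context is already current):

#include <OpenGL/gl.h>   // OS X header; <GL/gl.h> on other platforms
#include <cstdio>

// Print the GL and GLSL versions the current context actually provides.
// Both strings are owned by the driver, so don't free them.
void report_gl_versions()
{
    const GLubyte *gl_ver   = glGetString(GL_VERSION);
    const GLubyte *glsl_ver = glGetString(GL_SHADING_LANGUAGE_VERSION);

    printf("OpenGL: %s\n", gl_ver   ? (const char *)gl_ver   : "unknown");
    printf("GLSL:   %s\n", glsl_ver ? (const char *)glsl_ver : "unknown");
}

I'd guess the FSO debug builds already log something like this at startup, which would be an easy way to check what Lion is actually handing us.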
-
Not sure if this is of any help, but here are a couple of the relevant sections in Apple's OpenGL Programming Guide:
Choosing Renderer and Buffer Attributes (http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/OpenGL-MacProgGuide/opengl_pixelformats/opengl_pixelformats.html#//apple_ref/doc/uid/TP40001987-CH214-SW9)
Updating an Application to Support the OpenGL 3.2 Core Specification (http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/OpenGL-MacProgGuide/UpdatinganApplicationtoSupportOpenGL3/UpdatinganApplicationtoSupportOpenGL3.html#//apple_ref/doc/uid/TP40001987-CH3-SW1)
Choice quote:
"The OpenGL 3.2 core profile is defined by Khronos and explicitly removes removes deprecated features described in earlier versions of the OpenGL specification; further the core profile prohibits these functions from being added back into OpenGL using extensions. OpenGL 3.2 core represents a complete break from the fixed function pipeline of OpenGL 1.x in favor of a clean, lean shader-based pipeline.
"When you use the OpenGL 3.2 Core profile on Mac OS X, legacy extensions are removed wherever their functionality is already provided by OpenGL 3.2. Further, pixel and buffer format attributes that are marked as deprecated may not be used in conjunction with the OpenGL 3.2 core profile.
"If you are updating an existing Mac OS X application, include the kCGLOGLPVersion_Legacy constant.
"The legacy profile provides the same functionality found in earlier versions of Mac OS X, with no changes. It continues to support older extensions as well as deprecated pixel and buffer format attributes. No new functionality will be added to the legacy profile in future versions of Mac OS X.
"If you want to use OpenGL 3.2 in your application, but also want to support earlier versions of Mac OS X or Macs that lack hardware support for OpenGL 3.2, you must implement multiple OpenGL rendering options in your application. On Mac OS X v10.7, your application should first test to see if OpenGL 3.2 is supported. If OpenGL 3.2 is supported, create a context and provide it to your OpenGL 3.2 rendering path. Otherwise, search for a pixel format using the legacy profile instead."
I would particularly stress the bit about Apple describing and implementing 3.2 as a 'clean break' from the past, and referring to the old implementation as "legacy". Anyone who's followed Apple for a while knows that when they start talking about making clean breaks, they don't usually mess around. It sounds like things might need to migrate over sooner or later regardless. Ars Technica suggests (http://arstechnica.com/apple/guides/2011/07/does-apple-still-care-about-creative-pros.ars/2) GL 4.1 might not be so far off.
Also, a couple of code examples people have written to request 3.2 (the validity of these is unconfirmed, of course):
1 (http://forums.macrumors.com/showpost.php?p=12177634&postcount=485)
2 (http://cgit.freedesktop.org/mesa/mesa/commit/?id=9a00dd974699e369b1eb292103fbde8bc6adfb87)
Judging by these, implementing it seems to be either pretty simple or quite involved, depending on which example you go by.
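To make the approach concrete, here's roughly what both examples boil down to on the CGL side, following the steps Apple's guide describes: ask for a 3.2 core pixel format first, then fall back to the legacy profile if that fails. Just an untested sketch on my part (the function name and the extra buffer attributes are my own choices, and it assumes the 10.7 SDK headers, where kCGLPFAOpenGLProfile and the profile constants first appear):

#include <OpenGL/OpenGL.h>
#include <stddef.h>

// Try to create a 3.2 core-profile context; fall back to the legacy
// (2.1-era) profile if the OS or hardware can't provide one.
CGLContextObj create_best_context()
{
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;

    // First attempt: request the 3.2 core profile (10.7+ only).
    CGLPixelFormatAttribute core_attrs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        kCGLPFADoubleBuffer,
        kCGLPFADepthSize, (CGLPixelFormatAttribute)24,
        (CGLPixelFormatAttribute)0
    };
    CGLChoosePixelFormat(core_attrs, &pix, &npix);

    if (pix == NULL) {
        // No core profile available: use the legacy profile, which behaves
        // the same as earlier versions of Mac OS X.
        CGLPixelFormatAttribute legacy_attrs[] = {
            kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_Legacy,
            kCGLPFADoubleBuffer,
            kCGLPFADepthSize, (CGLPixelFormatAttribute)24,
            (CGLPixelFormatAttribute)0
        };
        CGLChoosePixelFormat(legacy_attrs, &pix, &npix);
    }

    CGLContextObj ctx = NULL;
    if (pix != NULL) {
        CGLCreateContext(pix, NULL, &ctx);
        CGLDestroyPixelFormat(pix);
    }
    return ctx;   // NULL if both attempts failed
}

The picking part really is that small; the complicated half is that, per the 'clean break' language above, all of the engine's fixed-function rendering would have to be ported to shaders before a core-profile context would actually draw anything.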
-
Also, Lion has been out for less than a week, and as long as FSO runs as well on it as on Snow Leopard, we're probably not in any hurry to handle that stuff. For what it's worth, the issues you describe aren't even platform-wide; I believe they only occur on certain cards. My Nvidia 9400M renders fine on 10.6. I imagine there are others having rendering issues too, but they're probably the exception and not the rule, and there might be a fix that doesn't involve upgrading to Lion. Who knows. Unfortunately, the lack of OS X devs makes it difficult to make headway on these platform-specific issues, especially when only some people are even capable of reproducing them.
-
Fair points. I realize I'm suggesting significant work for a graphical issue that no one can be sure it would even fix, although I think migrating to OS X's new graphics framework would be generally beneficial in the long run.
It hasn't been clear to me whether everyone on OS X has this clipping issue. Has anyone followed up on whether it's hardware-dependent? I had seen some suggestion that nVidia cards have it while ATI/AMD cards do not, but no follow up on whether that's accurate.
Maybe it would be something worth discussing here. It would give us a list of recommended cards as well as potentially more clues as to where the bug is coming from.
You say the nVidia 9400M does not have this problem. I have an nVidia 9800GT and do have the problem. Could anyone else reading please weigh in? Both with news of other cards, as well as confirmation for these?
If there's a trend maybe I'll just buy a new card and shut up about it for now. ;)
-
You say the nVidia 9400M does not have this problem. I have an nVidia 9800GT and do have the problem. Could anyone else reading please weigh in? Both with news of other cards, as well as confirmation for these?
I have an NVIDIA GeForce 9600M GT, and have seen the issue of exploding ships duplicating into two whole ships as they break apart.
Thankfully, I do know a bit about the OpenGL pipeline, so consider at least one SCP dev interested in fixing it.
-
If there's a trend maybe I'll just buy a new card and shut up about it for now. ;)
Or you could bootcamp if you don't mind Windows that much.
-
I don't even have any room left to bootcamp on this box. Crammed it full of VirtualBox machines, MP3s, work stuff, FSO stuff, etc., and this little 250GB drive is loaded.
-
Or you could bootcamp if you don't mind Windows that much.
hehe, without opening too big a can of worms, I would say I'd rather upgrade my video card if necessary than downgrade my OS. Beyond my feelings toward Windows itself, it's a pain to boot back and forth (in terms of time and losing any and all background processes) for a single program, no matter how great that program is.
Edit: Here's where we stand for hardware right now, going from the above and one mention I found in another thread.
problem
----------
nVidia 8600M GT
nVidia 9600M GT
nVidia 9800 GT
no problem
----------
nVidia 9400M
-
I should admit, this isn't my primary testing machine and I don't play a whole lot on here. Almost entirely FotG if I do, and that's still minimal. There's a chance I've just never noticed it. What's the easiest way to reproduce it?
-
I've been playing through FS1 lately, and the way I've been testing it has been to enter a mission (even the first tutorial mission) and immediately jump out. Your ship passes through the warp and keeps going out the other side before disappearing.
-
Oh that's that bug? I thought that was on Windows at one point too.
-
Yes, and yes, I've heard it was once a problem on Windows as well. I haven't been able to find anything about what change fixed it on Windows, but I wonder if it doesn't have something to do with Windows being far quicker to support newer versions of OpenGL? It would really be interesting to find out exactly when that fix happened.
Of course, maybe it's just two separate issues that happen to have the same symptoms.
-
IIRC, the Windows bug was caused by an uninitialized vector being passed around in the clipping code, nothing directly related to the level of OpenGL support.
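In case anyone's curious what that class of bug looks like, here's a made-up illustration (not the actual FSO code; vec3d here is just a stand-in struct):

// Hypothetical sketch of an uninitialized-vector bug in clipping code.
struct vec3d { float x, y, z; };

void setup_warp_clip_plane(bool warping, vec3d *plane_pnt, vec3d *plane_norm)
{
    vec3d pnt, norm;    // BUG: only ever written on the warp path

    if (warping) {
        pnt.x = 0.0f;  pnt.y = 0.0f;  pnt.z = 100.0f;   // pretend portal position
        norm.x = 0.0f; norm.y = 0.0f; norm.z = -1.0f;   // pretend portal facing
    }

    // When warping is false, the caller receives whatever garbage was on
    // the stack, and the clip plane ends up somewhere arbitrary.
    *plane_pnt = pnt;
    *plane_norm = norm;
}

Because the garbage depends on whatever happens to be in memory, a bug like this can look card-specific or platform-specific even when it isn't, which is worth keeping in mind for the table in the first post.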
-
You can also add the info from my two Macs:
Bug also happens on:
- Build 3.6.13 - nightly build FS2_Open-Inferno (debug)-20110728_r7384
- System: 2011 MacBook Pro (7,1) - Intel Core 2 Duo - 2.4 GHz - 4 GB RAM
- Graphics System: NVIDIA GeForce 320M - 256MB VRAM
- OS: Mac OS X "Snow Leopard" 10.6.8
BUT, does NOT happen on:
- Build 3.6.13 - nightly build FS2_Open-Inferno (debug)-20110728_r7384
- System: 2008 iMac (8,1) - Intel Core 2 Duo - 2.4 GHz - 4 GB RAM
- Graphics System: ATI Radeon HD 2400 - 128MB VRAM
- OS: Mac OS X "Snow Leopard" 10.6.8
So, to clarify... It does happen on a:
NVIDIA GeForce 320M - 256MB VRAM
but not on a:
ATI Radeon HD 2400 - 128MB VRAM
(Though the much older ATI card's rendering looks horrible in other respects, and it seems to avoid this bug only because it doesn't support all the shader functions the newer nVidia card does.)
-
Thanks for the info! I'll add it to the main post.
I'm a little surprised by the trouble you're seeing with the 2400. I realize the x400-level cards are pretty low-end, but the 2000 series wasn't that much older than the 8600-9800 GeForce series.
I haven't yet sprung for a new ATI card to test the brand hypothesis, but it would be really interesting to hear from anyone out there who has one.