Forums » Bugs
16-bit Textures Locked?
Ohhh... a plot twist. I'm gonna have to do a little backpedaling here and say that this is probably a driver issue. I carefully inspected Quake 3's textures and they exhibit the same symptoms. I'm pretty sure my other games do too; it just hasn't been as noticeable with sharper textures.
At this point I don't know what to do. I already e-mailed Apple about it in the feedback section of Panther. I doubt they're gonna do anything about it though. I'm going to try to contact nVidia about the problem also. But again... why would they listen to just one voice? This really sucks. Anyone got any other tips?
=(
We have a gf2mx but that didn't exhibit the 16bit-only problem you have.
There are two places where 16-bit mode is configured in Vendetta.
Both are in config.ini. One is bpp=32 in the [Vendetta] section and the other is texturequality=32 in the [refgl] section.
Deleting the config.ini file should reset them to the default settings for your 32 MB card, which I think is 32-bit mode.
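For reference, those two settings would look something like this in config.ini (only the relevant keys are shown; whatever else is in the file is omitted here):

```ini
; config.ini -- just the two color-depth settings discussed above
[Vendetta]
bpp=32            ; framebuffer / screen depth

[refgl]
texturequality=32 ; requested texture depth
```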
This probably doesn't help any though.
One thing though: in OpenGL it is impossible to force 32-bit or 16-bit textures. Those values are just suggestions, and the video driver is free to substitute its own internal format. There are recent OpenGL extensions that let you be more strict, but I believe even those aren't guaranteed.
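To make that concrete, here's a rough sketch (my own, not Vendetta's actual code; it assumes a live OpenGL context and hypothetical width, height, and pixels values, so it won't run standalone): the internal-format argument of glTexImage2D is only a request, and you can ask the driver what it actually picked:

```c
/* Sketch only -- assumes a current OpenGL context and valid
   width/height/pixels (all hypothetical here).                */
GLint chosen;
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGBA8,              /* requested 32-bit internal format (a hint) */
             width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                         GL_TEXTURE_INTERNAL_FORMAT, &chosen);
/* The driver may legally report GL_RGBA4, GL_RGB5_A1, etc. instead. */
```

If a driver is downgrading everything to a 16-bit internal format, a query like this is about the only way an application can even detect it.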
Yeah, I checked the ini files and the menus of Vendetta. They're all set to 32-bit. And I even made a fresh download. I don't know what it could be. I may just decide to take digital images of my laptop screen and then send the actual screen dumps as a comparison. You'll see that what I'm seeing on my screen is a lot more pixelated than the output of the screen dump. I have no idea why it's doing this. This bug is driving me nuts! AArrggh!
While it's highly unlikely, I'll wait to see if the next update fixes this before I submit a detailed bug report with screenshots. Unfortunately, OS X doesn't really let you download "the latest drivers" or add control panels to take extra control of your GPU. Whatever is in the Software Update database and whatever is on your hard drive is really all you get when it comes to OS X. When things go right, this simplicity can be a blessing. But when something goes wrong, it can be an absolute nightmare, as is the case right now.
Oh yeah, the texturequality in [refgl] isn't even used right now.
I ask opengl to do all its textures in 32 bit mode.
Maybe there's a setting in your video settings in osX? I don't know anything about the new nvidia drivers for OSX.
Can you try using that... thing? I can honestly say there are no controls in OS X to tweak the GPU. It's all you, buddy. You're our last hope...
=(
I don't know what's going on, but I somehow killed my copy of Vendetta and probably won't bother to re-download it until the next update. I was trying to tweak some of the values of my ini files to fix an odd bug that was happening to me, but I probably went a little too far. Allow me to explain...
First, the system specs:
1 GHz G4
512MB DDR RAM
32 MB nVidia GeForce FX Go5200
OS X 10.3 (Panther)
(note: it happened in OS X 10.2.7 also)
Now, the problem:
My textures, for whatever reason, appear to be stuck in 16-bit mode. No matter what settings I use, 16-bit or 32-bit, my textures remain 16-bit. But my screen resolution itself changes without any problem. So, how do I know my textures are stuck in 16-bit? Because there is noticeable dithering on the station textures where colors don't blend smoothly. Transparent entities like lens flares, wormhole graphics, and energy weapons fire appear turd-like and pixelated with an ugly border around 'em. And speckled artifacts surround my crosshairs. I've had this problem since I downloaded Vendetta on my PowerBook.
It's actually been around since I had the iBook also, but I figured since I was playing in 16-bit mode at that time it was just for that reason. But now that I have 32-bit settings the problem persists.
Thinking maybe I could fix it myself, I went down the list of the various graphical values in the config.ini file, switching one thing off, then back on, and moving to the next one if that didn't fix the problem. Well... I tried reselecting the 32-bit screen resolution, and that didn't work. I tried turning mip mapping on with trilinear filtering, and that didn't work. I switched between the two texture resolutions, and that didn't work. I tried turning shaders, rglow, and gatten on and off as well. But despite the sharper texture quality, everything still appeared to be 16-bit with severe dithering. So clearly it's not something I can control from my end of the settings.
What makes the situation even more odd is my inability to take screenshots of this bug. I thought I'd send in an e-mail bug report with screenshot attachments to ray, but when I reviewed the screenshots everything looked PERFECT. The specks around the crosshairs were gone. Transparencies were nice and smooth. Textures flowed beautifully without any dithering. I even checked to see if anti-aliasing was turned off in my image viewer. Lo and behold, my screenshots were perfect... but my game screen wasn't. WTF?!!
Anyway, I'm not sure if you devs have a Mac with a GeForce card. Maybe this bug can be recreated on an ATI-equipped Mac. It's worth a try. Heck, maybe some of you other Mac users can help me test this out. Should it appear on your test system too, you'll see it best with gamma turned up to its maximum level. Backgrounds must be turned off completely as well. The dithering and specks just stick out at you with those settings. And yes, those are the settings I regularly play the game at.
Wow, that's weird. Sure am glad I have an ATI card...
Hmm...maybe cause it's a mac?
(joking)
Okay, I figured out that turning texture compression on for my setup causes the game to freeze up upon loading the startup menu. It just turns completely black after the updater runs and I can't do anything.
Hmm, do me a favor, play with your gamma and see if you're able to reproduce my Disco bug.
Outta here! Shoo! Shoo! Go back to your own topic and stay there. I don't have any disco bugs to report. And since you made fun of Macs, I'll have to say that your PC sucks for its disco bug. So there!
=b
nope arotle, i play in 16 bit all the time, i never see any of those problems. I think it's in your card...
I doubt it's just the card. It was noticeable in the 16MB Mobility Radeon of my iBook too, so it's shown up on chips from both GPU manufacturers. Try turning your gamma up. Although I think it may be more noticeable on LCDs than CRTs. Not sure. Any PowerBook owners wanting to test this?
i always play with gamma on 15, so i might try turning it up...
ah, typical Mac user. Oh well.
Hmmm... this could very well be a driver issue. I just installed Ghost Recon and my sniper scope view also has stepped transparency, rather than a smooth gradient. My settings are 32-bit on that too. I'm gonna submit a bug to Apple and ask them to look into it. Nevertheless, I think it's worth a look for Vendetta also. I'm gonna try other games to see if it's the same. This sucks.
=(
try deleting the config.ini file and see if that helps.
Vendetta will recreate the file with some default values.
Yeah, I tried deleting the ini file. I even redownloaded Vendetta completely. I also tried switching my texture quality to 16-bit, thinking maybe the two values somehow got switched. And I tried turning Rage 128 Color Hack on and off, which to my surprise made my FPS rates increase upon turning it on, with no difference in visual quality whatsoever. Woohoo?! Anyway, none of that fixed the problem. The blobby textures and severe dithering are still there.
=(
I just installed Unreal Tournament 2003 on my Mac and I'm getting 32-bit textures on it fine. Quake 3 also gives me 32-bit textures as well. So at this point I think it's safe to say that this isn't a driver issue. For some reason I guess my GPU is spazzing out on Vendetta and is locked on to 16-bit textures. Aaargh!
I'm not sure how you devs would go about fixing this problem, considering you really don't have a Mac with a GeForce card (or do you?) to test it on. Nevertheless I hope nVidia has your full cooperation in the future for support. I'd hate to be stuck in 16-bit textures for the final version.