Forums » General

Random texture question for devs

Jan 30, 2005 mburrack link
So I was playing last night on my lowly eMac, first time in several months, and I noticed something odd. I was able to fly about in Azek I-12 at about 60 fps without issue (this is with fullscreen glow off, of course--stations and fs glow just suck framerate away) but as soon as I jumped to Azek H-12, my framerate dropped to about 10fps. The polycount was only half of that in I12 (5,000 instead of 10,000), so it didn't seem to be polycount. However, dropping my texture quality to "medium" suddenly fixed the framerate problem, so apparently there's just a huge texture load in H12.

Which leads me to my question(s). All that's in H12, as far as I could see at least, is bots and asteroids. So what's taking up the texture quality? Lots of different asteroids? Is S3TC supported in the engine, and in particular on Mac ATI cards under OpenGL? Is there a way to tell? Is there any debug switch to display texture memory usage in game? Do you batch objects together by texture to minimize swapping?

Just wondering, since on "medium" quality the stations....erm, don't look their best :~)

Thx,
--mcn
Jan 30, 2005 AlienB link
iirc S3TC and all of its offshoots, DXTC and so on, are just what the texture file *is*, i.e. you'd run it through a tool to compress it using S3TC; it's not really the graphics engine doing it. Then again, I usually don't know what I'm talking about. Free bump for an answer.
Jan 31, 2005 mburrack link
S3TC is short for S3 (the video company) Texture Compression. It's a relatively naive but efficient compression method that can encode 16 pixels (a 4x4 block) as 2 bits per pixel plus two 16-bit colors, for a total of 64 bits (compared to 256 bits at raw 16bpp).
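For the curious, the block layout is simple enough to sketch out in C (purely illustrative, not from any real SDK header):

    struct DXT1Block {
        unsigned short color0;   /* endpoint color #1, RGB565 (16 bits) */
        unsigned short color1;   /* endpoint color #2, RGB565 (16 bits) */
        unsigned int   indices;  /* 16 pixels x 2 bits each = 32 bits   */
    };  /* 8 bytes for a 4x4 block, vs. 32 bytes at raw 16bpp */

Each pixel's 2-bit index selects either one of the two endpoint colors or one of two colors interpolated between them.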

S3TC was licensed to Microsoft for use in DirectX, where it was renamed DXTC. However, S3TC is also supported on some graphics cards under OpenGL.

You are correct in that it is a "tool" for compressing the textures and not a feature of the engine per se, although it is up to the engine to decide whether to use S3TC-compressed textures or not. The advantage is, of course, a 4x reduction in texture memory usage, which means you can fit more textures into video memory, or--in my case--fit enough into a 32MB video card that it doesn't cause texture thrashing and blow the framerate to hell.

(Side note, and file this under Too Much Information: You would think that you just can't load more textures than can fit on a video card, and at the most basic level that's correct. However, OpenGL and DirectX both do some rudimentary management. For example, if you try to load, say, 40MB of textures on a 32MB video card (ignoring extra space taken up by other things), the driver will load what it can fit, keep the extra around in system memory, draw what it needs to, and then swap out textures to make room for the remainder it couldn't fit before, so it can draw the stuff that uses those textures. Since it has to do this every frame, you end up swapping textures back and forth between system and video memory *every frame*--a phenomenon called texture thrashing. As you can imagine, it eats up a *lot* of time doing the swaps, so you end up killing the framerate. My question about S3TC support is related because if they didn't support S3TC, adding support would effectively cut the texture usage to 1/4 of what it used to be, or in the example above, from 40MB to 10MB. 10MB fits on a 32MB card, so you no longer get texture thrashing, and your framerate stays nice and healthy.)
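If you want to catch thrashing from the application side, OpenGL does give you a crude tool in glAreTexturesResident. A rough sketch, assuming you keep an array of your texture object IDs around somewhere (this is just an illustration, not anything VO actually does):

    #include <GL/gl.h>   /* <OpenGL/gl.h> on OS X */

    /* Returns how many of the given textures the driver has kicked out of
       video memory. A count that stays above zero frame after frame is a
       decent hint that you're thrashing. (Hypothetical helper.) */
    int count_nonresident(const GLuint *tex_ids, GLsizei n)
    {
        GLboolean resident[1024];   /* assumes n <= 1024 */
        int i, missing = 0;
        if (glAreTexturesResident(n, tex_ids, resident) == GL_TRUE)
            return 0;               /* everything is resident in video memory */
        for (i = 0; i < n; i++)
            if (!resident[i])
                missing++;
        return missing;
    }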

OK, I *seriously* need to get back into the game development biz. I've got *wayyyy* too much energy built up for this stuff....

--mcn
Jan 31, 2005 macguy link
Hey devs, could you implement this?!? Sounds like a good fix for the graphics rendering problems on lower-end Macs (I have a 1.25GHz G4 eMac with an ATI 9200 with 32MB of on-card video memory).
Jan 31, 2005 roguelazer link
They do offer texture compression...
Jan 31, 2005 AlienB link
In Windows under DX, and that's it.
(Heh, I think it's DXTC too)
Feb 01, 2005 mburrack link
If it's DXTC, then it's easily enabled on the OpenGL side on all platforms, since DXTC = S3TC (see my above post).

'Course, it's possible they already support S3TC on OpenGL, at which point my other questions become...well, not more important, since it's just my curiosity, but you get the idea :)

--mcn
Feb 01, 2005 raybondo link
Heh, S3TC/DXTC looks kinda bad, IMHO. We let the driver compress the images, which is part of the reason it looks crappier than if we did it offline: the driver chooses a quick but lower-quality compression.

The DX driver in VO has the option in the menu, but the GL driver doesn't. However, it can be enabled by editing the config.ini file. It is completely untested, so use at your own risk. VO will only use the setting if your video card supports the GL_ARB_texture_compression extension.
Edit the config.ini file, look for tc=0, and change it to tc=1.
See if that helps any.
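For reference, that check is just the usual extension-string lookup; roughly (a sketch, not a paste of the actual engine code):

    #include <string.h>
    #include <GL/gl.h>

    int has_texture_compression(void)
    {
        /* glGetString(GL_EXTENSIONS) returns a space-separated list of
           extension names supported by the driver. */
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext && strstr(ext, "GL_ARB_texture_compression") != NULL;
    }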
Feb 01, 2005 roguelazer link
The texture compression is really ugly on my cards. All the textures get compressed down to 16-bit color, but with much worse dithering than the 16-bit mode. If you need to save space on textures, I'd recommend switching Texture Color Depth to 16-bit; it looks nicer than compressed textures, and it probably saves a good bit of memory.
Feb 03, 2005 mburrack link
Thanks, I'll try the config option when I get home tonight.

rogue: Yeah, it'll cut the usage in half (32bpp to 16bpp). Not as good as the 1/4 of S3TC, but oh well.

raybondo: Just to throw out an idea: at Cyan we compressed textures offline, basically either using the drivers or using the freely available Photoshop plugin, and then decompressed them on the fly if the card didn't support it. Granted, that means the textures look compressed even when they aren't, so maybe keep two copies of the textures on disk, one compressed and one not? That would also let you tweak the compressed versions to look nice offline, and let you pick and choose which textures to compress. It would also seem to make the code path rather easy to switch (texture compression on? great, look *here* for textures instead of *there*). And you wouldn't have to worry about compressing them in-game, which would probably help with compatibility issues in both drivers (there are DX and OGL drivers that support S3TC as a source format but not as a destination format).

But yes, the standard DX method of compressing textures is way crappy. Although, to be fair, it's not exactly *easy* to do (I tried myself once. Technically, it's a simple compression scheme. The trick is finding the right two key colors for every 4x4 block, and *that* is unbelievably hard...) Regardless, it would seem a bad idea to throw away S3TC support altogether, since it's not the format, but the method of compression you're using, that makes the textures look bad.

Not that I think it's a terribly high priority item or anything. Just wanted to throw out my 2 cents :) Although in-game texture usage counters similar to the FPS counters we already have sure would be nice... :) :) :)

--mcn
Feb 04, 2005 mburrack link
Update: I tried the setting and Vendetta crashed about 2/3rds of the way through loading. A very hard crash and lock, too.

Some debugging info at this point would be helpful; I'd be very interested to know what error code, if any, is returned from the call to glTexImage2D that creates the compressed texture. Also, does the code check that the compressed texture format being compressed to is actually supported by the driver? (See http://www.opengl.org/discussion_boards/cgi_directory/ultimatebb.cgi?ubb=get_topic;f=3;t=012453 for more detail.) If you want me to do anything to help track down this issue--run a special build with extra logging, etc.--I'd be more than happy to help.

I also submitted the bug through the Vendetta Crash Reporter, although I don't know if it worked or not...

Thanks,
-mcn
Feb 04, 2005 raybondo link
Ok, I got the bug report you submitted, mburrack.

The code does check for gl_ARB_texture_compression.
I set the internal format hint to either GL_COMPRESSED_RGBA_ARB or GL_COMPRESSED_RGB_ARB (depending on whether the texture has alpha) for glTexImage2D.
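Roughly, the upload looks like this (a sketch, not the actual engine source; the function and parameter names are just placeholders):

    #include <GL/gl.h>
    #include <GL/glext.h>   /* GL_COMPRESSED_* constants from GL_ARB_texture_compression */

    void upload_texture(int has_alpha, int width, int height, const unsigned char *pixels)
    {
        GLint internal = has_alpha ? GL_COMPRESSED_RGBA_ARB : GL_COMPRESSED_RGB_ARB;
        glTexImage2D(GL_TEXTURE_2D, 0, internal, width, height, 0,
                     has_alpha ? GL_RGBA : GL_RGB, GL_UNSIGNED_BYTE, pixels);

        /* The driver can then be asked whether it actually compressed the texture: */
        GLint was_compressed = GL_FALSE;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED_ARB,
                                 &was_compressed);
    }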

The problem with having both compressed and uncompressed versions of the textures is that, since we are distributing this online, any additional textures increase download time. Keeping only the compressed texture instead of the uncompressed one will, as you said, make even non-compressed textures look compressed.
We distribute (most of) the textures as jpgs, and changing them to S3TC would make them bigger and therefore make the initial download of VO larger.
I check the return codes of glTexImage2D but it looks like VO crashed deep inside OS X's OpenGL drivers.

Anyways, I did say use at your own risk. I'll try it out on our mac and see if it crashes on Friday.
Feb 04, 2005 mburrack link
lol yeah I know, but hey, debugging isn't ever without risks :)

Did it crash b/c of the glTexImage2D call? If so, perhaps I could make a small test app with whatever texture is in question (if you know which it is) and I could submit it to Apple as a bug report.

Also, do you try to load any mipmap levels below 4x4? I've seen drivers on the Windows side that hate that, and then some drivers that deal with them perfectly fine, so I figured I'd ask.

Hmm, now I might have to go find a color expert and see if I can figure out how to construct a better S3TC compressor...all ya gotta do is pick the best two colors in a 4x4 block...can't be that hard, right? :)
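To show just how naive a first stab can be (hypothetical sketch, nowhere near what the good offline tools do): take the darkest and brightest pixels in the block as the two endpoints and let the 2-bit indices interpolate between them.

    typedef struct { unsigned char r, g, b; } RGB;

    /* Pick the two DXT1 endpoint colors for one 4x4 block the dumb way:
       darkest and brightest pixel by crude luminance. */
    void pick_endpoints_naive(const RGB block[16], RGB *c0, RGB *c1)
    {
        int i, lum, lo = 256 * 3, hi = -1, lo_idx = 0, hi_idx = 0;
        for (i = 0; i < 16; i++) {
            lum = block[i].r + block[i].g + block[i].b;
            if (lum < lo) { lo = lum; lo_idx = i; }
            if (lum > hi) { hi = lum; hi_idx = i; }
        }
        *c0 = block[lo_idx];
        *c1 = block[hi_idx];
    }

A real compressor does something more like fitting a line through the block's colors (the principal axis) and then refining the endpoints around it--which is exactly the part that's unbelievably hard to get looking good.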

--mcn
Feb 04, 2005 tramshed link
Does that apply to Linux also? If so, maybe I can squeeze a few more fps out.
Feb 04, 2005 mburrack link
Assuming you have a card whose Linux drivers support it, it should work. Worth trying, at least :) Might help in debugging too to know if it works under Linux.

As a tip, though, you might want to check to see if the drivers support the extension first. I believe the openglinfo.log or something like that in the Vendetta directory includes a printout of all the extensions as reported by the driver.

EDIT: Note that it won't alter your FPS if you already had enough texture memory to fit all the textures. However, if you had too little texture memory (as I did) for some sectors' textures to completely fit, it could *dramatically* improve your framerate. So if/when you try it, make sure to go into a sector where you always get very low FPS (if there is one) and see if it's improved there.

--mcn
Feb 04, 2005 roguelazer link
Just type glxinfo to find out if your card supports it. Might be fglrxinfo with the ATI binary driver--they use their own prefixed versions of the glx tools.
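For example, something like glxinfo | grep texture_compression should print the line containing GL_ARB_texture_compression if the driver exposes it.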
Feb 04, 2005 raybondo link
The game is crashing on the Mac for me, too. It is crashing inside the OS's OpenGL driver, deep in glCopyTexSubImage2D, when I generate the background. I render the background to an offscreen buffer and then copy it to a texture. Apparently copying it to a compressed texture is bad.
Try setting your background detail to 'Off'. Maybe that will keep VO from crashing.
I fixed VO so it doesn't use compressed textures for render-targeted textures and that solved the problem, but you'll have to wait for the next patch to get it.
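The gist of the fix, sketched out (illustrative only, not the engine source; names are placeholders):

    #include <GL/gl.h>
    #include <GL/glext.h>   /* for GL_COMPRESSED_RGBA_ARB */

    /* Allocate the background texture; render-targeted textures get an
       uncompressed internal format no matter what tc is set to. */
    void alloc_background_texture(int is_render_target, int width, int height)
    {
        GLint internal = is_render_target ? GL_RGBA8 : GL_COMPRESSED_RGBA_ARB;
        glTexImage2D(GL_TEXTURE_2D, 0, internal, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);  /* NULL just allocates storage */
    }

    /* Each frame, after the background is drawn offscreen, copy it into the
       currently bound texture. Copying into a compressed texture is what was
       blowing up in the Mac driver. */
    void copy_background_to_texture(int width, int height)
    {
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
    }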
Feb 04, 2005 mburrack link
Strange that that would be the trick, although it explains why it gets so far into loading before crashing. I'll see if I can whip up a test case at home this weekend to submit to Apple, too. 'Twould be nice to get it fixed :)

Stupid ATI drivers...

--mcn
Feb 06, 2005 KixKizzle link
Sooooo how many textures does VO currently have? I have a 256MB card. Would using texture compression slow down my fps? I'll try it later tonight. I'm just trying to get better fps around asteroids. I've got a GeForce 6600 and the asteroids cut the frames in half: 85/2=42 on average. This is with graphics turned all the way up, though... Any way I can increase frames WITHOUT reducing my settings :) I'm greedy I know...

/givemoney Devs 2c
Feb 07, 2005 tramshed link
The texture compression works fine on x86-64 gentoo with an nvidia gf4mx440, im assuming itll probably work fine on any nvidia card, might want to consider adding the option to the menus in linux if you have an nvidia. Ati, who knows, ive got a system ill toss a radeon in sometime soon to test it out. Thatll be using the kernel driver though, ati's proprietary drivers more than likely will crash n burn, but who knows.