Virtual texture coordinates and offsets.


So I've been delving into virtual texturing, and there's one thing I really don't get. I've watched the Sean Barrett talk and read many papers, but I just can't seem to get it.

For example, we have a picture and UVs to draw it on a mesh. I tile the picture into 16x16 tiles and make an indirection texture that is 16x16 pixels. According to the talk, I just look into the indirection texture using the UVs and voilà, I get a corner of my physical texture in memory.
That's all fine and dandy, but I can't understand for the life of me how you calculate the offset (I've sketched what I think the lookup looks like below). I don't have the x and y of the page stored in my UV coordinates; they don't come like that.
Do I first have to render the scene out with the UVs provided with the mesh and then translate those into my own virtual texture UVs in a different form, or do they just come like that?
What even are virtual texture coordinates, or am I missing something?
As far as I've understood:
Virtual texture - basically the texture you have on disk that you tiled.
Physical texture - the tiles that reside in GPU memory.
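
Here's roughly the lookup I've pieced together from the talk, written as a C++-ish sketch (all the names and tile counts are placeholders I made up). Is the offset just the fractional part of the UV inside the page like this, or am I misunderstanding?

```cpp
#include <algorithm>
#include <cstdint>

// Sketch of what I think the shader-side lookup does; names and tile counts are made up.
struct IndirectionEntry { uint16_t physPageX, physPageY; }; // where this virtual page lives in the cache

struct UV { float u, v; };

constexpr int kVirtualPages  = 16; // virtual texture tiled into 16x16 pages
constexpr int kPhysicalPages = 8;  // physical cache holds e.g. 8x8 pages

UV VirtualToPhysical(UV virtUV, const IndirectionEntry table[kVirtualPages][kVirtualPages])
{
    // Which virtual page does this UV fall into? (clamped so u/v == 1.0 stays in range)
    int pageX = std::min(static_cast<int>(virtUV.u * kVirtualPages), kVirtualPages - 1);
    int pageY = std::min(static_cast<int>(virtUV.v * kVirtualPages), kVirtualPages - 1);

    // Fractional position inside that page (0..1 across one tile) -- is this the "offset"?
    float fracX = virtUV.u * kVirtualPages - pageX;
    float fracY = virtUV.v * kVirtualPages - pageY;

    // The indirection entry gives the corner of the physical page holding this tile.
    const IndirectionEntry& e = table[pageY][pageX];

    // Physical UV = corner of the physical page + position within the page.
    return { (e.physPageX + fracX) / kPhysicalPages,
             (e.physPageY + fracY) / kPhysicalPages };
}
```

As I understand it, the real thing happens in the pixel shader with a point-sampled indirection texture and also has to deal with mip levels and tile borders, but is that the basic math?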


zdeafz said:
Do I first have to render the scene out with the UVs provided with the mesh and then translate those into my own virtual texture UVs in a different form, or do they just come like that?

You have a pre-processing tool which packs the many textures into a huge virtual texture, and this tool also transforms the mesh's UVs from the initial UVs to the virtual UVs.
So the problem of translating UVs is already solved when the game reads its data.

That's at least what I have assumed, not having experience with virtual texturing.
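
Conceptually I'd expect that bake-time remap to be just a per-texture scale and offset into the big atlas, something like this rough sketch (the names are made up, and a real tool also has to worry about padding, borders and mips):

```cpp
// Rough sketch of the bake-time UV remap; the struct and names are made up.
struct AtlasRect { float offsetU, offsetV, scaleU, scaleV; }; // where one source texture was packed

struct UV { float u, v; };

// The original UV addresses one source texture in 0..1; the virtual UV addresses
// the same texel inside the giant packed virtual texture.
UV ToVirtualUV(UV meshUV, const AtlasRect& placement)
{
    return { placement.offsetU + meshUV.u * placement.scaleU,
             placement.offsetV + meshUV.v * placement.scaleV };
}
```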

Yes, as JoeJ said, you need to form some kind of giant “atlas” that contains all of the texture data that forms the full virtual texture. In some cases this is straightforward. For example, the most common use case for virtual textures is probably terrain: the terrain already has an implicit rectangular layout, so you just make your virtual texture coordinates match the same coordinates you use to sample the terrain height map.

With arbitrary meshes it's more complicated. You basically have the same issue you have with lightmap UVs, where you now need to form an atlas that assigns unique UV charts to all of your meshes. If you've never done this before, you can consider integrating xatlas to generate that unique parameterization for you. The UVs you get back from xatlas can then be used directly as your virtual texture UVs.
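
Just to illustrate the terrain case: the mapping there can be as trivial as reusing the normalized planar terrain coordinate. A hypothetical sketch (names and parameters are placeholders):

```cpp
// Hypothetical terrain example: the virtual UV is just the normalized planar
// terrain coordinate, i.e. the same coordinate used to sample the height map.
struct UV { float u, v; };

UV TerrainVirtualUV(float worldX, float worldZ,
                    float terrainMinX, float terrainMinZ, float terrainSize)
{
    return { (worldX - terrainMinX) / terrainSize,
             (worldZ - terrainMinZ) / terrainSize };
}
```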

Thank you so much for the replies!
@MJP as for xatlas, I'll look into it.

