A better way to handle texture scale/cache
06-21-2014, 07:52 AM (This post was last modified: 06-21-2014 07:56 AM by RadarNyan.)
Post: #7
RE: A better way to handle texture scale/cache
What if we keep a copy of the original (unprocessed) texture data, in its full power-of-two size, when we do the processing (scaling, etc.)?

Then, when sizeInRAM changes, we compare the hash of the current texture data (over sizeInRAM bytes) against the hash of the old copy (also over sizeInRAM bytes). If they are identical, we just use the cached texture instead of reloading it.

This way we could avoid uploading the same texture too frequently, and so avoid the terrible performance drop caused by reloading textures.
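Something like this is what I have in mind. Just a rough sketch, not actual emulator code: the HashBytes helper and the struct layout are made up for illustration, and the real texture cache would use whatever hash function it already has.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Placeholder hash (FNV-1a) -- stands in for whatever fast hash
// the texture cache already uses.
static uint32_t HashBytes(const uint8_t *data, size_t size) {
    uint32_t h = 2166136261u;
    for (size_t i = 0; i < size; ++i) {
        h = (h ^ data[i]) * 16777619u;
    }
    return h;
}

// Hypothetical cache entry: keeps a copy of the *unprocessed*
// power-of-two texture data alongside the processed/scaled result.
struct TexCacheEntry {
    std::vector<uint8_t> originalCopy;  // full power-of-two original data
    // ... handle to the uploaded/scaled GPU texture would live here ...
};

// Returns true if the texture currently in RAM still matches our stored
// copy over the (possibly changed) sizeInRAM, i.e. we can keep using the
// cached scaled texture instead of reloading and rescaling it.
bool CanReuseCachedTexture(const TexCacheEntry &entry,
                           const uint8_t *currentData, size_t sizeInRAM) {
    if (sizeInRAM > entry.originalCopy.size())
        return false;  // new size exceeds what we copied, must reload
    uint32_t currentHash = HashBytes(currentData, sizeInRAM);
    uint32_t storedHash  = HashBytes(entry.originalCopy.data(), sizeInRAM);
    return currentHash == storedHash;
}
```

Both hashes are recomputed over the new sizeInRAM, since the size that changed is exactly what we want to compare against the stored copy.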

But... since hashing would become much more frequent than with the current method, would it cause a performance problem? (I guess not?)

It wouldn't take too much extra memory to keep a copy of the original data, right? I mean, if we can keep 4x-scaled textures in memory (which I assume are at least three times bigger than the original texture), surely keeping just double the original size wouldn't be a problem, right?