Duplicate texture dump with garbage data
11-27-2020, 10:42 AM
(This post was last modified: 01-14-2021 12:27 PM by BanhCanh.)
Hello,
I've been trying to upscale some textures of Growlanser: Wayfarer of Time, but the dumper produces lots of duplicates no matter the settings. So here is my question: is there a definite number of duplicate textures with garbage data?

What I've tried: -ignoreAddress -reducedHash -xxh64 -hashranges -hashes

The thing is, some textures actually occupy only the first 1/3 of the dump (meaning reducedHash won't have as much effect on them). I can't use hashranges because those textures also appear at multiple addresses (and from what I understand, I would have to disable ignoreAddress to use hashranges effectively).

So I settled on xxh64 + ignoreAddress + reducedHash (because it actually works on most textures), and to handle the rest I wrote a little script that auto-feeds textures.ini (linking all the images' hashes to a single .png) whenever it detects images whose first 1/3 is identical (using ImageMagick), and then removes the duplicate images. It may not be ideal, but so far it seems to work well and is fast enough.

The reason I'm still asking is that I'm wondering whether *new* images with garbage data will keep being dumped forever (with the garbage data always changing), or whether it eventually stops, making it possible to link them all to the same .png under [hashes] in textures.ini. In my tests it stops after around ~150 duplicates, so the .ini doesn't grow any more, but the count varies depending on the texture, so I can't be sure.

Please do tell me if I'm not making myself clear; English isn't my native language. Thanks.

RE-EDIT: If anyone is passing by: I ended up adding settings for textures.ini to my custom PPSSPP build so I can specify a reduceHashSize (different from the 0.5 default) for chosen dimensions (also set in the .ini). All dumps matching those conditions get a prefix, e.g. 128x256_addresscluthash.png, which should prevent hash-name/filename collisions with dumps that don't match, I guess? I'm unsure, but it seems to hold so far.
In the ini it looks like this:

    [reducehashrange]
    ;; syntax: w,h = reduceHashSize
    128,256 = 0.3125

EDIT2: I've just attached the two files I modified before rebuilding. I imitated how [hashranges] works to add my settings. I believe both can work together (hashranges to resize the dump, then reducehashrange to reduce the portion of the "new" dump that is hashed). I'm just hoping someone more knowledgeable does something similar or better.

EDIT3: I AI-upscaled the whole game (8K+ textures) and didn't notice any weird issues (I re-did a full playthrough). Some textures are still duplicated, but that's another issue (the exact same texture, if I trust ImageMagick, with the same CLUT but a different hash). Those are relatively rare, so I just added them to the [hashes] section.