Sprite compression

Posted on May 28, 2011

Hello!

I was hoping to discuss our new title announcement in today’s blog, but unfortunately we’re waiting on the app review, so we’ll be announcing early next week instead.

Credit for the topic of today’s blog goes to our awesome long-serving coder Phil Jones (@logicstorm), who did the work on the system, ran some brief tests and produced the graphs for the team.

Our internal 2D framework is predictably focused around sprites. We have a sprite bank tool that imports various image files (both single images and series of images making up animations), then analyses, splits and packs them.

The build process

  • Removes blank space from each sprite (and stores an offset to compensate for the removed space)
  • Breaks each sprite down into cells (the cell size varies per platform but is likely 64×64)
  • Stores each of these cells in sprite pages (the page size again varies, but is 256×256 on iPhone)
  • Stores cells that are identical to ones elsewhere in the sprite bank only once (so an image of flat colour repeated elsewhere, or a static part of an animated image, won’t be duplicated in the pages)
  • For each frame of a sprite animation (or for a static sprite), stores a list of the cells that make up that sprite and the offsets to render them at (see the sketch after this list)
  • Compresses each of the pages and stores them to file
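
A rough sketch of the de-duplication and frame storage in C++ follows; the names (CellPixels, internCell, CellRef, Frame) are ours for illustration and not the actual tool’s API:

    #include <cstdint>
    #include <map>
    #include <vector>

    // One cell's raw pixels (e.g. 64*64*4 bytes of RGBA).
    using CellPixels = std::vector<uint8_t>;

    // Returns an index into the shared cell list; identical cells are stored once.
    uint32_t internCell(std::map<CellPixels, uint32_t>& seen,
                        std::vector<CellPixels>& stored,
                        const CellPixels& cell)
    {
        auto it = seen.find(cell);
        if (it != seen.end())
            return it->second;                      // duplicate: reuse the existing cell
        uint32_t index = static_cast<uint32_t>(stored.size());
        stored.push_back(cell);
        seen.emplace(cell, index);
        return index;
    }

    // Each static sprite or animation frame then stores a list of cell
    // indices plus the offsets to render them at.
    struct CellRef { uint32_t cellIndex; int16_t offsetX, offsetY; };
    struct Frame   { std::vector<CellRef> cells; };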


This system has served us very well over the years; it helped us package lots of assets into some of our NDS games while streaming and decoding more complex animations in real time.

The original compression method we used was LZSS, which decoded very fast and gave us pretty good results on the NDS.
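
For reference, LZSS decoding is just a stream of literal bytes and back-references into the already-decoded output, which is why it’s so quick. Below is a minimal decoder sketch assuming a common flag-byte layout (one flag bit per item; 12-bit offset, 4-bit length pairs); our actual sprite bank format may pack the bits differently:

    #include <cstdint>
    #include <vector>

    // Minimal LZSS decode: a flag byte precedes each group of 8 items; a set
    // bit means a literal byte, a clear bit means a 16-bit offset/length
    // back-reference into the output produced so far. No handling of
    // malformed input, for brevity.
    std::vector<uint8_t> lzssDecode(const uint8_t* src, size_t srcLen, size_t outLen)
    {
        std::vector<uint8_t> out;
        out.reserve(outLen);
        size_t i = 0;
        while (i < srcLen && out.size() < outLen)
        {
            uint8_t flags = src[i++];
            for (int bit = 0; bit < 8 && i < srcLen && out.size() < outLen; ++bit)
            {
                if (flags & (1u << bit))
                {
                    out.push_back(src[i++]);                   // literal byte
                }
                else
                {
                    uint16_t pair = static_cast<uint16_t>(src[i] | (src[i + 1] << 8));
                    i += 2;
                    size_t offset = (pair >> 4) + 1;           // 12-bit offset
                    size_t length = (pair & 0x0F) + 3;         // 4-bit length, minimum match of 3
                    size_t start  = out.size() - offset;
                    for (size_t k = 0; k < length; ++k)
                        out.push_back(out[start + k]);         // copies may self-overlap
                }
            }
        }
        return out;
    }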

We still use the same sprite framework pretty successfully on our current 2D games. However, we’ve needed to get the size of our sprite banks down a bit further, and with the increased CPU power now available, Phil investigated JPEG, zlib and PNG, plus combinations of these (note: no actual PNG tests were done, but we compare against the source asset in PNG form).

The only change we need to make is applying a different compression method in our sprite bank tool (we already supported uncompressed data, so it’s just a case of storing the type of compression used for each sprite page). When a sprite is used at runtime, we look up the sprite page needed, load it from file, decompress it to RAM and then pass it to GL/DirectX for use.
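
A minimal sketch of that runtime path, with hypothetical names (Compression, PageHeader, loadSpritePage); only the uncompressed and zlib cases are shown, and the GL upload stands in for the DirectX equivalent:

    #include <cstdint>
    #include <vector>
    #include <zlib.h>
    #include <GL/gl.h>

    enum class Compression : uint8_t { None, LZSS, Zlib, Jpeg };

    struct PageHeader { Compression type; int width, height; };

    GLuint loadSpritePage(const PageHeader& hdr, const std::vector<uint8_t>& fileData)
    {
        std::vector<uint8_t> pixels(static_cast<size_t>(hdr.width) * hdr.height * 4);
        switch (hdr.type)
        {
            case Compression::None:
                pixels = fileData;                             // stored raw, just copy
                break;
            case Compression::Zlib: {
                uLongf outLen = pixels.size();
                uncompress(pixels.data(), &outLen, fileData.data(),
                           static_cast<uLong>(fileData.size()));
                break;
            }
            default:
                break;                                         // LZSS / JPEG paths omitted here
        }
        GLuint tex = 0;                                        // upload the RGBA page to GL
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, hdr.width, hdr.height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
        return tex;
    }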

As mentioned, most of the testing was done on a single large image file. The graph below shows the compressed sizes of the sprite banks versus the source PNG. For the alpha channel in the JPEG encoding we tried two methods: PNG-stored alpha and zlib-compressed alpha.

We also produced some data across a full project, showing that these results are pretty similar on a more varied data set.

JPEG is obviously lossy, unlike the other methods we used; the graphs above were produced at a 75% quality level. We did some testing at various quality levels to see the JPEG artefacts introduced versus the source.

Below 80% the results aren’t great for a high-quality asset, and we’d rather not introduce these errors into our very pretty artwork.

Let’s have a look at how quality compares to file size, to see how much tweaking the level could improve things.

Decoding time is very important to us (though it can be hidden by precaching during loading screens or streaming). The impressive speed of Sean Barrett’s stb_image versus libjpeg/libpng is shown in this chart. It also suggests that perhaps we should have considered zlib in our sprite system before now (though the NDS might have a different performance profile for zlib versus LZSS).

Across both the size and decode-speed results, JPEG RGB + zlib alpha is a massive improvement over our current LZSS. We’re looking at using an 85%-90% JPEG compression quality level for final game assets.
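
For completeness, here’s roughly how the JPEG RGB + zlib alpha combination could be recombined at load time, using Sean Barrett’s stb_image for the JPEG and zlib for the alpha plane; the function name, the two-blob layout and the minimal error handling are our assumptions:

    #include <cstdint>
    #include <vector>
    #include <zlib.h>
    #define STB_IMAGE_IMPLEMENTATION
    #include "stb_image.h"

    // Decodes a JPEG blob (RGB) plus a zlib blob (8-bit alpha plane) and
    // interleaves them into RGBA ready for upload.
    std::vector<uint8_t> decodeJpegZlibAlpha(const std::vector<uint8_t>& jpegData,
                                             const std::vector<uint8_t>& alphaData)
    {
        int w = 0, h = 0, comp = 0;
        unsigned char* rgb = stbi_load_from_memory(jpegData.data(),
                                                   static_cast<int>(jpegData.size()),
                                                   &w, &h, &comp, 3);  // force 3-channel RGB
        if (!rgb)
            return {};                                         // decode failed

        std::vector<uint8_t> alpha(static_cast<size_t>(w) * h);
        uLongf alphaLen = alpha.size();
        uncompress(alpha.data(), &alphaLen, alphaData.data(),
                   static_cast<uLong>(alphaData.size()));

        std::vector<uint8_t> rgba(static_cast<size_t>(w) * h * 4);
        for (size_t i = 0; i < static_cast<size_t>(w) * h; ++i)
        {
            rgba[i * 4 + 0] = rgb[i * 3 + 0];
            rgba[i * 4 + 1] = rgb[i * 3 + 1];
            rgba[i * 4 + 2] = rgb[i * 3 + 2];
            rgba[i * 4 + 3] = alpha[i];                        // zlib-decoded alpha
        }
        stbi_image_free(rgb);
        return rgba;
    }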


Further things to try

  • We didn’t try any of the pngcrush tools. We’ve used pngcrush before as a separate process, and it gained 30-40% savings on some alpha PNG images. We could also have tried it on a pure-PNG sprite bank storage method, which should beat the source PNG assets by a fair bit.
  • Vector assets! We’re aiming to move away from purely bitmap assets before much longer, especially for some of the artwork on current projects (which would adapt well to vectors).


3 Responses

  1. Steven
    May 30, 2011

    Just a comment about “quality”: the quality factor in JPEG (in addition to being implementation-specific, though the Independent JPEG Group’s implementation seems to be the de facto standard) is not a reliable estimate of perceived quality. 50% quality doesn’t mean twice as bad as 100%, or twice as good as 25%. The bit-rate curve (the “JPEG Quality vs File Size” curve) probably maps the quality factor in % to perceived quality more accurately.

    To be fair to lossless compression methods like PNG from the end-user point of view, the comparison should then be with JPEG at a perceptually lossless setting (85%-90%?).

    Of course, “end-user point of view” means getting people to compare on the device screen, which may reveal something else completely.


  2. admin
    May 30, 2011

    A very valid point regarding the non-linear scale; as the quality factor grid in the article shows, anything below 70% looks ‘a bit pants’ (technical term!).

    We produced our sprite banks at various settings to test out how many of the team could spot compression issues 🙂


  3. Steven
    May 30, 2011

    In a reference-less test, on my screen, I would not be able to distinguish the 90% from the 100% (that is, I could not say the 90% is lossy), while the 80% shows some ringing artifacts (the “bit pantsiness”, to continue your technical terminology 😉). On an NDS, I wouldn’t know; I don’t have one.

