The SHA file contains all the graphical elements used by Jill of the Jungle, grouped into tilesets. Each image is stored in 8-bit linear VGA format (one byte for one pixel) along with an optional colour mapping table to convert the colour codes into a reduced range suitable for display in EGA and CGA video modes.
The file has no signature; however, the array of tile offsets can be read and checked to ensure each value is within range (and doesn't point past the end of the file). Likewise, parsing the rest of the fields and checking that they are in range will ensure only valid files are read.
|UINT32LE offsets[128]||Offset of each tileset|
|UINT16LE sizes[128]||Size of each tileset|
The file starts with an array of 128 32-bit unsigned integers. Only the first half of these contain actual values. The second half are zero/unused. These numbers represent the position of the first byte of each tile set in the file. If the offset and size are both zero, that entry is unused.
The file continues with an array of 128 16-bit unsigned shorts. Again, only the first 64 contain values. These numbers represent the length, in bytes, of each respective tile set.
The first tile set object starts directly following this table at byte 768, i.e. ((128 * 4) + (128 * 2)), as indicated by one of the offsets in the table itself.
It is important to note that these values are stored little-endian. This means that, for example, the offset 768 is stored as the four bytes 00-03-00-00, not as 00-00-03-00. This applies to both the offsets and the sizes.
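Under the layout described above, reading the offset and size tables can be sketched as follows. Python is used for illustration, and the function name is our own:

```python
import struct

def read_sha_header(data: bytes):
    """Return a list of (offset, size) pairs, one per tileset slot."""
    offsets = struct.unpack_from('<128I', data, 0)        # 128 x UINT32LE
    sizes = struct.unpack_from('<128H', data, 128 * 4)    # 128 x UINT16LE
    # An entry with both offset and size equal to zero is unused;
    # only the first 64 slots normally contain values.
    return list(zip(offsets, sizes))
```

The `<` prefix in the `struct` format strings selects little-endian byte order, matching the note above. The first tileset then starts at byte 768, i.e. (128 * 4) + (128 * 2), directly after both tables.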
At the offset given in the header, each tileset begins with the following structure.
|UINT8 numShapes||Number of tiles in the tile set|
|UINT16LE numRots||Doesn't seem to have any use and is generally 1 (name taken from Xargon source)|
|UINT16LE lenCGA||How many bytes of memory will be used for data in the respective video mode after decompression|
|UINT8 numColourBits||Bit depth of colour map (see below)|
|UINT16LE flags||One or more values defining how the data should be treated:|
0x0001 = SHM_FONTF (font)
0x0002 = unused
0x0004 = SHM_BLFLAG (level tile set)
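Assuming the fields are stored consecutively in exactly the order listed above, parsing one tileset header can be sketched like this (the function name and dictionary keys are illustrative):

```python
import struct

SHM_FONTF = 0x0001   # font; no colour map present
SHM_BLFLAG = 0x0004  # level tile set

def read_tileset_header(data: bytes, offset: int):
    """Parse the tileset header fields at the given file offset."""
    numShapes, numRots, lenCGA, numColourBits, flags = \
        struct.unpack_from('<BHHBH', data, offset)
    return {
        'numShapes': numShapes,
        'numRots': numRots,
        'lenCGA': lenCGA,
        'numColourBits': numColourBits,
        'flags': flags,
        'headerLen': struct.calcsize('<BHHBH'),  # 8 bytes
    }
```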
If the SHM_FONTF flag is set, the data is typically a font and there is no colour map present (so a byte value of 1 will refer to colour #1 in CGA, EGA and VGA modes).
Likewise, if numColourBits is 8, there is no colour map either. It is currently unknown how these images are displayed in EGA and CGA modes.
Otherwise, the structure above is immediately followed by a colour map (see below). The size of the colour map is calculated from the numColourBits field:
length = (1 << numColourBits) * 4
If the colour map is present, it is used to reduce the 8-bit (256 colour) images down to EGA (16 colour) and CGA (four colour) depths.
The numColourBits field controls how many entries are in the colour map:
entries = 1 << numColourBits
Each entry is four bytes long, representing CGA, EGA and VGA, respectively, and then an unused zero value. Each byte maps to the index of a colour in the palette. If there is no colour map, the bytes stored in the image directly correlate with the game's palette.
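Reading the colour map under the layout just described can be sketched as follows (the function name is our own):

```python
def read_colour_map(data: bytes, offset: int, numColourBits: int):
    """Read (1 << numColourBits) four-byte entries; return the map and
    its length in bytes."""
    entries = 1 << numColourBits
    cmap = []
    for i in range(entries):
        # Each entry: CGA index, EGA index, VGA index, unused zero.
        cga, ega, vga, _unused = data[offset + i * 4 : offset + i * 4 + 4]
        cmap.append((cga, ega, vga))
    return cmap, entries * 4
```

The returned length (`entries * 4`) matches the `length = (1 << numColourBits) * 4` formula above, and tells the reader where the tile data begins.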
Following the colour map is the array of tile objects. Each tile has a three byte header.
|UINT8 width||Image width in bytes/pixels|
|UINT8 height||Image height in bytes/pixels|
|UINT8 type||Data format|
The image data follows, with the exact format dependent on the type value.
Jill of the Jungle uses the earliest known version of the engine, which supports only the BYTE type. Kiloblaster uses a later version of the engine, which supports at least PLAIN as well. It is likely that these compression features were planned from the start but not implemented until after a few releases of the engine.
Type 0: BYTE
This format is raw 8bpp - each byte represents a pixel. The first byte is the pixel at (0,0) and the last byte is the pixel at (w-1,h-1). The byte's value is transformed through the colour map (if one is present), then it becomes an index into the game's palette.
The length of the image data (in bytes) is calculated as width * height.
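Decoding a BYTE tile into rows of palette indices, optionally remapping through the colour map for a given video mode, can be sketched like this (names are illustrative):

```python
def decode_byte_tile(data: bytes, offset: int, width: int, height: int,
                     colour_map=None, mode=2):
    """Decode a type-0 (BYTE) tile. mode selects the colour map column:
    0 = CGA, 1 = EGA, 2 = VGA."""
    pixels = []
    for y in range(height):
        # One byte per pixel, rows stored top to bottom.
        row = list(data[offset + y * width : offset + (y + 1) * width])
        if colour_map is not None:
            # Remap each byte through the colour map for the target mode.
            row = [colour_map[p][mode] for p in row]
        pixels.append(row)
    return pixels  # pixels[y][x] is a palette index
```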
Type 1: PLAIN
Pixels are stored in their raw form at the given colour depth. At 4 bits per pixel each byte holds two pixels, and at 2 bits per pixel each byte holds four pixels.
The tile lengths (in bytes) shrink accordingly: at 4bpp (two pixels per byte) a tile takes half as many bytes as at 8bpp (one pixel per byte).
Leftover bits at the end of a line are never used to store the start of the next line, so each 4-bit and 2-bit line is always rounded up to the next full byte. This full data length per line is called the 'stride'. To calculate the stride, the following formula can be used:
stride = ((width * bits_per_pixel) + 7) / 8
To calculate the full tile size, simply multiply the stride with the height:
size_in_bytes = stride * height
Type 2: RLE
This type is apparently unused.
While it is unknown whether this feature is available in Jill of the Jungle, Xargon can read certain tiles as palettes. If a tile is flagged as 8-bit colour and is 64x12 pixels in size, its 768-byte content is treated as a 6-bit VGA palette. For the palette to be used, it must be "requested" by the game in some manner (e.g. by loading an image for a map file). Instead of reading the tile as a standard image, the game immediately installs it as the new palette. This means the palette switch occurs when the tile is first loaded, such as when entering a level, rather than when the tile is actually displayed on-screen. It also means it is not possible to store a genuine 256-colour image tile of 64x12 pixels; if needed, this limitation could be worked around by using a colour map and an image of 128 colours or fewer.
It is currently unknown whether this feature is used in Xargon, but if it is, all 256 colours are set. The feature is used in Kiloblaster (to set the palette used for the VGA level backdrops), but of the 256 colours in the tile, only colours 15 to 240 (inclusive) are set. The other colours in the palette are left unchanged.
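Recognising a palette tile and converting its 6-bit DAC values to 8-bit RGB can be sketched like this. The shift-and-fill scaling used here is a common convention for 6-bit VGA components, not something taken from the game itself:

```python
def tile_is_palette(numColourBits: int, width: int, height: int) -> bool:
    """True if a tile matches the palette criteria described above."""
    return numColourBits == 8 and width == 64 and height == 12

def palette_from_tile(body: bytes):
    """Convert a 768-byte tile body (256 x 3 components, each 0..63)
    into 256 (r, g, b) tuples with 8-bit components."""
    assert len(body) == 768
    pal = []
    for i in range(0, 768, 3):
        r, g, b = body[i:i + 3]
        # Scale 6-bit (0..63) to 8-bit (0..255): shift left and fill
        # the low bits with the top bits, so 63 maps to 255.
        pal.append(tuple((c << 2) | (c >> 4) for c in (r, g, b)))
    return pal
```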
It has not yet been confirmed whether Jill of the Jungle can do this too.