It's a trick from the old 8-bit gaming world. You encode text in 5-bit chunks
with a leading length marker. Five bits is enough for a-z plus a few codes of
punctuation, a 'capital' code that also implies a preceding space, and an
'escape' code introducing an 8-bit sequence block.
Gets you a bit under 40% compression on real-life data, and the decoder takes
about 200 bytes.
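
Roughly what such a decoder looks like, as a minimal sketch in C. The bit
order, the punctuation codes and the single-byte escape below are assumptions
for illustration, not the original tables:

#include <ctype.h>
#include <stdio.h>

/* Pull n bits, MSB-first, from buf starting at bit position *pos. */
static unsigned getbits(const unsigned char *buf, unsigned *pos, int n)
{
    unsigned v = 0;
    while (n--) {
        v = (v << 1) | ((buf[*pos >> 3] >> (7 - (*pos & 7))) & 1);
        (*pos)++;
    }
    return v;
}

/* Pack the low n bits of v, MSB-first, into buf at bit position *pos. */
static void putbits(unsigned char *buf, unsigned *pos, unsigned v, int n)
{
    while (n--) {
        if ((v >> n) & 1)
            buf[*pos >> 3] |= 0x80 >> (*pos & 7);
        (*pos)++;
    }
}

/*
 * Decode a packed string.  The code assignments here are assumed:
 *   0      escape: the next 8 bits are one literal byte (the real scheme
 *          escaped into a whole 8-bit block)
 *   1-26   'a'..'z'
 *   27     capital: emit a space and upper-case the next letter
 *   28-31  a little punctuation
 * in[0] is the leading length marker: the number of 5-bit codes.
 */
static void decode5(const unsigned char *in, char *out)
{
    static const char punct[4] = { '.', ',', '\'', '-' };
    unsigned ncodes = in[0];
    unsigned pos = 8;               /* start after the marker byte */
    int cap = 0;

    while (ncodes--) {
        unsigned c = getbits(in, &pos, 5);

        if (c == 0) {
            *out++ = (char)getbits(in, &pos, 8);
        } else if (c <= 26) {
            char ch = (char)('a' + c - 1);
            if (cap) {
                *out++ = ' ';
                ch = (char)toupper((unsigned char)ch);
                cap = 0;
            }
            *out++ = ch;
        } else if (c == 27) {
            cap = 1;
        } else {
            *out++ = punct[c - 28];
        }
    }
    *out = '\0';
}

int main(void)
{
    static const unsigned char codes[] = {
        8, 5, 12, 12, 15,           /* "hello"                        */
        27,                         /* capital => space + upper-case  */
        23, 15, 18, 12, 4,          /* "world" comes out as " World"  */
        28                          /* '.'                            */
    };
    unsigned char packed[32] = { 0 };
    char text[64];
    unsigned pos = 8, i;

    packed[0] = sizeof(codes);      /* leading length marker */
    for (i = 0; i < sizeof(codes); i++)
        putbits(packed, &pos, codes[i], 5);

    decode5(packed, text);
    printf("%s\n", text);           /* prints "hello World." */
    return 0;
}

On long runs of plain lower-case text the 5-bit codes alone save 37.5% (5/8
of the original size), and absorbing the space in front of each capitalised
word claws back a little more, which is presumably where the "a bit under
40%" figure comes from once the occasional escape and the length markers are
paid for.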