
Description
How does the process of encoding trits into bytes work when the trits represent a negative value?
For example, consider the following sequence of little-endian* trits:
00++000++--+- (balanced ternary)
These trits represent the value -424242 (decimal).
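(That value is just the sum of d_i * 3^i over the trits, where d_i is -1, 0, or +1 and i counts up from the least significant trit.) As a point of reference, here is a minimal C sketch of that decoding; the helper name decode_bal3_le is made up for illustration and isn't part of any proposed API:

```c
#include <stdio.h>

/* Decode a little-endian balanced-ternary string into a signed integer.
 * '+' = +1, '-' = -1, '0' = 0; the first character is the least
 * significant trit. decode_bal3_le is a made-up name for this sketch. */
static long decode_bal3_le(const char *trits)
{
    long value = 0;
    long weight = 1;                      /* 3^i for trit position i */
    for (; *trits != '\0'; ++trits) {
        if (*trits == '+')      value += weight;
        else if (*trits == '-') value -= weight;
        /* '0' contributes nothing */
        weight *= 3;
    }
    return value;
}

int main(void)
{
    /* Prints -424242 */
    printf("%ld\n", decode_bal3_le("00++000++--+-"));
    return 0;
}
```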
How do we determine the length of the resulting binary sequence? +424242 (decimal) can be represented using 19 bits (1100111100100110010 in binary):
- Should we leave this up to the OS/architecture?
- Should we pad the result to 24 bits (3 bytes) so that we can reserve the first bit as the sign bit?
- Should we pad the result to 32 bits (4 bytes) so that we can represent it as a standard data type (e.g., C's signed long)? (Both padding options are sketched below.)
* (in this context, assume little-endian trits and bits — that's a whole 'nother can of worms!)
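For concreteness, here is a minimal sketch of what the 24-bit and 32-bit options would produce if the padded value were stored as little-endian two's-complement bytes. Both pack_le and the choice of two's complement are assumptions made for this sketch; picking the width and the sign representation is exactly the open question:

```c
#include <stdint.h>
#include <stdio.h>

/* Pack a signed value into nbytes little-endian two's-complement bytes.
 * pack_le is a made-up helper; two's complement and the byte widths are
 * assumptions for illustration, not a settled part of the encoding. */
static void pack_le(int32_t value, unsigned nbytes, uint8_t *out)
{
    uint32_t u = (uint32_t)value;         /* two's-complement bit pattern */
    for (unsigned i = 0; i < nbytes; ++i)
        out[i] = (uint8_t)(u >> (8 * i)); /* lowest byte first */
}

int main(void)
{
    uint8_t b24[3], b32[4];

    pack_le(-424242, 3, b24);             /* 24-bit option */
    pack_le(-424242, 4, b32);             /* 32-bit option */

    for (unsigned i = 0; i < 3; ++i) printf("%02x ", b24[i]);  /* ce 86 f9 */
    printf("\n");
    for (unsigned i = 0; i < 4; ++i) printf("%02x ", b32[i]);  /* ce 86 f9 ff */
    printf("\n");
    return 0;
}
```

Under those assumptions the 24-bit form comes out as ce 86 f9, with the high bit of the final byte acting as the sign bit, and the 32-bit form simply extends it with a sign byte: ce 86 f9 ff.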