This repository was archived by the owner on Sep 13, 2025. It is now read-only.

Description
How do we distinguish between little-endian and big-endian trits?
For example, consider the following sequence of trits:
`00++000++--+-` (balanced ternary)
- If we interpret this value as little-endian (least significant trit first), then the value is -424242₁₀.
- If we interpret this value as big-endian (most significant trit first), then the value is 79022₁₀.
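The two interpretations above can be sketched as follows. This is an illustrative snippet, not code from this repository; the symbol mapping (`+` = 1, `0` = 0, `-` = -1) and the function name are assumptions.

```python
# Hypothetical decoder for a balanced-ternary trit string.
# Symbols: '+' = 1, '0' = 0, '-' = -1.
TRIT_VALUES = {'+': 1, '0': 0, '-': -1}

def decode_bal3(trits: str, little_endian: bool) -> int:
    """Interpret a balanced-ternary string as an integer."""
    if little_endian:
        # Least significant trit came first, so reverse before evaluating.
        trits = trits[::-1]
    value = 0
    for symbol in trits:  # Horner's rule, most significant trit first
        value = value * 3 + TRIT_VALUES[symbol]
    return value

trits = "00++000++--+-"
print(decode_bal3(trits, little_endian=True))   # -424242
print(decode_bal3(trits, little_endian=False))  # 79022
```

The same trit sequence yields two very different integers, which is exactly the ambiguity the question is about.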
We have the same problem with bits, of course, which is (for example) why some UTFs, such as UTF-16, use a BOM (byte order mark).
Should encoded trit sequences have a similar feature (a Trit Order Mark, or TOM)?