This repository was archived by the owner on Sep 13, 2025. It is now read-only.

Encoding negative trit values to bytes #9

@todofixthis

Description


How does the process of encoding trits into bytes work when the trits represent a negative value?

For example, consider the following sequence of little-endian* trits:

00++000++--+- (balanced ternary)

These trits represent the value -424242₁₀.
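
For reference, here is a minimal sketch (not part of the issue itself; the helper name and the use of `'+'`/`'0'`/`'-'` characters are just assumptions for illustration) showing how a little-endian balanced-ternary string maps to that signed integer:

```python
# Minimal sketch (illustration only; names are hypothetical): interpret a
# little-endian balanced-ternary trit string as a signed Python integer.

TRIT_VALUES = {'-': -1, '0': 0, '+': 1}


def trits_to_int(trits: str) -> int:
    """Interpret ``trits`` as little-endian balanced ternary."""
    return sum(TRIT_VALUES[t] * 3 ** i for i, t in enumerate(trits))


assert trits_to_int('00++000++--+-') == -424242
```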

How do we determine the length of the resulting binary sequence? +424242₁₀ can be represented using 19 bits (1100111100100110010₂); two of the options below are sketched in code after the footnote:

  • Should we leave this up to the OS/architecture?
  • Should we pad the result to 24 bits (3 bytes) so that we can make the first one the sign bit?
  • Should we pad the result to 32 bits (4 bytes) so that we can represent it as a standard data type (e.g., C's signed long)?

* (in this context, assume little-endian trits and bits — that's a whole 'nother can of worms!)
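
For illustration only, here is a hedged sketch of the two padded options above using two's-complement byte strings (this is one possible mapping, not a decision; the variable names are made up, and `int.to_bytes`/`int.from_bytes` are just Python's built-in conversions):

```python
# Sketch of the fixed-width padding options above, using two's complement.
# Variable names are hypothetical and exist only for this example.

value = -424242
assert abs(value).bit_length() == 19  # the magnitude needs 19 bits

# Pad to 24 bits (3 bytes): the most significant bit acts as the sign bit.
three_bytes = value.to_bytes(3, byteorder='little', signed=True)
assert three_bytes == b'\xce\x86\xf9'

# Pad to 32 bits (4 bytes): matches a standard 32-bit signed integer.
four_bytes = value.to_bytes(4, byteorder='little', signed=True)
assert four_bytes == b'\xce\x86\xf9\xff'

# Both widths round-trip back to the original value.
assert int.from_bytes(three_bytes, 'little', signed=True) == value
assert int.from_bytes(four_bytes, 'little', signed=True) == value
```

In both cases the sign is carried by the two's-complement encoding rather than by an explicit sign bit or sign trit; whether that is the desired mapping is exactly the open question here.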
