What version of Lattigo are you using?
v6.1.1
Does this issue persist with the latest release?
Yes.
What were you trying to do?
Serialize and deserialize ciphertexts with large scale values (e.g., scale = 1e100) using MarshalBinary/UnmarshalBinary.
What were you expecting to happen?
Serialization and deserialization should succeed for any valid scale value.
What actually happened?
Serialization fails with a buffer error when the scale exponent is large (e.g., 1e100).
The error message is:
```
failed to marshal challenge: ring.Poly.WriteTo: structs.Vector[uint64].WriteTo:
buffer.WriteAsUint64Slice[uint64]: cannot WriteUint64Slice:
available buffer/8 is zero even after flush
```
This is due to Scale.BinarySize() underestimating the required buffer size for large exponents.
The current implementation of Scale.BinarySize() uses a fixed formula, 21 + (ScalePrecisionLog10+6)<<1 (lattigo/core/rlwe/scale.go, lines 173 to 178 at f3966f4):
```go
// BinarySize returns the serialized size of the object in bytes.
// Each value is encoded with .Text('e', ceil(ScalePrecision / log2(10))).
func (s Scale) BinarySize() int {
	// 21 for JSON formatting plus 2*(6 + ScalePrecisionLog10) for the scales encoding.
	return 21 + (ScalePrecisionLog10+6)<<1
}
```
This formula estimates the maximum length of the JSON string representation of the scale, assuming the exponent part (e.g., e+10) is always 4 characters: with ScalePrecisionLog10 = 39, each of the two values encodes to 39 mantissa digits plus 6 characters for the leading digit, the decimal point, and a two-digit exponent, for a total of 21 + 2*45 = 111 bytes. However, for large exponents like e+100, the exponent part becomes 5 characters, so the actual JSON is one byte longer than the estimate, the buffer is too small, and serialization fails.
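One way to fix this, sketched below, is to derive the size from the exact encoded length of both values instead of a fixed bound. This is an illustration rather than a proposed upstream patch, and it assumes that Scale exposes its two big.Float fields as Value and Mod (as the JSON output suggests) and that MarshalBinary encodes each value with Text('e', ScalePrecisionLog10), matching the comment in the snippet above.

```go
// Sketch of a possible fix (assumes Value/Mod big.Float fields): size
// the buffer from the exact encoded length of both values instead of
// assuming a two-digit exponent.
func (s Scale) BinarySize() int {
	// 21 bytes for the JSON framing: {"Value":"","Mod":""}.
	size := 21
	// Exact length of each value as encoded by Text('e', ScalePrecisionLog10).
	size += len(s.Value.Text('e', ScalePrecisionLog10))
	size += len(s.Mod.Text('e', ScalePrecisionLog10))
	return size
}
```

A more conservative alternative would be to simply widen the fixed bound by a few bytes per value to cover longer exponents.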
Reproducibility
https://go.dev/play/p/lCmWsyhPPjX
```go
package main

import (
	"fmt"
	"math/big"

	"github.com/tuneinsight/lattigo/v6/core/rlwe"
)

func main() {
	// Create a scale similar to what would be used in CKKS
	scale := rlwe.NewScale(1 << 40)
	binarySize := scale.BinarySize()
	data, err := scale.MarshalBinary()
	if err != nil {
		panic(err)
	}
	fmt.Printf("BinarySize: %d\n", binarySize)
	fmt.Printf("Actual JSON len: %d\n", len(data))
	fmt.Printf("JSON: %s\n", string(data))
	fmt.Println()

	// Test with a larger scale
	scale2 := rlwe.NewScale(new(big.Float).SetPrec(128).SetFloat64(1e100))
	binarySize2 := scale2.BinarySize()
	data2, err := scale2.MarshalBinary()
	if err != nil {
		panic(err)
	}
	fmt.Printf("Scale2 BinarySize: %d\n", binarySize2)
	fmt.Printf("Scale2 Actual JSON len: %d\n", len(data2))
	fmt.Printf("Scale2 JSON: %s\n", string(data2))

	// Output:
	// BinarySize: 111
	// Actual JSON len: 111
	// JSON: {"Value":"1.099511627776000000000000000000000000000e+12","Mod":"0.000000000000000000000000000000000000000e+00"}
	//
	// Scale2 BinarySize: 111
	// Scale2 Actual JSON len: 112
	// Scale2 JSON: {"Value":"1.000000000000000015902891109759918046836e+100","Mod":"0.000000000000000000000000000000000000000e+00"}
}
```
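A minimal regression check along these lines (a hypothetical test, not taken from the repository) would pin down the contract that BinarySize() must be an upper bound on the length produced by MarshalBinary():

```go
package rlwe_test

import (
	"math/big"
	"testing"

	"github.com/tuneinsight/lattigo/v6/core/rlwe"
)

// Hypothetical regression test: BinarySize must never underestimate
// the marshaled length, including for three-digit exponents.
func TestScaleBinarySizeIsUpperBound(t *testing.T) {
	for _, v := range []float64{1 << 40, 1e100} {
		s := rlwe.NewScale(new(big.Float).SetPrec(128).SetFloat64(v))
		data, err := s.MarshalBinary()
		if err != nil {
			t.Fatal(err)
		}
		if len(data) > s.BinarySize() {
			t.Fatalf("scale %g: marshaled %d bytes > BinarySize() %d", v, len(data), s.BinarySize())
		}
	}
}
```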