Hi,
After encoding the whole Wikipedia passage collection, the index saved on my disk totals 755 GB. This large index takes huge storage and a long time to load onto the GPU. However, it occupies less than 100 GB once loaded onto the GPU, which I assume is due to the index compression mentioned in your paper. Is it possible to save and load the compressed index directly, for better storage and load-time consumption?