One of the problems we keep running into is: is this data going to fit within the block size limit?
Right now there is no great solution for this; the only thing we can do is try encoding a bunch of times and measuring the size (see the sketch after this list). Unfortunately that is not really a great option, because:
Previously encoded data is not reused, so we waste computation and create more data to be GC-ed.
If we go above the block size limit, there is no good way to backtrack.
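For concreteness, this is roughly what the measure-and-retry pattern looks like today. It uses the real `encode` from `@ipld/dag-cbor`, but the 1 MiB limit and the halving heuristic are placeholders for illustration:

```ts
import { encode } from '@ipld/dag-cbor'

const BLOCK_SIZE_LIMIT = 1 << 20 // hypothetical 1 MiB limit

// Encode speculatively, measure, and shrink on overflow. Every failed
// attempt re-encodes everything and throws the previous bytes away.
let entries = Array.from({ length: 200_000 }, (_, n) => ({ n }))
let bytes = encode({ entries })
while (bytes.byteLength > BLOCK_SIZE_LIMIT) {
  entries = entries.slice(0, Math.floor(entries.length / 2)) // crude backtrack
  bytes = encode({ entries }) // starts over from scratch
}
```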
I do not know what the answer is here, but I do like the way CARBufferWriter came out, and I think something along the same lines could work here as well. Specifically I would like to (a sketch follows this list):
Allocate and pass in a buffer to encode data into, as opposed to just getting a buffer out for the encoded node.
This also provides better control in cases where we want to encode several things into a larger buffer.
It also means you can't accidentally create a block that is larger than the block size limit.
Ideally the API should allow the encoder to avoid re-encoding the same data over and over again.
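Here is a minimal sketch of what that could look like, in the spirit of CARBufferWriter. None of these names exist in `@ipld/dag-cbor` today, and for brevity `write` wraps the existing `encode` and copies, where a real implementation would encode directly into the buffer:

```ts
import { encode } from '@ipld/dag-cbor'

const BLOCK_SIZE_LIMIT = 1 << 20 // hypothetical 1 MiB limit

// Hypothetical encoder that writes into a caller-allocated, fixed-size
// buffer. Names (BufferEncoder, write, bytes) are illustrative only.
class BufferEncoder {
  private buffer: Uint8Array
  private offset = 0

  constructor(byteLength: number) {
    this.buffer = new Uint8Array(byteLength)
  }

  // Encode `value` into the remaining space. Throws RangeError and
  // leaves previously written bytes intact if it would not fit.
  write(value: unknown): number {
    const bytes = encode(value)
    if (this.offset + bytes.byteLength > this.buffer.byteLength) {
      throw new RangeError('value does not fit in remaining buffer space')
    }
    this.buffer.set(bytes, this.offset)
    this.offset += bytes.byteLength
    return bytes.byteLength
  }

  // View of everything written so far, without copying.
  bytes(): Uint8Array {
    return this.buffer.subarray(0, this.offset)
  }
}

// Usage: the buffer is capped at the limit up front, so producing an
// oversized block is impossible by construction.
const encoder = new BufferEncoder(BLOCK_SIZE_LIMIT)
for (const entry of [{ a: 1 }, { b: 2 }]) {
  try {
    encoder.write(entry) // earlier writes are kept, not re-encoded
  } catch (error) {
    if (!(error instanceof RangeError)) throw error
    break // out of space: flush encoder.bytes() and start a new block
  }
}
```

With this shape, overflow surfaces as a `RangeError` at write time rather than as an oversized block after the fact, and everything written before the failing value is still usable, which is exactly the backtracking the measure-and-retry approach lacks.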