Replies: 1 comment
-
Hi there, one can actually interpret a (deterministic) pseudo-random number generator as a hash function of its seed. As such, the resulting hash encoding behaves pretty much the same as with simpler and cheaper hash functions. We confirmed this by trying the PCG32 RNG. In Section 6 of the paper (the "choice of hash function" paragraph), we discuss this in slightly more detail. Cheers!
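To make the "PRNG as a hash of the seed" point concrete, here is a minimal C++ sketch (not the actual instant-ngp code; the resolution, table size, and vertex below are illustrative). It contrasts the paper's spatial hash with a single PCG32-style step seeded by the grid vertex's linear index; both map a vertex deterministically to a slot in [0, T).

```cpp
#include <cstdint>
#include <cstdio>

// Sketch only -- not taken from the instant-ngp repository.

// Spatial hash from the paper: coordinates multiplied by large primes,
// XORed together, then reduced modulo the table size T (pi_1 = 1).
inline uint32_t spatial_hash(uint32_t x, uint32_t y, uint32_t z, uint32_t T) {
    return ((x * 1u) ^ (y * 2654435761u) ^ (z * 805459861u)) % T;
}

// "PRNG as a hash of the seed": one LCG step plus PCG32's output permutation,
// with the vertex's linear index as the seed. Deterministic, so it acts like
// a (more expensive) hash function.
inline uint32_t pcg32_style_hash(uint64_t seed, uint32_t T) {
    uint64_t state = seed * 6364136223846793005ULL + 1442695040888963407ULL;
    uint32_t xorshifted = (uint32_t)(((state >> 18u) ^ state) >> 27u);
    uint32_t rot = (uint32_t)(state >> 59u);
    return ((xorshifted >> rot) | (xorshifted << ((32u - rot) & 31u))) % T;
}

int main() {
    const uint32_t T = 1u << 19;      // hash table size (2^19 entries, illustrative)
    const uint32_t N = 128;           // grid resolution at one level (illustrative)
    uint32_t x = 17, y = 42, z = 99;  // an arbitrary grid vertex
    uint64_t linear_index = (uint64_t)x + (uint64_t)N * (y + (uint64_t)N * z);

    printf("spatial hash slot: %u\n", spatial_hash(x, y, z, T));
    printf("PCG32-style slot:  %u\n", pcg32_style_hash(linear_index, T));
    return 0;
}
```

Both functions are deterministic maps from a grid vertex to a table slot; the spatial hash is simply cheaper to evaluate per lookup.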
-
Hello,
I was very impressed with the Instant NeRF paper. Thanks for the good research.
However, I have a question regarding the paper.
Does the function that assigns a hash table index to each grid vertex have to be a hash function?
Instead of a hash function, I think we could use a rand(0~T) function to assign an index to each grid vertex (see the sketch below). In that case, the table with trainable feature vectors is not a hash table but just a plain table; however, if its size is fixed to T as before, the effect should be the same.
Is there any particular reason the input encoding works better because it is a hash function?
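For completeness, a hypothetical C++ sketch of the rand(0~T) idea above (our own names, not something from the paper or repository): draw one fixed, uniformly random slot in [0, T) per grid vertex and reuse that map for every lookup, keeping the trainable feature table at size T.

```cpp
#include <cstdint>
#include <random>
#include <vector>

// Hypothetical: replace the on-the-fly hash with a precomputed random
// vertex -> slot map. Each vertex gets a fixed slot drawn uniformly from
// [0, T); the trainable feature table of size T is untouched.
std::vector<uint32_t> make_random_index_map(uint64_t num_vertices,
                                            uint32_t T, uint64_t seed) {
    std::mt19937_64 rng(seed);  // fixed seed => the same map every run
    std::uniform_int_distribution<uint32_t> dist(0, T - 1);
    std::vector<uint32_t> slot_of_vertex(num_vertices);
    for (auto& slot : slot_of_vertex) slot = dist(rng);
    return slot_of_vertex;
}
```

One practical difference from a hash function: this map has to be stored explicitly (one entry per grid vertex), whereas a hash recomputes the slot from the coordinates on the fly.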