feat(storage): enable preload for log store and dynamic filter #13558
How the provided cache `policy` is used is confusing. The behavior is: regardless of the provided `policy`, the read ends up using `CachePolicy::Fill(CachePriority::Low)`. How about making it more consistent?
More thoughts: we have two mechanisms to constrain prefetch memory usage now:

1. `pending_streaming_loading` is used to track the usage, and it is limited by `large_query_memory_usage`.
2. `block_cache_capacity` (and the low-priority ratio).

The motivation for introducing the low-priority block cache is to control the behavior of cache refill, not block prefetch, and we are mixing the two up here. IMO, logically it is better to have a clear separation.

Ideas:

- Rename `pending_streaming_loading` to `prefetch_buffer_usage` and `large_query_memory_usage` to `prefetch_buffer_capacity`.
- Account all prefetch buffers in `prefetch_buffer_usage`. For normal read prefetch, the buf is a `Vec<Block>`. For streaming read prefetch, the buf is the implicit recv buf.
- When the usage exceeds `prefetch_buffer_capacity`, prefetch will be disabled and fall back to per-block normal reads. This is completely independent of the block cache policy.

What do you think?
Need some documentation on what the behavior is after setting these flags.
done
Streaming read is removed from this PR. Can we either remove this option until streaming read is added back, or at least update the doc to mention that this option is unused?