Caching of search results in memory #9
Comments
Big +1 on this. My use case is similar: I need to augment logging events with immutable ES data. This would be a huge performance boost.
Absolutely +1 on this. I had actually assumed this was part of the plugin 😢
I have the same need, but this issue has gone unresolved for a long time, so I created a new filter plugin to solve this and similar problems. Please see logstash-filter-memoize.
Hello all, apologies for the delay here. We are thinking through this caching feature and would love feedback from the broader community. For each of your use cases, is configurable LRU caching sufficient for most workloads? It would require initial cache warmup, and if the ES lookup dataset changes often, it could result in more misses which would impact throughput. For our DB lookups, we offer two caching options. The jdbc_streaming filter is used with an LRU caching strategy, while the jdbc_static filter allows for full local caching of the lookup dataset at startup, along with a periodic cache refresh option. Would a similar full local caching strategy be useful for you? Any other strategies you'd like to see?
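For reference, the LRU-style option mentioned above for DB lookups looks roughly like the sketch below. It is based on the jdbc_streaming filter's cache settings (use_cache, cache_size, cache_expiration); the connection details, statement, and field names are purely illustrative.

```
filter {
  jdbc_streaming {
    # Illustrative connection and query details, not a real setup
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/lookups"
    jdbc_driver_class      => "org.postgresql.Driver"
    statement              => "SELECT app, owner FROM hosts WHERE hostname = :host"
    parameters             => { "host" => "hostname" }
    target                 => "host_meta"
    # LRU caching strategy referred to above
    use_cache        => true
    cache_size       => 1000   # max entries kept in memory
    cache_expiration => 30.0   # seconds before a cached entry is considered stale
  }
}
```

A similar set of knobs on this plugin is what the discussion in this issue is converging on.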
@acchen97 LRU would be suitable for our use cases, though separate caches for hits and misses would be useful. A full local cache could also be very useful for us.
@pemontto thanks for your input. Do you mind sharing details on your lookup dataset? i.e. what kind of data, how big it is, and how often it changes.
@acchen97 |
Big +1 on this! Has there been any movement since the last update? |
@acchen97 we would also benefit from this: |
Here's another workaround for the missing caching option, by the way:
I know this is an old request, but just adding a huge +1. We have a Logstash configuration that handles host logs. It would be extremely useful to look up host properties in Elasticsearch, such as Application, Customer, and LogLevel. This metadata (obtained from an Elasticsearch index) would not only enrich the event but also drive logic: for example, if the hostname is not tagged for WARN level, then drop WARN logs, or if the server belongs to Application XYZ, then ship the log to the XYZ index.
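A minimal sketch of that use case with the filter as it exists today (no caching, so every event triggers an ES query). The index, field names (application, min_level), and routing are assumptions for illustration, not part of the plugin:

```
filter {
  elasticsearch {
    hosts => ["es-server:9200"]
    index => "host-metadata"                 # hypothetical enrichment index
    query => "hostname:%{[host]}"
    fields => {
      "application" => "application"
      "customer"    => "customer"
      "min_level"   => "min_level"
    }
  }
  # Drop WARN logs from hosts not tagged for WARN-level collection (illustrative condition)
  if [level] == "WARN" and [min_level] != "WARN" {
    drop { }
  }
}
output {
  elasticsearch {
    hosts => ["es-server:9200"]
    index => "logs-%{[application]}"         # route to the application's index
  }
}
```

With caching, the host-metadata lookup could be served from memory for repeated hostnames instead of hitting ES per event.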
+1 |
Is caching the search results in memory within the scope of this plugin? I'm thinking about implementing something simple and making a pull request, but I'll just make a custom plugin if it's not something that would likely be accepted.
The use case is as follows:
I think this could be accomplished by adding a simple LRU cache and two optional configuration values: the size of the cache in entries, and an identifier that uniquely represents the search. Without these parameters, the plugin would behave as usual and hit ES every time.
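As a sketch, the proposal might look like the following in a pipeline. Both cache options shown here are hypothetical names for the two proposed settings and do not exist in the plugin today:

```
filter {
  elasticsearch {
    hosts  => ["localhost:9200"]
    query  => "hostname:%{[host]}"
    fields => { "application" => "application" }
    # Hypothetical options proposed in this issue:
    cache_size => 1000          # max number of cached search results (LRU eviction)
    cache_key  => "%{[host]}"   # identifier that uniquely represents the search
  }
}
```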