Data compression #2207
Hi there! I'm trying to understand how storage data compression works. Here is the storage use graph: older data was ingested from a different Seq instance using the clef tool, while recent data (from April onwards) is ingested directly into the Seq instance and obviously takes much more space. The number of events is within the same range, so I'm guessing the imported/older data is heavily compressed. How does compression work? Is there a way to force compression for anything older than X days to save disk space?
Hi @aydar-viv

The compression is the same across the entire stream. You can use the histogram on the events page to see how event volume has changed over time, or the 'Space by event type' built-in query to check the data size by event type. If the event volume is the same, then it must be that the newer events carry more data.
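If it helps to make that comparison concrete, you could run a per-event-type count over the older date range and then over the recent one, and set the results against the 'Space by event type' output for the same ranges. This is only a rough sketch assuming Seq's SQL-style query syntax (a grouped count over the built-in `@EventType` property); adjust it to whatever your version supports:

```sql
-- Sketch only (assumes Seq's SQL-style query syntax and the built-in
-- @EventType property). Run once with the date range picker set to the
-- older period and once set to the recent period, then compare the
-- per-type counts against the 'Space by event type' results for the
-- same ranges: similar counts but larger sizes point at bigger events.
select count(*) from stream group by @EventType
```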
The two tools you have available to manage disk usage are: