A db migration does not take Query#apply_auto_limit into account #6767
Comments
This is how I patched it to work, but it seems far from best practice and I don't have much context to provide a proper fix.
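The patch itself isn't preserved in this thread. As a rough illustration only, recomputing a hash so that it respects apply_auto_limit could look like the sketch below; gen_query_hash (from redash.utils) and the runner's apply_auto_limit hook exist in Redash, but the glue function here is an assumption:

```python
# Illustrative sketch, not the actual patch from this comment.
from redash.utils import gen_query_hash


def recompute_query_hash(query_text, options, query_runner):
    # Mirror what the runner does at execution time: apply the auto
    # LIMIT first, then hash, so stored hashes match runtime hashes.
    should_apply = bool(options and options.get("apply_auto_limit"))
    effective_text = query_runner.apply_auto_limit(query_text, should_apply)
    return gen_query_hash(effective_text)
```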
NOTE: my migration produces different hashes on an upgrade -> revert round trip. Looking into it.
I'm tight on time, so I ended up just making the migration irreversible 😬
I've encountered issues while migrating real production data: it was failing on queries that had no statements (e.g. empty, or everything commented out), so I hacked around it, roughly as sketched below. 😬
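The original workaround isn't shown here; a plausible guard for statement-less queries (the sqlparse check is my assumption about how to detect them, not the actual hack) would be:

```python
import sqlparse


def has_executable_statements(query_text):
    # Empty queries, or queries where everything is commented out,
    # contain no DML/DDL statement; get_type() reports "UNKNOWN" for
    # those, so the migration can skip them instead of crashing.
    parsed = sqlparse.parse(query_text or "")
    return any(stmt.get_type() != "UNKNOWN" for stmt in parsed)
```

With a guard like this, the migration can hash the raw text (or skip the row) whenever there is nothing to auto-limit.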
Thank you for reporting this! I was debugging a similar issue and came to similar conclusions. The problem is more extensive than just this migration.
I will create a fix.
An update on this.
On the good news side: the place in the code where we call …
On the bad news side: to fix the refresh issue, we need to generate a correct query hash for each query. The problem is that to do so, we have a few options:
I think I'll go for 3.
What would it take for it to always work 100% automatically and reliably? On that note, with a "best effort in migrations" approach, what do you reckon the behaviour will be when (if?) it doesn't work, and do you reckon we can catch the failures in some better way? 😄
With the:
Would it be dangerous to run that automatically, so it's a hands-off thing for admins?
It's quite a lot of rewriting of existing code. It will work incorrectly only for some of the parameterized queries; all other queries are fine.
We could have something that checks the query's hash when trying to refresh it and updates it if needed (roughly as sketched below). But I'm not sure I want to invest much in this... a better path would be to deprecate the whole concept of query hashes. I'm not sure it brings enough value.
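Hypothetically, such a check-on-refresh could look like the following; the helper name, attribute names, and session handling are assumptions, not existing Redash API:

```python
from redash.utils import gen_query_hash


def ensure_current_hash(query, query_runner, session):
    # Hypothetical "heal on refresh" step: recompute the hash the way
    # the runner would at execution time, and persist it if the stored
    # value is stale (e.g. left behind by the broken migration).
    should_apply = bool(query.options and query.options.get("apply_auto_limit"))
    effective_text = query_runner.apply_auto_limit(query.query_text, should_apply)
    expected = gen_query_hash(effective_text)
    if query.query_hash != expected:
        query.query_hash = expected
        session.add(query)
```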
Interesting. Could it work so that if the query hash fails to be found, it automatically runs the query again to get the newest result instead of erroring out?
In #7184 you can see:
I think these two should be enough?
Issue Summary
I am trying to upgrade a Redash 10.1.0 instance to the current preview version. The new version is the "preview" Docker image pushed on Feb 10th, 2024. After the migration, query results are no longer refreshed.
I've discovered that this db migration:
https://github.com/getredash/redash/blob/master/migrations/versions/1038c2174f5d_make_case_insensitive_hash_of_query_text.py
does not take into account the "apply_auto_limit" setting of queries.
So, if a query's options contain apply_auto_limit: true, this migration computes the query hash without applying the limit. But the query runner applies it as usual and generates query results with the LIMIT 1000 added, so the results can never be associated with their queries again. I guess this db migration should be rewritten to use something like this:
redash/query_runner/__init__.py, line 310 at commit 939bec2
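For illustration, a rewritten migration step might look roughly like this. This is a sketch only: gen_query_hash and the runner's apply_auto_limit are real Redash helpers, but the model/session access inside the migration is an assumption (the shipped fix may operate on raw tables instead):

```python
# Sketch of a hash backfill that respects apply_auto_limit; not the
# actual fix. Model/session usage inside a migration is assumed.
from redash import models
from redash.utils import gen_query_hash


def upgrade():
    for query in models.Query.query:
        options = query.options or {}
        should_apply = bool(options.get("apply_auto_limit"))
        # Use the query's own data source runner so dialect-specific
        # LIMIT handling matches what happens at execution time.
        runner = query.data_source.query_runner if query.data_source else None
        text = query.query_text
        if runner is not None:
            text = runner.apply_auto_limit(text, should_apply)
        query.query_hash = gen_query_hash(text)
    models.db.session.commit()
```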
Steps to Reproduce
1. Install 10.1.0, e.g. via https://hub.docker.com/layers/redash/redash/10.1.0.b50633/images/sha256-f3e8d95656255d9684051604a586dfd50035fa1e297e68379dc1b557f1885c0f?context=explore. Or it could be pulled from GitHub 🤷
2. Create a data source (it can be Redash's own database as well).
3. Create and publish a query, e.g. select current_timestamp;. Set it to be refreshed once a minute.
4. Ensure the query results are refreshed by reloading the page after a few minutes.
5. Stop the instance and update the version (use a preview image, pull the current master, or whatever is appropriate), keeping the same internal Redash database.
6. Run the db migrations.
7. Run Redash.
8. Check the query after a few minutes: it won't be refreshed, and the "Refresh" button won't lead to the expected behaviour either.
Technical details:
Used Docker image: https://hub.docker.com/layers/redash/redash/10.1.0.b50633/images/sha256-f3e8d95656255d9684051604a586dfd50035fa1e297e68379dc1b557f1885c0f?context=explore