nightly-20240130 nexmark-q5-many-windows perf degradation #14990
The throughput likely increased because of 0cd9ff1 (#14558), and now drops because of 9417409 (#14855). 14558 vs 14855, what a coincidence.
Interesting, but I can't explain this phenomenon. IMO this PR should only have an effect on UDFs :)
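For context, the PR in question (#14558, per the title of its revert #14855) made Project expression evaluation concurrent. Below is a minimal sketch of that idea in Python rather than RisingWave's Rust: a Project operator evaluates several independent expressions over each input chunk, and those evaluations can run sequentially or be submitted to a pool. The names (`project_sequential`, `project_concurrent`) and the thread-pool approach are illustrative assumptions, not RisingWave internals.

```python
from concurrent.futures import ThreadPoolExecutor

def project_sequential(exprs, chunk):
    """Evaluate each projection expression over the chunk, one after another."""
    return [e(chunk) for e in exprs]

def project_concurrent(exprs, chunk, pool):
    """Submit all expression evaluations at once, then collect results in order."""
    futures = [pool.submit(e, chunk) for e in exprs]
    return [f.result() for f in futures]

# Three independent "expressions": add a constant k to every value in the chunk.
exprs = [lambda c, k=k: [x + k for x in c] for k in range(3)]
chunk = [0, 1, 2, 3]

with ThreadPoolExecutor() as pool:
    # Both strategies must produce identical results; only scheduling differs.
    assert project_sequential(exprs, chunk) == project_concurrent(exprs, chunk, pool)
```

Whether the concurrent variant helps depends on how expensive each expression is relative to the scheduling overhead, which is why a perf swing on revert is plausible even for non-UDF expressions.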
Interesting +1. Worth investigating the cause, I think.
Any update? The perf never came back to the original level: http://metabase.risingwave-cloud.xyz/question/1304-nexmark-q5-many-windows-blackhole-medium-1cn-avg-source-output-rows-per-second-rows-s-history-thtb-266?start_date=2024-01-22
@TennyZhuang Any updates?
```sql
CREATE SINK nexmark_q5_many_windows AS
SELECT
    AuctionBids.auction, AuctionBids.num
FROM (
    SELECT
        bid.auction,
        count(*) AS num,
        window_start AS starttime
    FROM
        HOP(bid, date_time, INTERVAL '5' SECOND, INTERVAL '5' MINUTE)
    GROUP BY
        bid.auction,
        window_start
) AS AuctionBids
JOIN (
    SELECT
        max(CountBids.num) AS maxn,
        CountBids.starttime_c
    FROM (
        SELECT
            count(*) AS num,
            window_start AS starttime_c
        FROM
            HOP(bid, date_time, INTERVAL '5' SECOND, INTERVAL '5' MINUTE)
        GROUP BY
            bid.auction,
            window_start
    ) AS CountBids
    GROUP BY
        CountBids.starttime_c
) AS MaxBids
ON
    AuctionBids.starttime = MaxBids.starttime_c AND
    AuctionBids.num >= MaxBids.maxn
WITH ( connector = 'blackhole', type = 'append-only', force_append_only = 'true');
```

Plan:
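The "many windows" in this query's name comes from its HOP parameters: a 5-minute window sliding every 5 seconds means each bid belongs to many overlapping windows, multiplying the aggregation work. A small sketch of that fan-out (the `hop_windows` helper is hypothetical, written here only to mirror the standard hopping-window semantics, not RisingWave's implementation):

```python
from datetime import datetime, timedelta

def hop_windows(event_time: datetime, slide: timedelta, size: timedelta):
    """Return the (window_start, window_end) pairs whose window contains event_time,
    for hopping windows aligned to the Unix epoch."""
    windows = []
    epoch = datetime(1970, 1, 1)
    # Latest slide-aligned window start that is <= event_time.
    start = event_time - (event_time - epoch) % slide
    # Walk backwards one slide at a time while the window still covers the event.
    while start + size > event_time:
        windows.append((start, start + size))
        start -= slide
    return windows

wins = hop_windows(datetime(2024, 1, 30, 12, 0, 3),
                   slide=timedelta(seconds=5), size=timedelta(minutes=5))
print(len(wins))  # size / slide = 300s / 5s = 60 windows per bid
```

So every input row fans out to 60 window groups in both `AuctionBids` and `CountBids`, which is what makes this variant a useful stress test for Project/HashAgg throughput.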
It's indeed caused by #14558, but the reason is unknown. Will continue to investigate. CPU flamegraph: profile results.zip
No further findings. Let's close the issue, as the problem has already been solved.
Describe the bug
https://buildkite.com/risingwave-test/nexmark-benchmark/builds/2944#018d5b1a-81cd-4004-b5c1-21cf9024a263
https://grafana.test.risingwave-cloud.xyz/d/liz0yRCZz1/log-search-dashboard?orgId=1&var-data_source=Logging:%20test-useast1-eks-a&from=1706652288000&to=1706654091000&var-namespace=nexmark-bs-0-14-daily-20240130
Error message/log
No response
To Reproduce
No response
Expected behavior
No response
How did you deploy RisingWave?
No response
The version of RisingWave
nightly-20240130
Additional context
nightly-20240130
a0574b7466f5f4a28f8bcbcb660b2060a922094a feat(test): add ObjectStore trait simulator support (#14545)
1083266bd9c9924651e0cb6787f0f9094d3f0fcb fix: fix assertion of creation tracker for sink into table (#14845)
11b02eb0adfb2f1a07143f549e80d96d2f1a4237 feat: convert custom parallelism to auto/fixed in recovery loop in meta (#14871)
8cfc7f1f92a6f2b387af9116b7620d8475ebfb91 refactor: extract some common functions for join (#14868)
63e548538b9def0a36f48e4bc7fdb9b58caf067e feat(stream): support row count for arrangement backfill (#14836)
1e2605405fab48815aac4856be5d9a4f64824471 doc: Update README.md (#14867)
4358a4ec92c61b3b3ac013651ca3f490c02f0ff3 feat(expr): allow partial option arguments in functions (#14738)
1ecfe4ce3297bcca5f8eb7e1a52e34be5d3d9b2e chore: remove the unused mem_table_spill_threshold opt (#14839)
f4bae4b5d7369f85a0dd8fd04d48ab6a5e1c9c89 chore(deps): Bump strum_macros from 0.25.3 to 0.26.1 (#14851)
9417409957dbdc047cc758b6652ae5134a68a9b4 revert: feat(stream): make Project expr evaluation concurrent (#14558) (#14855)
aeeb347dda5bc8b84d8025d4ce09a10fa2b6a455 fix: fix compact task overlap check (#14856)
d694af0851991a32ba052e1e112b6d57ef67d0b0 feat(ci): add concurrency control for integration test (#14837)
43b6eefcd15803a9b02bac24d3c3758266d0e3eb feat(case-when): constant lookup optimization for constant form case-when expression (#14586)
8919044c0baf3c588796ba33cdc48c2333f3cc40 fix(optimizer): fix temporal join shuffle (#14848)
b471a9bd2d3da552055b7f449e8f13f6dd701e55 feat(sql-backend): support drop source/view/streaming jobs ddls in sql-backend (#14690)