
Hammer pushback support #106

Merged: 4 commits merged from hammer_pushback into transparency-dev:main on Aug 1, 2024

Conversation

AlCutter (Collaborator) commented Aug 1, 2024

This PR adds support in the hammer for recognising and reacting to pushback from the target log.

codecov-commenter commented Aug 1, 2024

Codecov Report

Attention: Patch coverage is 0% with 28 lines in your changes missing coverage. Please review.

Project coverage is 26.17%. Comparing base (46ec9c2) to head (1bf6dae).
Report is 28 commits behind head on main.

Files Patch % Lines
hammer/client.go 0.00% 18 Missing ⚠️
hammer/hammer.go 0.00% 9 Missing ⚠️
hammer/workers.go 0.00% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #106      +/-   ##
==========================================
- Coverage   35.80%   26.17%   -9.64%     
==========================================
  Files          16       19       +3     
  Lines        1363     1857     +494     
==========================================
- Hits          488      486       -2     
- Misses        801     1297     +496     
  Partials       74       74              


hammer/client.go Outdated
// Continue below
case http.StatusServiceUnavailable, http.StatusBadGateway, http.StatusGatewayTimeout:
// These status codes may indicate a delay before retrying, so handle that here:
retryDelay(resp.Header.Get("RetryAfter"), time.Second)
Contributor commented:
This just computes a value and then throws it away, right? Where does the retry happen?

AlCutter (Collaborator, Author) replied:

d'oh, yes - it used to actually delay in the method.

@@ -201,7 +201,9 @@ func (w *LogWriter) Run(ctx context.Context) {
 	lt := leafTime{queuedAt: time.Now()}
 	index, err := w.writer(ctx, newLeaf)
 	if err != nil {
-		w.errChan <- fmt.Errorf("failed to create request: %v", err)
+		if !errors.Is(ErrRetry, err) {
+			w.errChan <- fmt.Errorf("failed to create request: %w", err)
Contributor commented:
I think it deserves a comment to say that as implemented this retry uses a token from the throttle for each attempt. So if we had a throttle of 100 writes per second, but one thread retried 99 times in a second, we'd only have 1 leaf written and no errors propagated down w.errChan to explain where all the writes were going.

AlCutter (Collaborator, Author) commented Aug 1, 2024:

I've de-quietened it a bit; now it'll "squash" reporting of pushback errors to once per second (with a count), so you can see that it's happening, but can also still see what else is going on.

@AlCutter AlCutter force-pushed the hammer_pushback branch 2 times, most recently from 9967644 to 754034b, on Aug 1, 2024 16:03
@AlCutter AlCutter merged commit bc6fcfe into transparency-dev:main Aug 1, 2024
6 checks passed
@AlCutter AlCutter deleted the hammer_pushback branch August 1, 2024 16:54