
"You can only submit max 100 commands pr. sync operation. Please split them up in smaller chunks" #34

Open
moorsey opened this issue Jan 24, 2023 · 10 comments · May be fixed by #44

@moorsey

moorsey commented Jan 24, 2023

Getting the following when running

python3 autodoist.py -a *** -l next -hf 2
2023-01-24 12:56:47 INFO     You are running with the following functionalities:

   Next action labelling mode: Enabled
   Regenerate sub-tasks mode: Disabled
   Shifted end-of-day mode: Disabled

2023-01-24 12:56:49 INFO     Autodoist has successfully connected to Todoist!
2023-01-24 12:56:50 INFO     SQLite DB has successfully initialized!

2023-01-24 12:57:04 ERROR    Error trying to sync with Todoist API: 400 Client Error: Bad Request for url: https://api.todoist.com/sync/v9/sync
Traceback (most recent call last):
  File "/srv/dev-disk-by-uuid-12b417ae-a39e-46cb-aae6-35bf23871f11/dockerdata/autodoist/autodoist/autodoist.py", line 521, in sync
    response.raise_for_status()
  File "/usr/local/lib/python3.9/dist-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.todoist.com/sync/v9/sync

@Hoffelhas
Owner

Hm, that's indeed strange. I'll take a look this weekend. It looks like the API key has some issues when passed to the sync API.

@Hoffelhas
Owner

I was not able to recreate the issue, so I've added a few additional debug logs and pushed it to a new branch '34_extra_logs'.

Could you please run this version with the --debug flag on, and send me a copy with the sensitive information removed? I'm only interested in the last part, where it catches the sync API errors.

@moorsey
Author

moorsey commented Feb 11, 2023

Many thanks. Looks like it could be a limit issue?

2023-02-11 11:51:56 DEBUG    response: {
  "error": "You can only submit max 100 commands pr. sync operation. Please split them up in smaller chunks",
  "error_code": 36,
  "error_extra": {
    "event_id": "1eab52c58829436faa9d0d2a319b6052",
    "retry_after": 7
  },
  "error_tag": "LIMITS_REACHED_COMMANDS",
  "http_code": 400
}

@moorsey
Author

moorsey commented Mar 13, 2023

Hey @Hoffelhas

Just checking in, if you have any thoughts on this one?

Many thanks!

@moorsey
Author

moorsey commented Apr 7, 2023

Dug a little into this. As I understand it, the following is the code that sends the batched requests off, and it is what fails if there are more than 100.

        data = 'sync_token=' + api.sync_token + \
            '&commands=' + json.dumps(api.queue)

Many people will not hit this, but since I have quite a lot of tasks and haven't been able to use autodoist for some time, there are now too many requests to send.

I'm having a look to see if I can figure out how to send only 100 at a time, and I'll contribute back if I manage it! I found code for splitting lists into chunks, but I'm not fully sure how to integrate it with these lines without making a mess.
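For reference, the usual pattern for splitting a list into fixed-size chunks looks like this (a standalone sketch, not autodoist's actual code):

```python
def chunks(items, size):
    """Yield successive slices of at most `size` elements from `items`."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


# e.g. 250 queued commands -> batches of 100, 100, and 50
batches = list(chunks(list(range(250)), 100))
```

Each slice would then be sent as its own sync request.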

@moorsey moorsey changed the title 400 Client Error: Bad Request for url: https://api.todoist.com/sync/v9/sync "You can only submit max 100 commands pr. sync operation. Please split them up in smaller chunks" Apr 7, 2023
@Hoffelhas
Owner

Hoffelhas commented Apr 7, 2023

Hi there, sorry it took a while for me to respond. Todoist has indeed become quite a bit stricter about the maximum number of syncs you're allowed to send in a given period. The documentation says the following:

  • For each user, you can make a maximum of 450 partial sync requests within a 15 minute period.
  • For each user, you can make a maximum of 45 full sync requests within a 15 minute period.
  • The maximum number of commands is 100 per request.

When adding new single items over time, it's difficult to reach these numbers. However, if you have a project with >100 items and you activate or change labelling at the project level, you would indeed get a batch that's too big. You're thinking in the right direction, though: it should then be split up into multiple batches of at most 100.

This should be relatively simple: when we enter the 'if api.queue' at line 1523, we have to check whether len(api.queue) > 100; if so, split it up and run each block separately through sync(api), currently at line 1524.

However, do note that if you reach 450 changes within 15 minutes, Todoist will hard-block your connection. So even with this work-around in place, you should not label and un-label a project with >100 items more than a few times per hour.

@moorsey
Author

moorsey commented Apr 7, 2023

Yes, I saw that other limitation. I don't think that would really be an issue normally; even the 100-commands-per-request limit is mostly hit on initial syncs, I think, or, as you say, on large projects processed all at once.

Appreciate the reply!

@moorsey
Author

moorsey commented Apr 11, 2023

OK, I've come up with:

        # Sync all queued up changes
        if len(api.queue) < 100:
            sync(api)
        else:
            start = 0
            end = len(api.queue)
            step = 100
            for i in range(start, end, step):
                x = i
            sync(api.queue[x:x+step])

But I don't think the last line is right; it doesn't work when testing, at least. It looks like it needs to be "sync(api)", but called on each batch of the list.

Will keep trying anyway!
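For what it's worth, the loop above has two bugs: the sync call sits outside the loop (so only the last chunk is sent), and sync() is handed a slice rather than the api object. If sync could be changed to accept the list of commands to send (an assumption about autodoist's internals, not its actual signature, so `sync_batch` below is a hypothetical stand-in), the corrected shape would look like this:

```python
# Sketch only: `sync_batch` is a hypothetical stand-in for a version of
# autodoist's sync() that sends an explicit list of commands.
sent_batches = []


def sync_batch(commands):
    # Stand-in for the real API call; just records what would be sent.
    sent_batches.append(list(commands))


def sync_in_chunks(queue, step=100):
    """Send queued commands in batches of at most `step`, one call per batch."""
    for i in range(0, len(queue), step):
        sync_batch(queue[i:i + step])


sync_in_chunks(list(range(250)))
```

The key change is that the send happens inside the loop, once per slice, instead of once after the loop ends.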

@ShayHill

I've done something similar to moorsey on my own project:

from __future__ import annotations

import json
import time

import requests
from requests.structures import CaseInsensitiveDict

# Command (a command-dict type alias) and SYNC_URL are defined elsewhere
# in the project this snippet comes from.

_COMMAND_CHUNK_SIZE = 99

def _write_some_changes(
    headers: CaseInsensitiveDict[str], commands: list[Command]
) -> str:
    """Write changes to the Todoist API.

    :param headers: Headers for the request (produced by headers.get_headers)
    :param commands: list of dictionaries (commands) to add to the API
    :return: sync_token from the API
    """
    resp = requests.post(
        SYNC_URL, headers=headers, data=json.dumps({"commands": commands})
    )
    resp.raise_for_status()
    return str(resp.json()["sync_token"])


def write_changes(
    sync_token: str, headers: CaseInsensitiveDict[str], commands: list[Command]
) -> str:
    """Write the changes to the Todoist API, one chunk at a time.

    :param sync_token: current sync_token, will be updated if any commands are sent
    :param headers: Headers for the request (produced by headers.get_headers)
    :param commands: list of dictionaries (commands) to add to the API
    :return: sync_token from the API

    I don't know what the soft limit is, but I get a lot of bad request errors
    if I send 1000 commands at once.
    """
    if not commands:
        return sync_token
    try:
        sync_token = _write_some_changes(headers, commands[:_COMMAND_CHUNK_SIZE])
    except Exception:
        # give up and start the whole main loop over
        return "*"
    time.sleep(1)
    return write_changes(sync_token, headers, commands[_COMMAND_CHUNK_SIZE:])

It's not isolated enough to be pasted into autodoist, unfortunately, but it might give someone in the thread some clues. It does work; I hit the limit often when hiding/unhiding large projects with autotagging.

@moorsey moorsey linked a pull request Jan 6, 2024 that will close this issue
@moorsey
Author

moorsey commented Jan 6, 2024

Just put some code together for this as well. Disclaimer: I had help from my friend Google Bard on this. We muddled our way through together!

My first pull request after being on the internet for some time; hoping @Hoffelhas is well and able to look through the other contributions soon.

Hoping to get my GTD game back in order after a few years lost in the ocean, now hopefully have my next action labelling back! Adding "learn python" to my projects list!
