
S3 downloader has a hardcoded limit of 100 files #85

Open
slik13 opened this issue Apr 16, 2024 · 1 comment

Comments


slik13 commented Apr 16, 2024

Currently the S3 downloader has a hardcoded limit of 100 files (see: https://github.com/kserve/modelmesh-runtime-adapter/blob/2d5bb69e9ed19efd74fbe6f8b76ec2e970702e3c/pullman/storageproviders/s3/downloader.go#L79C3-L79C27).

As a result, any model containing more than 100 files is only partially copied: the download silently stops at the 100th object, and the model then fails at runtime. For example, a model such as argos-translate with many languages can easily exceed the 100-file limit.

1.7132796006805687e+09	DEBUG	Triton Adapter.Triton Adapter Server	found objects to download	{"type": "s3", "cacheKey": "s3|xyz", "path": "translate", "count": 100}

This limit seems arbitrary and should be made configurable.


nathanbowness commented Nov 20, 2024

Any updates on this issue? This has been causing problems for us when loading models from S3.

A follow-up question as well:

  • Was there a technical reason why 100 was originally chosen? Or would making this configurable, with a value of, say, 300, cause any issues?

We're considering contributing a fix, but wanted to check first whether there are any technical limitations surrounding it.
