
build(deps): bump the pip group across 7 directories with 7 updates #13

Closed
dependabot[bot] wants to merge 1 commit from the dependabot/pip/pip-682fb84903 branch

Conversation

dependabot[bot] commented on behalf of GitHub on Jun 2, 2024

Updates the requirements on gunicorn, requests, onnx, tqdm, transformers, text-generation and gradio to permit the latest version.
Updates gunicorn from 21.2.0 to 22.0.0

Release notes

Sourced from gunicorn's releases.

Gunicorn 22.0 has been released

Gunicorn 22.0.0 has been released. This version fixes numerous security vulnerabilities; you are invited to upgrade your installation as soon as possible.

Changes:

22.0.0 - 2024-04-17
===================
  • use utime to notify workers liveness
  • migrate setup to pyproject.toml
  • fix numerous security vulnerabilities in HTTP parser (closing some request smuggling vectors)
  • parsing additional requests is no longer attempted past unsupported request framing
  • on HTTP versions < 1.1 support for chunked transfer is refused (only used in exploits)
  • requests conflicting configured or passed SCRIPT_NAME now produce a verbose error
  • Trailer fields are no longer inspected for headers indicating secure scheme
  • support Python 3.12

** Breaking changes **

  • minimum version is Python 3.7
  • the limitations on valid characters in the HTTP method have been bounded to Internet Standards
  • requests specifying unsupported transfer coding (order) are refused by default (rare)
  • HTTP methods are no longer casefolded by default (IANA method registry contains none affected)
  • HTTP methods containing the number sign (#) are no longer accepted by default (rare)
  • HTTP versions < 1.0 or >= 2.0 are no longer accepted by default (rare, only HTTP/1.1 is supported)
  • HTTP versions consisting of multiple digits or containing a prefix/suffix are no longer accepted
  • HTTP header field names Gunicorn cannot safely map to variables are silently dropped, as in other software
  • HTTP headers with empty field name are refused by default (no legitimate use cases, used in exploits)
  • requests with both Transfer-Encoding and Content-Length are refused by default (such a message might indicate an attempt to perform request smuggling)
  • empty transfer codings are no longer permitted (reportedly seen with really old & broken proxies)

** SECURITY **

  • fix CVE-2024-1135

  Documentation is available at https://docs.gunicorn.org/en/stable/news.html
  Packages: https://pypi.org/project/gunicorn/
Commits
  • f63d59e bump to 22.0
  • 4ac81e0 Merge pull request #3175 from e-kwsm/typo
  • 401cecf Merge pull request #3179 from dhdaines/exclude-eventlet-0360
  • 0243ec3 fix(deps): exclude eventlet 0.36.0
  • 628a0bc chore: fix typos
  • 88fc4a4 Merge pull request #3131 from pajod/patch-py12-rebased
  • deae2fc CI: back off the agressive timeout
  • f470382 docs: promise 3.12 compat
  • 5e30bfa add changelog to project.urls (updated for PEP621)
  • 481c3f9 remove setup.cfg - overridden by pyproject.toml
  • Additional commits viewable in compare view

Updates requests from 2.31.0 to 2.32.2

Release notes

Sourced from requests's releases.

v2.32.2

2.32.2 (2024-05-21)

Deprecations

  • To provide a more stable migration for custom HTTPAdapters impacted by the CVE changes in 2.32.0, we've renamed _get_connection to a new public API, get_connection_with_tls_context. Existing custom HTTPAdapters will need to migrate their code to use this new API. get_connection is considered deprecated in all versions of Requests>=2.32.0.

    A minimal (2-line) example has been provided in the linked PR to ease migration, but we strongly urge users to evaluate if their custom adapter is subject to the same issue described in CVE-2024-35195. (#6710)
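
For illustration only (not part of the release notes): a minimal sketch of a custom adapter migrating to the new hook, assuming requests>=2.32.0. The LoggingAdapter name and its logging behaviour are hypothetical.

import requests
from requests.adapters import HTTPAdapter

class LoggingAdapter(HTTPAdapter):
    # Override the new TLS-aware hook instead of the deprecated get_connection.
    def get_connection_with_tls_context(self, request, verify, proxies=None, cert=None):
        # The hook receives the PreparedRequest plus verify/cert, so the
        # returned connection is bound to the correct SSL context.
        print("connecting to", request.url)
        return super().get_connection_with_tls_context(
            request, verify, proxies=proxies, cert=cert
        )

session = requests.Session()
session.mount("https://", LoggingAdapter())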

v2.32.1

2.32.1 (2024-05-20)

Bugfixes

  • Add missing test certs to the sdist distributed on PyPI.

v2.32.0

2.32.0 (2024-05-20)

🐍 PYCON US 2024 EDITION 🐍

Security

  • Fixed an issue where setting verify=False on the first request from a Session will cause subsequent requests to the same origin to also ignore cert verification, regardless of the value of verify. (GHSA-9wx4-h78v-vm56)
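
The fixed behaviour can be illustrated with a short sketch (hypothetical host, assuming requests>=2.32.0):

import requests

session = requests.Session()

# A one-off request may skip verification...
session.get("https://self-signed.internal.example", verify=False)

# ...but this second request still verifies the certificate. Before the fix,
# the unverified connection to the same origin could be reused, silently
# disabling verification for the rest of the Session.
session.get("https://self-signed.internal.example")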

Improvements

  • verify=True now reuses a global SSLContext which should improve request time variance between first and subsequent requests. It should also minimize certificate load time on Windows systems when using a Python version built with OpenSSL 3.x. (#6667)
  • Requests now supports optional use of character detection (chardet or charset_normalizer) when repackaged or vendored. This enables pip and other projects to minimize their vendoring surface area. The Response.text() and apparent_encoding APIs will default to utf-8 if neither library is present. (#6702)

Bugfixes

  • Fixed bug in length detection where emoji length was incorrectly calculated in the request content-length. (#6589)
  • Fixed deserialization bug in JSONDecodeError. (#6629)
  • Fixed bug where an extra leading / (path separator) could lead urllib3 to unnecessarily reparse the request URI. (#6644)

... (truncated)


Commits
  • 88dce9d v2.32.2
  • c98e4d1 Merge pull request #6710 from nateprewitt/api_rename
  • 92075b3 Add deprecation warning
  • aa1461b Move _get_connection to get_connection_with_tls_context
  • 970e8ce v2.32.1
  • d6ebc4a v2.32.0
  • 9a40d12 Avoid reloading root certificates to improve concurrent performance (#6667)
  • 0c030f7 Merge pull request #6702 from nateprewitt/no_char_detection
  • 555b870 Allow character detection dependencies to be optional in post-packaging steps
  • d6dded3 Merge pull request #6700 from franekmagiera/update-redirect-to-invalid-uri-test
  • Additional commits viewable in compare view

Updates onnx from 1.15.0 to 1.16.0

Release notes

Sourced from onnx's releases.

v1.16.0

ONNX v1.16.0 is now available with exciting new features! We would like to thank everyone who contributed to this release! Please visit onnx.ai to learn more about ONNX and associated projects.

Key Updates

ai.onnx Opset 21

ai.onnx.ml Opset 4

IR Version 10

  • Added support for UINT4, INT4 types
  • GraphProto, FunctionProto, NodeProto, TensorProto added metadata_props field
  • FunctionProto added value_info field
  • FunctionProto and NodeProto added overload field to support overloaded functions.
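
A minimal sketch (not from the release notes) of attaching metadata_props to a graph, assuming onnx>=1.16; the key/value strings and graph contents are arbitrary examples.

import onnx
from onnx import TensorProto, helper

x = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1])
y = helper.make_tensor_value_info("y", TensorProto.FLOAT, [1])
relu = helper.make_node("Relu", ["x"], ["y"])
graph = helper.make_graph([relu], "demo", [x], [y])

# New with IR version 10: GraphProto (and Node/Tensor/FunctionProto) carry metadata_props.
entry = graph.metadata_props.add()
entry.key = "source"
entry.value = "example-pipeline"

model = helper.make_model(graph, opset_imports=[helper.make_operatorsetid("", 21)])
onnx.checker.check_model(model)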

Python Changes

  • Support registering custom OpSchemas via Python interface
  • Support Python3.12

Security Updates

  • Fix path sanitization bypass leading to arbitrary read (CVE-2024-27318)
  • Fix Out of bounds read due to lack of string termination in assert (CVE-2024-27319)

Deprecation notice

Bug fixes and infrastructure improvements

  • Enable empty list of values as attribute (#5559)
  • Add backward conversions from 18->17 for reduce ops (#5606)
  • DFT-20 version converter (#5613)
  • Fix version-converter to generate valid identifiers (#5628)
  • Reserve removed proto fields (#5643)
  • Cleanup shape inference implementation (#5596)
  • Do not use LFS64 on non-glibc linux (#5669)
  • Drop "one of" default attribute check in LabelEncoder (#5673)
  • TreeEnsemble base values for the reference implementation (#5665)
  • Parser/printer support external data format (#5688)
  • [cmake] Place export target file in the correct directory (#5677)

... (truncated)

Commits

Updates tqdm from 4.66.1 to 4.66.3

Release notes

Sourced from tqdm's releases.

tqdm v4.66.3 stable

tqdm v4.66.2 stable

  • pandas: add DataFrame.progress_map (#1549); a usage sketch follows this list
  • notebook: fix HTML padding (#1506)
  • keras: fix resuming training when verbose>=2 (#1508)
  • fix format_num negative fractions missing leading zero (#1548)
  • fix Python 3.12 DeprecationWarning on import (#1519)
  • linting: use f-strings (#1549)
  • update tests (#1549)
  • CI: bump actions (#1549)
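
A usage sketch for the DataFrame.progress_map addition (not from the release notes), assuming tqdm>=4.66.2 and pandas>=2.1, whose DataFrame.map it mirrors:

import pandas as pd
from tqdm import tqdm

tqdm.pandas()  # registers progress_* methods on pandas objects

df = pd.DataFrame({"a": range(10_000), "b": range(10_000)})
result = df.progress_map(lambda v: v * 2)  # element-wise map with a progress bar
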
Commits

Updates transformers from 4.36 to 4.38.0

Release notes

Sourced from transformers's releases.

v4.38: Gemma, Depth Anything, Stable LM; Static Cache, HF Quantizer, AQLM

New model additions

💎 Gemma 💎

Gemma is a new open-source language model series from Google AI that comes in 2B and 7B variants. The release includes both pre-trained and instruction fine-tuned versions, and you can use them via AutoModelForCausalLM, GemmaForCausalLM, or the pipeline interface!

Read more about it in the Gemma release blogpost: https://hf.co/blog/gemma

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("google/gemma-2b")
model = AutoModelForCausalLM.from_pretrained("google/gemma-2b", device_map="auto", torch_dtype=torch.float16)

input_text = "Write me a poem about Machine Learning."
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")  # assumes a CUDA device is available
outputs = model.generate(**input_ids)
print(tokenizer.decode(outputs[0]))

You can use the model with Flash Attention, SDPA, the static cache, and the quantization API for further optimizations!

  • Flash Attention 2
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("google/gemma-2b")
model = AutoModelForCausalLM.from_pretrained(
    "google/gemma-2b", device_map="auto", torch_dtype=torch.float16, attn_implementation="flash_attention_2"
)

input_text = "Write me a poem about Machine Learning."
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")
outputs = model.generate(**input_ids)

  • bitsandbytes-4bit
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("google/gemma-2b")
model = AutoModelForCausalLM.from_pretrained(
    "google/gemma-2b", device_map="auto", load_in_4bit=True  # requires the bitsandbytes package
)

... (truncated)

Commits
  • 08ab54a [ gemma] Adds support for Gemma 💎 (#29167)
  • 2de9314 [Maskformer] safely get backbone config (#29166)
  • 476957b 🚨 Llama: update rope scaling to match static cache changes (#29143)
  • 7a4bec6 Release: 4.38.0
  • ee3af60 Add support for fine-tuning CLIP-like models using contrastive-image-text exa...
  • 0996a10 Revert low cpu mem tie weights (#29135)
  • 15cfe38 [Core tokenization] add_dummy_prefix_space option to help with latest is...
  • efdd436 FIX [PEFT / Trainer ] Handle better peft + quantized compiled models (#29...
  • 5e95dca [cuda kernels] only compile them when initializing (#29133)
  • a7755d2 Generate: unset GenerationConfig parameters do not raise warning (#29119)
  • Additional commits viewable in compare view

Updates text-generation from 0.6.1 to 0.7.0

Release notes

Sourced from text-generation's releases.

v0.7.0

Features

  • server: reduce vram requirements of continuous batching (contributed by @​njhill)
  • server: Support BLOOMChat-176B (contributed by @​njhill)
  • server: add watermarking tests (contributed by @​ehsanmok)
  • router: Adding response schema for compat_generate (contributed by @​gsaivinay)
  • router: use number of tokens in batch as input for dynamic batching (co-authored by @njhill)
  • server: improve download and decrease conversion to safetensors RAM requirements
  • server: optimize flash causal lm decode token
  • server: shard decode token
  • server: use cuda graph in logits warping
  • server: support trust_remote_code
  • tests: add snapshot testing

Fix

  • server: use float16
  • server: fix multinomial implementation in Sampling
  • server: do not use device_map auto on single GPU

Misc

  • docker: use nvidia base image

New Contributors

Full Changelog: huggingface/text-generation-inference@v0.6.0...v0.7.0
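
The pip package bumped here is the Python client for this server. A minimal usage sketch (not from the release notes), assuming a text-generation-inference server is already listening on localhost:8080:

from text_generation import Client

client = Client("http://127.0.0.1:8080")

# Single-shot generation
response = client.generate("Why is the sky blue?", max_new_tokens=32)
print(response.generated_text)

# Streaming generation
for chunk in client.generate_stream("Why is the sky blue?", max_new_tokens=32):
    if not chunk.token.special:
        print(chunk.token.text, end="")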

Commits

Updates gradio from 3.47.1 to 4.19.2

Release notes

Sourced from gradio's releases.

@​gradio/model3d@​0.10.7

Dependency updates

  • @​gradio/upload@​0.10.7
  • @​gradio/client@​0.20.1

@​gradio/model3d@​0.10.6

Fixes

Dependency updates

  • @​gradio/client@​0.20.0
  • @​gradio/statustracker@​0.6.0
  • @​gradio/upload@​0.10.6

@​gradio/model3d@​0.10.5

Dependency updates

  • @​gradio/utils@​0.4.2
  • @​gradio/atoms@​0.7.4
  • @​gradio/statustracker@​0.5.5
  • @​gradio/upload@​0.10.5
  • @​gradio/client@​0.19.4
Changelog

Sourced from gradio's changelog.

4.19.2

Features

Fixes

4.19.1

Features

Fixes

4.19.0

Features

Fixes

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR...

Description has been truncated

Updates the requirements on [gunicorn](https://github.com/benoitc/gunicorn), [requests](https://github.com/psf/requests), [onnx](https://github.com/onnx/onnx), [tqdm](https://github.com/tqdm/tqdm), [transformers](https://github.com/huggingface/transformers), [text-generation](https://github.com/huggingface/text-generation-inference) and [gradio](https://github.com/gradio-app/gradio) to permit the latest version.

Updates `gunicorn` from 21.2.0 to 22.0.0
- [Release notes](https://github.com/benoitc/gunicorn/releases)
- [Commits](benoitc/gunicorn@21.2.0...22.0.0)

Updates `requests` from 2.31.0 to 2.32.2
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](psf/requests@v2.31.0...v2.32.2)

Updates `onnx` from 1.15.0 to 1.16.0
- [Release notes](https://github.com/onnx/onnx/releases)
- [Changelog](https://github.com/onnx/onnx/blob/main/docs/Changelog-ml.md)
- [Commits](onnx/onnx@v1.15.0...v1.16.0)

Updates `gunicorn` to 22.0.0
- [Release notes](https://github.com/benoitc/gunicorn/releases)
- [Commits](benoitc/gunicorn@21.2.0...22.0.0)

Updates `onnx` from 1.15.0 to 1.16.0
- [Release notes](https://github.com/onnx/onnx/releases)
- [Changelog](https://github.com/onnx/onnx/blob/main/docs/Changelog-ml.md)
- [Commits](onnx/onnx@v1.15.0...v1.16.0)

Updates `tqdm` from 4.66.1 to 4.66.3
- [Release notes](https://github.com/tqdm/tqdm/releases)
- [Commits](tqdm/tqdm@v4.66.1...v4.66.3)

Updates `transformers` from 4.36 to 4.38.0
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v4.36.0...v4.38.0)

Updates `text-generation` from 0.6.1 to 0.7.0
- [Release notes](https://github.com/huggingface/text-generation-inference/releases)
- [Commits](https://github.com/huggingface/text-generation-inference/commits/v0.7.0)

Updates `text-generation` from 0.6.1 to 0.7.0
- [Release notes](https://github.com/huggingface/text-generation-inference/releases)
- [Commits](https://github.com/huggingface/text-generation-inference/commits/v0.7.0)

Updates `gradio` from 3.47.1 to 4.19.2
- [Release notes](https://github.com/gradio-app/gradio/releases)
- [Changelog](https://github.com/gradio-app/gradio/blob/main/CHANGELOG.md)
- [Commits](https://github.com/gradio-app/gradio/compare/gradio@3.47.1...gradio@4.19.2)

---
updated-dependencies:
- dependency-name: gunicorn
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: requests
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: onnx
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: gunicorn
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: onnx
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: tqdm
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: transformers
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: text-generation
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: text-generation
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: gradio
  dependency-type: direct:production
  dependency-group: pip
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) label on Jun 2, 2024

dependabot[bot] commented on behalf of GitHub on Jun 6, 2024

Looks like these dependencies are updatable in another way, so this is no longer needed.

dependabot[bot] closed this on Jun 6, 2024
dependabot[bot] deleted the dependabot/pip/pip-682fb84903 branch on June 6, 2024 at 17:22