
chore(deps): update dependency aiohttp to v3.11.11 #28

Open
wants to merge 1 commit into main from renovate/aiohttp-3.x

Conversation

renovate[bot]
Contributor

@renovate renovate bot commented Dec 11, 2024

This PR contains the following updates:

Package: aiohttp
Change: ==3.11.9 -> ==3.11.11

Release Notes

aio-libs/aiohttp (aiohttp)

v3.11.11

Compare Source


Bug fixes

  • Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing the ``cookies`` parameter.
    -- by :user:`Cycloctane`.

    Related issues and pull requests on GitHub:
    :issue:`10093`.
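The ``quote_cookie`` flag ultimately controls whether outgoing cookie values go through ``http.cookies``-style quoting. A minimal stdlib sketch of that quoting behavior (illustrative only, not aiohttp's actual code path):

```python
from http.cookies import SimpleCookie

# aiohttp's CookieJar(quote_cookie=...) decides whether outgoing
# values receive this http.cookies-style quoting.
cookie = SimpleCookie()
cookie["session"] = "a b"  # a space forces quoting
rendered = cookie["session"].OutputString()
print(rendered)  # session="a b"
```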

  • Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).

    Related issues and pull requests on GitHub:
    :issue:`10099`.

  • Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.

    Related issues and pull requests on GitHub:
    :issue:`10154`.
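The widened annotation formalizes that any bytes-like object works; a stdlib sketch of the same duck typing against a plain in-memory buffer:

```python
import io

# bytes, bytearray, and memoryview all satisfy the buffer protocol,
# which is what the widened write() annotation now reflects.
buf = io.BytesIO()
for chunk in (b"abc", bytearray(b"def"), memoryview(b"ghi")):
    buf.write(chunk)
print(buf.getvalue())  # b'abcdefghi'
```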

  • Fixed a hang where a connection previously used for a streaming
    download could be returned to the pool in a paused state.
    -- by :user:`javitonino`.

    Related issues and pull requests on GitHub:
    :issue:`10169`.

Features

  • Enabled ALPN on default SSL contexts. This improves compatibility with some
    proxies which don't work without this extension.
    -- by :user:`Cycloctane`.

    Related issues and pull requests on GitHub:
    :issue:`10156`.
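The change amounts to calling ``set_alpn_protocols`` on the contexts aiohttp builds (the diff below shows it applied in ``_make_ssl_context``); the stdlib call, as a sketch:

```python
import ssl

# A default context that advertises HTTP/1.1 via the ALPN TLS
# extension, mirroring what the fixed contexts now do for both
# the verified and unverified cases.
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(["http/1.1"])
```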

Miscellaneous internal changes

  • Fixed an infinite loop that can occur when using aiohttp in combination
    with `async-solipsism`_ -- by :user:`bmerry`.

    .. _async-solipsism: https://github.com/bmerry/async-solipsism

    Related issues and pull requests on GitHub:
    :issue:`10149`.


v3.11.10

Compare Source


Bug fixes

  • Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10101`, :issue:`10113`.
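The core of the fix is re-statting through the open file descriptor, so the metadata matches the file actually opened even if the path is replaced in the meantime. A stdlib sketch of that pattern (the temp file here is illustrative):

```python
import os
import tempfile

# Stat via the fd (fstat) rather than the path: the result is pinned
# to the file object we hold, not to whatever the path points at now.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"data")
    path = f.name

fobj = open(path, "rb")
st = os.stat(fobj.fileno())  # fd argument -> fstat semantics
fobj.close()
os.unlink(path)
print(st.st_size)  # 4
```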

  • Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10102`.
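The version gate the fix introduces, sketched with the stdlib:

```python
import mimetypes
import sys

# On 3.13+, guess_type() on a path is deprecated in favor of
# guess_file_type(); older interpreters fall back to guess_type().
if sys.version_info >= (3, 13):
    guesser = mimetypes.guess_file_type
else:
    guesser = mimetypes.guess_type

content_type, encoding = guesser("report.json")
print(content_type)  # application/json
```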

  • Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10125`.
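As the diff below shows, the disable is simply joining the chunks into one buffer and making a single ``transport.write()`` call instead of ``transport.writelines()``; the effect on the payload, sketched:

```python
# writelines(chunks) may hand the individual buffers to the transport's
# zero-copy path; joining first sends one contiguous, already-copied
# buffer in a single write() call.
chunks = [b"HTTP/1.1 200 OK\r\n", b"Content-Length: 4\r\n\r\n", b"body"]
payload = b"".join(chunks)
print(len(payload))  # 42
```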



Configuration

📅 Schedule: Branch creation - "* 0-4 * * 3" (UTC, i.e. Wednesdays between 00:00 and 04:59), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.



This PR was generated by Mend Renovate. View the repository job log.

@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.10 chore(deps): update dependency aiohttp to v3.11.11 Dec 21, 2024
@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from 52f07e5 to f0c973b Compare December 21, 2024 03:01
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.11 chore(deps): update dependency aiohttp to v3.11.10 Dec 21, 2024
@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from f0c973b to 2f0d6a1 Compare December 21, 2024 06:35
@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from 2f0d6a1 to cdf822c Compare December 25, 2024 21:58
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.10 chore(deps): update dependency aiohttp to v3.11.11 Dec 25, 2024

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..d5e119b779d 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..b07cec6a093 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,114 @@
 
 .. towncrier release notes start
 
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..589784b29cb 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -170,6 +171,7 @@ Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +366,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..b9af3f829f7 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.11"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..3b1dc08544f 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..7e0986df657 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..28b14f7a791 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -72,7 +72,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +90,14 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        transport.write(b"".join(chunks))
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..6126fb5695d 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # Ie a regular file to send
+    NOT_ACCEPTABLE = auto()  # Ie a socket, or non-regular file
+    PRE_CONDITION_FAILED = auto()  # Ie If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or int=0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..3306b86bded 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..e498a905caf 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..8ed121ac955 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -6,7 +6,7 @@
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +17,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..c4e10b44987 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -245,6 +245,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..740e3e2d559 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..72e49ed9edf 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..892ae6b164c 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..f7f98330e1f 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git a/tests/test_benchmarks_web_fileresponse.py b/tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..05af9ae25ad 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..5f316fad2f7 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -104,16 +104,15 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
     transport.write.reset_mock()  # type: ignore[attr-defined]
-    assert not transport.writelines.called  # type: ignore[attr-defined]
 
     # This payload compresses to 20447 bytes
     payload = b"".join(
         [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
     )
     await msg.write_eof(payload)
-    assert not transport.write.called  # type: ignore[attr-defined]
-    assert transport.writelines.called  # type: ignore[attr-defined]
-    chunks.extend(transport.writelines.mock_calls[0][1][0])  # type: ignore[attr-defined]
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
     content = b"".join(chunks)
     assert zlib.decompress(content) == (b"data" * 4096) + payload
 
@@ -180,7 +179,7 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
-    chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
     assert content == expected
@@ -216,7 +215,7 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
-    chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
     assert content == expected
@@ -235,16 +234,16 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
     await msg.write_eof(payload)
-    assert not transport.write.called  # type: ignore[attr-defined]
 
-    chunks = []
-    for write_lines_call in transport.writelines.mock_calls:  # type: ignore[attr-defined]
-        chunked_payload = list(write_lines_call[1][0])[1:]
-        chunked_payload.pop()
-        chunks.extend(chunked_payload)
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
 
-    assert all(chunks)
-    content = b"".join(chunks)
+    content = b"".join(compressed)
     assert zlib.decompress(content) == (b"data" * 4096) + payload
 
 
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))

Description

This pull request updates aiohttp from 3.11.9 to 3.11.11. The release brings bug fixes and improvements, including refactored SSL context handling, file response optimizations, ALPN support on default SSL contexts, and several other enhancements across modules.

Changes


  1. .github/workflows/ci-cd.yml:

    • Updated the actions/cache version to 4.2.0
    • Changed Python version for setup from 3.12 to 3.13
  2. CHANGES.rst:

    • Added changelog entries for versions 3.11.11 and 3.11.10, including bug fixes and features
  3. aiohttp/__init__.py:

    • Updated version to 3.11.11
  4. aiohttp/abc.py:

    • Added quote_cookie property to AbstractCookieJar
    • Updated write method signature in AbstractStreamWriter to accept Union[bytes, bytearray, memoryview]
  5. aiohttp/client.py:

    • Updated _request method to reuse quote_cookie setting from ClientSession._cookie_jar
  6. aiohttp/client_exceptions.py, aiohttp/client_reqrep.py, aiohttp/connector.py, aiohttp/web_runner.py, aiohttp/worker.py:

    • Refactored SSL context imports for better type checking
  7. aiohttp/connector.py:

    • Added ALPN protocol setting for SSL context
  8. aiohttp/cookiejar.py:

    • Implemented quote_cookie property
  9. aiohttp/http_writer.py:

    • Updated _write and write methods to accept Union[bytes, bytearray, memoryview]
  10. aiohttp/payload.py:

    • Updated mimetypes.guess_type usage for Python 3.13+ compatibility
  11. aiohttp/streams.py:

    • Added logic to resume reading when EOF is received
  12. aiohttp/web_fileresponse.py:

    • Refactored file response handling for better performance and security
  13. aiohttp/web_protocol.py:

    • Updated keepalive handling logic
  14. Various test files:

    • Added new tests and updated existing ones to cover the changes
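Item 10 above (the `mimetypes.guess_type` compatibility change in `aiohttp/payload.py` and `aiohttp/web_fileresponse.py`) can be sketched with the standard library alone. Python 3.13 adds `guess_file_type()` and deprecates calling `guess_type()` with filesystem paths, so the code picks the guesser at import time. The `file_content_type` helper and `FALLBACK_CONTENT_TYPE` name are illustrative, not aiohttp's exact internals:

```python
import mimetypes
import sys

CONTENT_TYPES = mimetypes.MimeTypes()
FALLBACK_CONTENT_TYPE = "application/octet-stream"

# Python 3.13 adds guess_file_type(); guess_type() on paths is deprecated there.
if sys.version_info >= (3, 13):
    guesser = CONTENT_TYPES.guess_file_type
else:
    guesser = CONTENT_TYPES.guess_type


def file_content_type(path: str) -> str:
    # Fall back to a generic binary type when the extension is unknown.
    return guesser(path)[0] or FALLBACK_CONTENT_TYPE
```

For example, `file_content_type("report.pdf")` resolves to `application/pdf`, while an unrecognized extension falls back to `application/octet-stream`.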
sequenceDiagram
    participant Client
    participant ClientSession
    participant Connector
    participant FileResponse
    participant StreamWriter

    Client->>ClientSession: request()
    ClientSession->>Connector: connect()
    Connector->>Connector: _make_ssl_context()
    Note over Connector: Set ALPN protocols
    Connector-->>ClientSession: Connection
    ClientSession->>FileResponse: prepare()
    FileResponse->>FileResponse: _make_response()
    FileResponse->>StreamWriter: write()
    StreamWriter-->>ClientSession: Response
    ClientSession-->>Client: Response

Possible Issues

  • The change from Python 3.12 to 3.13 in the CI/CD workflow might cause issues with dependencies that do not yet publish wheels for Python 3.13.

Security Hotspots

No significant security hotspots were identified in this change.

This sequence diagram illustrates the main flow of a client request, highlighting the areas where significant changes have been made, such as SSL context creation with ALPN support and the optimized file response handling.
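The ALPN feature from the release notes can be approximated with the standard `ssl` module. This is a hedged sketch of a client context comparable to what the connector builds; `make_ssl_context` and its `verified` parameter are illustrative names, not aiohttp's actual helper:

```python
import ssl


def make_ssl_context(verified: bool = True) -> ssl.SSLContext:
    """Build a TLS client context, advertising ALPN like aiohttp 3.11.11 does."""
    if verified:
        ctx = ssl.create_default_context()
    else:
        # Unverified variant, e.g. for self-signed test servers.
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    # Advertising ALPN improves compatibility with proxies that
    # refuse handshakes lacking the extension (issue 10156).
    ctx.set_alpn_protocols(["http/1.1"])
    return ctx
```

The `test_ssl_client_alpn` test in the diff above exercises exactly this behavior: the server reads `selected_alpn_protocol()` off the negotiated TLS object and expects `http/1.1`.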
