From 197feec3ac1e4339d0434090542e950c920ad77f Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Mon, 9 Jun 2025 23:54:04 +0000 Subject: [PATCH 1/8] Bump frozenlist from 1.6.2 to 1.7.0 (#11167) Bumps [frozenlist](https://github.com/aio-libs/frozenlist) from 1.6.2 to 1.7.0.
Release notes

Sourced from frozenlist's releases.

1.7.0

Features

Packaging updates and notes for downstreams


Changelog

Sourced from frozenlist's changelog.

v1.7.0

(2025-06-09)

Features

Packaging updates and notes for downstreams


Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=frozenlist&package-manager=pip&previous-version=1.6.2&new-version=1.7.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 5 files changed, 5 insertions(+), 5 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index 48025aef464..bce04a59359 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -16,7 +16,7 @@ brotli==1.1.0 ; platform_python_implementation == "CPython" # via -r requirements/runtime-deps.in cffi==1.17.1 # via pycares -frozenlist==1.6.2 +frozenlist==1.7.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/constraints.txt b/requirements/constraints.txt index ca30692a878..bcb2db9a085 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -78,7 +78,7 @@ freezegun==1.5.2 # via # -r requirements/lint.in # -r requirements/test.in -frozenlist==1.6.2 +frozenlist==1.7.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/dev.txt b/requirements/dev.txt index f3efd6627b5..bc14fd87460 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -76,7 +76,7 @@ freezegun==1.5.2 # via # -r requirements/lint.in # -r requirements/test.in -frozenlist==1.6.2 +frozenlist==1.7.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 49eaf6d44a4..13862e30ffc 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -16,7 +16,7 @@ brotli==1.1.0 ; platform_python_implementation == "CPython" # via -r requirements/runtime-deps.in cffi==1.17.1 # via pycares -frozenlist==1.6.2 +frozenlist==1.7.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/test.txt b/requirements/test.txt index f5ea10dc5e8..6e57578517b 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -39,7 +39,7 @@ 
forbiddenfruit==0.1.4 # via blockbuster freezegun==1.5.2 # via -r requirements/test.in -frozenlist==1.6.2 +frozenlist==1.7.0 # via # -r requirements/runtime-deps.in # aiosignal From fbdb9282a52379a71b4f8d3f1d3079f78f63b5ed Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 10 Jun 2025 00:03:42 +0000 Subject: [PATCH 2/8] Bump propcache from 0.3.1 to 0.3.2 (#11170) Bumps [propcache](https://github.com/aio-libs/propcache) from 0.3.1 to 0.3.2.
Release notes

Sourced from propcache's releases.

0.3.2

Improved documentation

  • Fixed incorrect decorator usage in the :func:`~propcache.api.under_cached_property` example code -- by :user:`meanmail`.

    Related issues and pull requests on GitHub: #109.

Packaging updates and notes for downstreams

  • Updated to use Cython 3.1 universally across the build path -- by :user:`lysnikolaou`.

    Related issues and pull requests on GitHub: #117.

  • Made Cython line tracing opt-in via the `with-cython-tracing` build config setting -- by :user:`bdraco`.

    Previously, line tracing was enabled by default in :file:`pyproject.toml`, which caused build issues for some users and made wheels nearly twice as slow.

    Now line tracing is only enabled when explicitly requested via `pip install . --config-setting=with-cython-tracing=true` or by setting the `PROPCACHE_CYTHON_TRACING` environment variable.

    Related issues and pull requests on GitHub: #118.
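The opt-in described above can be exercised from the command line; a sketch of the two paths named in the release notes (the accepted values for the environment variable are an assumption here):

```shell
# Default build: Cython line tracing disabled (fast wheels)
pip install .

# Opt in explicitly via the build config setting
pip install . --config-setting=with-cython-tracing=true

# ...or via the environment variable from the release notes
PROPCACHE_CYTHON_TRACING=true pip install .
```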


Changelog

Sourced from propcache's changelog.

0.3.2

(2025-06-09)

Improved documentation

  • Fixed incorrect decorator usage in the :func:`~propcache.api.under_cached_property` example code -- by :user:`meanmail`.

    Related issues and pull requests on GitHub: :issue:`109`.

Packaging updates and notes for downstreams

  • Updated to use Cython 3.1 universally across the build path -- by :user:`lysnikolaou`.

    Related issues and pull requests on GitHub: :issue:`117`.

  • Made Cython line tracing opt-in via the `with-cython-tracing` build config setting -- by :user:`bdraco`.

    Previously, line tracing was enabled by default in :file:`pyproject.toml`, which caused build issues for some users and made wheels nearly twice as slow.

    Now line tracing is only enabled when explicitly requested via `pip install . --config-setting=with-cython-tracing=true` or by setting the `PROPCACHE_CYTHON_TRACING` environment variable.

    Related issues and pull requests on GitHub: :issue:`118`.


Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=propcache&package-manager=pip&previous-version=0.3.1&new-version=0.3.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 5 files changed, 5 insertions(+), 5 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index bce04a59359..c39a3882253 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -30,7 +30,7 @@ multidict==6.4.4 # yarl packaging==25.0 # via gunicorn -propcache==0.3.1 +propcache==0.3.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/constraints.txt b/requirements/constraints.txt index bcb2db9a085..c062316475d 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -143,7 +143,7 @@ pluggy==1.6.0 # via pytest pre-commit==4.2.0 # via -r requirements/lint.in -propcache==0.3.1 +propcache==0.3.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/dev.txt b/requirements/dev.txt index bc14fd87460..771bd3da18f 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -140,7 +140,7 @@ pluggy==1.6.0 # via pytest pre-commit==4.2.0 # via -r requirements/lint.in -propcache==0.3.1 +propcache==0.3.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 13862e30ffc..a8470fc1234 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -26,7 +26,7 @@ multidict==6.4.4 # via # -r requirements/runtime-deps.in # yarl -propcache==0.3.1 +propcache==0.3.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/test.txt b/requirements/test.txt index 6e57578517b..85c585ce86c 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -73,7 +73,7 @@ pkgconfig==1.5.5 # via -r requirements/test.in pluggy==1.6.0 # via pytest -propcache==0.3.1 +propcache==0.3.2 # via # -r requirements/runtime-deps.in # yarl 
From b8eb2e69053195b95ba4112e2445d22aabedd6dc Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 10 Jun 2025 00:44:36 +0000 Subject: [PATCH 3/8] Bump requests from 2.32.3 to 2.32.4 (#11168) Bumps [requests](https://github.com/psf/requests) from 2.32.3 to 2.32.4.
Release notes

Sourced from requests's releases.

v2.32.4

2.32.4 (2025-06-10)

Security

  • CVE-2024-47081 Fixed an issue where a maliciously crafted URL combined with a trusted environment could retrieve credentials for the wrong hostname/machine from a netrc file. (#6965)
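A later commit in this PR ("Only use hostname to do netrc lookup instead of netloc") hints at the mechanism. A minimal stdlib sketch of why keying a netrc lookup on the raw netloc is dangerous (the URL below is a hypothetical crafted example, not from the advisory):

```python
from urllib.parse import urlparse

# Crafted URL: "example.com:" is actually userinfo (user "example.com",
# empty password), not the host, so a lookup keyed on the raw netloc
# would fetch credentials for the wrong machine.
url = "https://example.com:@evil.test/"
parts = urlparse(url)

print(parts.netloc)    # 'example.com:@evil.test' -- misleading as a netrc key
print(parts.hostname)  # 'evil.test' -- the host the request actually reaches
```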

Improvements

  • Numerous documentation improvements

Deprecations

  • Added support for pypy 3.11 for Linux and macOS. (#6926)
  • Dropped support for pypy 3.9 following its end of support. (#6926)
Changelog

Sourced from requests's changelog.

2.32.4 (2025-06-10)

Security

  • CVE-2024-47081 Fixed an issue where a maliciously crafted URL combined with a trusted environment could retrieve credentials for the wrong hostname/machine from a netrc file.

Improvements

  • Numerous documentation improvements

Deprecations

  • Added support for pypy 3.11 for Linux and macOS.
  • Dropped support for pypy 3.9 following its end of support.
Commits
  • 021dc72 Polish up release tooling for last manual release
  • 821770e Bump version and add release notes for v2.32.4
  • 59f8aa2 Add netrc file search information to authentication documentation (#6876)
  • 5b4b64c Add more tests to prevent regression of CVE 2024 47081
  • 7bc4587 Add new test to check netrc auth leak (#6962)
  • 96ba401 Only use hostname to do netrc lookup instead of netloc
  • 7341690 Merge pull request #6951 from tswast/patch-1
  • 6716d7c remove links
  • a7e1c74 Update docs/conf.py
  • c799b81 docs: fix dead links to kenreitz.org
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=requests&package-manager=pip&previous-version=2.32.3&new-version=2.32.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/doc-spelling.txt | 2 +- requirements/doc.txt | 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index c062316475d..0e351052e24 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -201,7 +201,7 @@ python-on-whales==0.77.0 # -r requirements/test.in pyyaml==6.0.2 # via pre-commit -requests==2.32.3 +requests==2.32.4 # via # cherry-picker # sphinx diff --git a/requirements/dev.txt b/requirements/dev.txt index 771bd3da18f..dc47aade356 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -196,7 +196,7 @@ python-on-whales==0.77.0 # -r requirements/test.in pyyaml==6.0.2 # via pre-commit -requests==2.32.3 +requests==2.32.4 # via # cherry-picker # sphinx diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt index 83b310cc181..24978fc5f11 100644 --- a/requirements/doc-spelling.txt +++ b/requirements/doc-spelling.txt @@ -36,7 +36,7 @@ pyenchant==3.2.2 # via sphinxcontrib-spelling pygments==2.19.1 # via sphinx -requests==2.32.3 +requests==2.32.4 # via # sphinx # sphinxcontrib-spelling diff --git a/requirements/doc.txt b/requirements/doc.txt index 803a6281fef..3ae89177f54 100644 --- a/requirements/doc.txt +++ b/requirements/doc.txt @@ -34,7 +34,7 @@ packaging==25.0 # via sphinx pygments==2.19.1 # via sphinx -requests==2.32.3 +requests==2.32.4 # via sphinx snowballstemmer==2.2.0 # via sphinx From 7a8400d1555018c8084d931a730a1a5537cbcbf4 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 10 Jun 2025 01:06:02 +0000 Subject: [PATCH 4/8] Bump snowballstemmer from 2.2.0 to 3.0.1 (#10855) Bumps [snowballstemmer](https://github.com/snowballstem/snowball) from 2.2.0 to 3.0.1.
Changelog

Sourced from snowballstemmer's changelog.

Snowball 3.0.1 (2025-05-09)

Python

  • The `__init__.py` in 3.0.0 was incorrectly generated due to a missing build dependency and the list of algorithms was empty. First reported by laymonage. Thanks to Dmitry Shachnev, Henry Schreiner and Adam Turner for diagnosing and fixing. (#229, #230, #231)

  • Add trove classifiers for Armenian and Yiddish which have now been registered with PyPI. Thanks to Henry Schreiner and Dmitry Shachnev. (#228)

  • Update documented details of Python 2 support in old versions.

Snowball 3.0.0 (2025-05-08)

Ada

  • Bug fixes:

    • Fix invalid Ada code generated for Snowball loop (it was partly Pascal!) None of the stemmers shipped in previous releases triggered this bug, but the Turkish stemmer now does.

    • The Ada runtime was not tracking the current length of the string but instead used the current limit value or some other substitute, which manifested as various incorrect behaviours for code inside of setlimit.

    • size was incorrectly returning the difference between the limit and the backwards limit.

    • lenof or sizeof on a string variable generated Ada code that didn't even compile.

    • Fix incorrect preconditions on some methods in the runtime.

    • Fix bug in runtime code used by attach, insert, <- and string variable assignment when a (sub)string was replaced with a larger string. This bug was triggered by code in the Kraaij-Pohlmann Dutch stemmer implementation (which was previously not enabled by default but is now the standard Dutch stemmer).

    • Fix invalid code generated for insert, <- and string variable assignment. This bug was triggered by code in the Kraaij-Pohlmann Dutch stemmer implementation (which was previously not enabled by default but is now the standard Dutch stemmer).

... (truncated)

Commits
  • e4b3efb Update for 3.0.1
  • bbd3319 Protect empty languages dict
  • 298ff9f Update details of Python 2 support in old versions
  • 53fe098 python: Specify correct dependencies for $(python_output_dir)/__init__.py
  • 00a22de Stop excluding classifiers for Armenian and Yiddish
  • abd9adc Update for 3.0.0
  • d23d356 Back out incomplete ESM support for 3.0.0
  • ff42274 Update draft NEWS entry
  • cd61f01 tamil: remove_tense_suffix signals if ending removed
  • edfe576 nepali: Reformat amongs to be clearer
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=snowballstemmer&package-manager=pip&previous-version=2.2.0&new-version=3.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/doc-spelling.txt | 2 +- requirements/doc.txt | 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 0e351052e24..0ad364e6937 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -214,7 +214,7 @@ six==1.17.0 # via python-dateutil slotscheck==0.19.1 # via -r requirements/lint.in -snowballstemmer==2.2.0 +snowballstemmer==3.0.1 # via sphinx sphinx==8.1.3 # via diff --git a/requirements/dev.txt b/requirements/dev.txt index dc47aade356..126821097bb 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -208,7 +208,7 @@ six==1.17.0 # via python-dateutil slotscheck==0.19.1 # via -r requirements/lint.in -snowballstemmer==2.2.0 +snowballstemmer==3.0.1 # via sphinx sphinx==8.1.3 # via diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt index 24978fc5f11..1425e055c28 100644 --- a/requirements/doc-spelling.txt +++ b/requirements/doc-spelling.txt @@ -40,7 +40,7 @@ requests==2.32.4 # via # sphinx # sphinxcontrib-spelling -snowballstemmer==2.2.0 +snowballstemmer==3.0.1 # via sphinx sphinx==8.1.3 # via diff --git a/requirements/doc.txt b/requirements/doc.txt index 3ae89177f54..d440f9b3fca 100644 --- a/requirements/doc.txt +++ b/requirements/doc.txt @@ -36,7 +36,7 @@ pygments==2.19.1 # via sphinx requests==2.32.4 # via sphinx -snowballstemmer==2.2.0 +snowballstemmer==3.0.1 # via sphinx sphinx==8.1.3 # via From 4294d75c0c1d0b94e522eb010a68d7599b56a1d6 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 10 Jun 2025 02:14:07 +0000 Subject: [PATCH 5/8] Bump cryptography from 45.0.3 to 45.0.4 (#11176) Bumps [cryptography](https://github.com/pyca/cryptography) from 45.0.3 to 45.0.4.
Changelog

Sourced from cryptography's changelog.

45.0.4 - 2025-06-09


* Fixed decrypting PKCS#8 files encrypted with SHA1-RC4. (This is not
  considered secure, and is supported only for backwards compatibility.)

.. _v45-0-3:

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=cryptography&package-manager=pip&previous-version=45.0.3&new-version=45.0.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/lint.txt | 2 +- requirements/test.txt | 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 0ad364e6937..1148b4d1aba 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -56,7 +56,7 @@ coverage==7.8.2 # via # -r requirements/test.in # pytest-cov -cryptography==45.0.3 +cryptography==45.0.4 # via # pyjwt # trustme diff --git a/requirements/dev.txt b/requirements/dev.txt index 126821097bb..556651ac25d 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -56,7 +56,7 @@ coverage==7.8.2 # via # -r requirements/test.in # pytest-cov -cryptography==45.0.3 +cryptography==45.0.4 # via # pyjwt # trustme diff --git a/requirements/lint.txt b/requirements/lint.txt index 63011110409..8ea3ea4e867 100644 --- a/requirements/lint.txt +++ b/requirements/lint.txt @@ -21,7 +21,7 @@ cfgv==3.4.0 # via pre-commit click==8.1.8 # via slotscheck -cryptography==45.0.3 +cryptography==45.0.4 # via trustme distlib==0.3.9 # via virtualenv diff --git a/requirements/test.txt b/requirements/test.txt index 85c585ce86c..b8a3a70cd7b 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -29,7 +29,7 @@ coverage==7.8.2 # via # -r requirements/test.in # pytest-cov -cryptography==45.0.3 +cryptography==45.0.4 # via trustme exceptiongroup==1.3.0 # via pytest From 311ee1f5435d3731bf69837848b83e57397df2a3 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 10 Jun 2025 02:17:49 +0000 Subject: [PATCH 6/8] Bump yarl from 1.20.0 to 1.20.1 (#11177) Bumps [yarl](https://github.com/aio-libs/yarl) from 1.20.0 to 1.20.1.
Release notes

Sourced from yarl's releases.

1.20.1

Bug fixes

  • Started raising a :exc:`ValueError` exception for corrupted IPv6 URL values.

    This fixes an issue where an :exc:`IndexError` exception was leaking from the internal code instead of being caught and transformed into a user-facing error. The problem occurred under the following conditions: an empty IPv6 URL, or brackets in reverse order.

    -- by :user:`MaelPic`.

    Related issues and pull requests on GitHub: #1512.

Packaging updates and notes for downstreams

  • Updated to use Cython 3.1 universally across the build path -- by :user:`lysnikolaou`.

    Related issues and pull requests on GitHub: #1514.

  • Made Cython line tracing opt-in via the `with-cython-tracing` build config setting -- by :user:`bdraco`.

    Previously, line tracing was enabled by default in :file:`pyproject.toml`, which caused build issues for some users and made wheels nearly twice as slow. Now line tracing is only enabled when explicitly requested via `pip install . --config-setting=with-cython-tracing=true` or by setting the `YARL_CYTHON_TRACING` environment variable.

    Related issues and pull requests on GitHub: #1521.


Changelog

Sourced from yarl's changelog.

1.20.1

(2025-06-09)

Bug fixes

  • Started raising a :exc:`ValueError` exception for corrupted IPv6 URL values.

    This fixes an issue where an :exc:`IndexError` exception was leaking from the internal code instead of being caught and transformed into a user-facing error. The problem occurred under the following conditions: an empty IPv6 URL, or brackets in reverse order.

    -- by :user:`MaelPic`.

    Related issues and pull requests on GitHub: :issue:`1512`.

Packaging updates and notes for downstreams

  • Updated to use Cython 3.1 universally across the build path -- by :user:`lysnikolaou`.

    Related issues and pull requests on GitHub: :issue:`1514`.

  • Made Cython line tracing opt-in via the `with-cython-tracing` build config setting -- by :user:`bdraco`.

    Previously, line tracing was enabled by default in :file:`pyproject.toml`, which caused build issues for some users and made wheels nearly twice as slow. Now line tracing is only enabled when explicitly requested via `pip install . --config-setting=with-cython-tracing=true` or by setting the `YARL_CYTHON_TRACING` environment variable.

    Related issues and pull requests on GitHub: :issue:`1521`.


Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=yarl&package-manager=pip&previous-version=1.20.0&new-version=1.20.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 5 files changed, 5 insertions(+), 5 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index c39a3882253..0881620abd2 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -42,5 +42,5 @@ typing-extensions==4.13.2 # via multidict uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython" # via -r requirements/base.in -yarl==1.20.0 +yarl==1.20.1 # via -r requirements/runtime-deps.in diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 1148b4d1aba..d86bfba44ef 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -289,7 +289,7 @@ wait-for-it==2.3.0 # via -r requirements/test.in wheel==0.46.0 # via pip-tools -yarl==1.20.0 +yarl==1.20.1 # via -r requirements/runtime-deps.in zlib-ng==0.5.1 # via diff --git a/requirements/dev.txt b/requirements/dev.txt index 556651ac25d..6373ffa732f 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -280,7 +280,7 @@ wait-for-it==2.3.0 # via -r requirements/test.in wheel==0.46.0 # via pip-tools -yarl==1.20.0 +yarl==1.20.1 # via -r requirements/runtime-deps.in zlib-ng==0.5.1 # via diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index a8470fc1234..dcf1d81a2c8 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -36,5 +36,5 @@ pycparser==2.22 # via cffi typing-extensions==4.13.2 # via multidict -yarl==1.20.0 +yarl==1.20.1 # via -r requirements/runtime-deps.in diff --git a/requirements/test.txt b/requirements/test.txt index b8a3a70cd7b..2537ed97f0f 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -137,7 +137,7 @@ uvloop==0.21.0 ; platform_system != "Windows" and 
implementation_name == "cpython" # via -r requirements/base.in wait-for-it==2.3.0 # via -r requirements/test.in -yarl==1.20.0 +yarl==1.20.1 # via -r requirements/runtime-deps.in zlib-ng==0.5.1 # via -r requirements/test.in From 85b0df43bf99aeb1b5258aecae19d40a16ac273e Mon Sep 17 00:00:00 2001 From: "J. Nick Koston" Date: Mon, 9 Jun 2025 21:49:13 -0500 Subject: [PATCH 7/8] Fix cookie unquoting regression (#11173) --- CHANGES/11173.bugfix.rst | 1 + aiohttp/_cookie_helpers.py | 47 +++++-- docs/spelling_wordlist.txt | 1 + tests/test_cookie_helpers.py | 240 ++++++++++++++++++++++++++++++++++- 4 files changed, 279 insertions(+), 10 deletions(-) create mode 100644 CHANGES/11173.bugfix.rst diff --git a/CHANGES/11173.bugfix.rst b/CHANGES/11173.bugfix.rst new file mode 100644 index 00000000000..9214080d267 --- /dev/null +++ b/CHANGES/11173.bugfix.rst @@ -0,0 +1 @@ +Fixed cookie unquoting to properly handle octal escape sequences in cookie values (e.g., ``\012`` for newline) by vendoring the correct ``_unquote`` implementation from Python's ``http.cookies`` module -- by :user:`bdraco`. diff --git a/aiohttp/_cookie_helpers.py b/aiohttp/_cookie_helpers.py index 8184cc9bdc1..a5b4f81c78f 100644 --- a/aiohttp/_cookie_helpers.py +++ b/aiohttp/_cookie_helpers.py @@ -108,20 +108,49 @@ def preserve_morsel_with_coded_value(cookie: Morsel[str]) -> Morsel[str]: return mrsl_val -def _unquote(text: str) -> str: +_unquote_sub = re.compile(r"\\(?:([0-3][0-7][0-7])|(.))").sub + + +def _unquote_replace(m: re.Match[str]) -> str: + """ + Replace function for _unquote_sub regex substitution. + + Handles escaped characters in cookie values: + - Octal sequences are converted to their character representation + - Other escaped characters are unescaped by removing the backslash + """ + if m[1]: + return chr(int(m[1], 8)) + return m[2] + + +def _unquote(value: str) -> str: """ Unquote a cookie value. Vendored from http.cookies._unquote to ensure compatibility.
+ + Note: The original implementation checked for None, but we've removed + that check since all callers already ensure the value is not None. """ - # If there are no quotes, return as-is - if len(text) < 2 or text[0] != '"' or text[-1] != '"': - return text - # Remove quotes and handle escaped characters - text = text[1:-1] - # Replace escaped quotes and backslashes - text = text.replace('\\"', '"').replace("\\\\", "\\") - return text + # If there aren't any doublequotes, + # then there can't be any special characters. See RFC 2109. + if len(value) < 2: + return value + if value[0] != '"' or value[-1] != '"': + return value + + # We have to assume that we must decode this string. + # Down to work. + + # Remove the "s + value = value[1:-1] + + # Check for special sequences. Examples: + # \012 --> \n + # \" --> " + # + return _unquote_sub(_unquote_replace, value) def parse_cookie_headers(headers: Sequence[str]) -> List[Tuple[str, Morsel[str]]]: diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt index ff8bfb8b508..79a2b5075f5 100644 --- a/docs/spelling_wordlist.txt +++ b/docs/spelling_wordlist.txt @@ -363,6 +363,7 @@ uvloop uWSGI vcvarsall vendored +vendoring waituntil wakeup wakeups diff --git a/tests/test_cookie_helpers.py b/tests/test_cookie_helpers.py index 7a2ac7493ee..41e3eed8085 100644 --- a/tests/test_cookie_helpers.py +++ b/tests/test_cookie_helpers.py @@ -1,11 +1,17 @@ """Tests for internal cookie helper functions.""" -from http.cookies import CookieError, Morsel, SimpleCookie +from http.cookies import ( + CookieError, + Morsel, + SimpleCookie, + _unquote as simplecookie_unquote, +) import pytest from aiohttp import _cookie_helpers as helpers from aiohttp._cookie_helpers import ( + _unquote, parse_cookie_headers, preserve_morsel_with_coded_value, ) @@ -1029,3 +1035,235 @@ def test_parse_cookie_headers_date_formats_with_attributes() -> None: assert result[1][1]["expires"] == "Wednesday, 09-Jun-30 10:18:14 GMT" assert result[1][1]["domain"] 
== ".example.com" assert result[1][1]["samesite"] == "Strict" + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # Unquoted strings should remain unchanged + ("simple", "simple"), + ("with spaces", "with spaces"), + ("", ""), + ('"', '"'), # String too short to be quoted + ('some"text', 'some"text'), # Quotes not at beginning/end + ('text"with"quotes', 'text"with"quotes'), + ], +) +def test_unquote_basic(input_str: str, expected: str) -> None: + """Test basic _unquote functionality.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # Basic quoted strings + ('"quoted"', "quoted"), + ('"with spaces"', "with spaces"), + ('""', ""), # Empty quoted string + # Quoted string with special characters + ('"hello, world!"', "hello, world!"), + ('"path=/test"', "path=/test"), + ], +) +def test_unquote_quoted_strings(input_str: str, expected: str) -> None: + """Test _unquote with quoted strings.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # Escaped quotes should be unescaped + (r'"say \"hello\""', 'say "hello"'), + (r'"nested \"quotes\" here"', 'nested "quotes" here'), + # Multiple escaped quotes + (r'"\"start\" middle \"end\""', '"start" middle "end"'), + ], +) +def test_unquote_escaped_quotes(input_str: str, expected: str) -> None: + """Test _unquote with escaped quotes.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # Single escaped backslash + (r'"path\\to\\file"', "path\\to\\file"), + # Backslash before quote + (r'"end with slash\\"', "end with slash\\"), + # Mixed escaped characters + (r'"path\\to\\\"file\""', 'path\\to\\"file"'), + ], +) +def test_unquote_escaped_backslashes(input_str: str, expected: str) -> None: + """Test _unquote with escaped backslashes.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # 
Common octal sequences + (r'"\012"', "\n"), # newline + (r'"\011"', "\t"), # tab + (r'"\015"', "\r"), # carriage return + (r'"\040"', " "), # space + # Octal sequences in context + (r'"line1\012line2"', "line1\nline2"), + (r'"tab\011separated"', "tab\tseparated"), + # Multiple octal sequences + (r'"\012\011\015"', "\n\t\r"), + # Mixed octal and regular text + (r'"hello\040world\041"', "hello world!"), + ], +) +def test_unquote_octal_sequences(input_str: str, expected: str) -> None: + """Test _unquote with octal escape sequences.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # Test boundary values + (r'"\000"', "\x00"), # null character + (r'"\001"', "\x01"), + (r'"\177"', "\x7f"), # DEL character + (r'"\200"', "\x80"), # Extended ASCII + (r'"\377"', "\xff"), # Max octal value + # Invalid octal sequences (not 3 digits or > 377) are treated as regular escapes + (r'"\400"', "400"), # 400 octal = 256 decimal, too large + (r'"\777"', "777"), # 777 octal = 511 decimal, too large + ], +) +def test_unquote_octal_full_range(input_str: str, expected: str) -> None: + """Test _unquote with full range of valid octal sequences.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # Mix of quotes, backslashes, and octal + (r'"say \"hello\"\012new line"', 'say "hello"\nnew line'), + (r'"path\\to\\file\011\011data"', "path\\to\\file\t\tdata"), + # Complex mixed example + (r'"\042quoted\042 and \134backslash\134"', '"quoted" and \\backslash\\'), + # Escaped characters that aren't special + (r'"\a\b\c"', "abc"), # \a, \b, \c -> a, b, c + ], +) +def test_unquote_mixed_escapes(input_str: str, expected: str) -> None: + """Test _unquote with mixed escape sequences.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # String that starts with quote but doesn't end with one + ('"not closed', '"not closed'), + # 
String that ends with quote but doesn't start with one + ('not opened"', 'not opened"'), + # Multiple quotes + ('"""', '"'), + ('""""', '""'), + # Backslash at the end without anything to escape + (r'"ends with\"', "ends with\\"), + # Empty escape + (r'"test\"', "test\\"), + # Just escaped characters + (r'"\"\"\""', '"""'), + ], +) +def test_unquote_edge_cases(input_str: str, expected: str) -> None: + """Test _unquote edge cases.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + ("input_str", "expected"), + [ + # JSON-like data + (r'"{\"user\":\"john\",\"id\":123}"', '{"user":"john","id":123}'), + # URL-encoded then quoted + ('"hello%20world"', "hello%20world"), + # Path with backslashes (Windows-style) + (r'"C:\\Users\\John\\Documents"', "C:\\Users\\John\\Documents"), + # Complex session data + ( + r'"session_data=\"user123\";expires=2024"', + 'session_data="user123";expires=2024', + ), + ], +) +def test_unquote_real_world_examples(input_str: str, expected: str) -> None: + """Test _unquote with real-world cookie value examples.""" + assert _unquote(input_str) == expected + + +@pytest.mark.parametrize( + "test_value", + [ + '""', + '"simple"', + r'"with \"quotes\""', + r'"with \\backslash\\"', + r'"\012newline"', + r'"complex\042quote\134slash\012"', + '"not-quoted', + 'also-not-quoted"', + r'"mixed\011\042\134test"', + ], +) +def test_unquote_compatibility_with_simplecookie(test_value: str) -> None: + """Test that _unquote behaves like SimpleCookie's unquoting.""" + assert _unquote(test_value) == simplecookie_unquote(test_value), ( + f"Mismatch for {test_value!r}: " + f"our={_unquote(test_value)!r}, " + f"SimpleCookie={simplecookie_unquote(test_value)!r}" + ) + + +@pytest.mark.parametrize( + ("header", "expected_name", "expected_value", "expected_coded"), + [ + # Test cookie values with octal escape sequences + (r'name="\012newline\012"', "name", "\nnewline\n", r'"\012newline\012"'), + ( + r'tab="\011separated\011values"', + "tab", + 
"\tseparated\tvalues", + r'"\011separated\011values"', + ), + ( + r'mixed="hello\040world\041"', + "mixed", + "hello world!", + r'"hello\040world\041"', + ), + ( + r'complex="\042quoted\042 text with \012 newline"', + "complex", + '"quoted" text with \n newline', + r'"\042quoted\042 text with \012 newline"', + ), + ], +) +def test_parse_cookie_headers_uses_unquote_with_octal( + header: str, expected_name: str, expected_value: str, expected_coded: str +) -> None: + """Test that parse_cookie_headers correctly unquotes values with octal sequences and preserves coded_value.""" + result = parse_cookie_headers([header]) + + assert len(result) == 1 + name, morsel = result[0] + + # Check that octal sequences were properly decoded in the value + assert name == expected_name + assert morsel.value == expected_value + + # Check that coded_value preserves the original quoted string + assert morsel.coded_value == expected_coded From 915338c7b02d843114e05a94508e6d91921e88d7 Mon Sep 17 00:00:00 2001 From: "J. Nick Koston" Date: Mon, 9 Jun 2025 22:55:48 -0500 Subject: [PATCH 8/8] Fix cookie header parser ignoring reserved names (#11178) Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> --- CHANGES/11178.bugfix.rst | 1 + aiohttp/_cookie_helpers.py | 63 +++- aiohttp/abc.py | 4 +- aiohttp/client_reqrep.py | 14 +- aiohttp/web_request.py | 9 +- tests/test_cookie_helpers.py | 617 ++++++++++++++++++++++++++--------- tests/test_web_request.py | 26 +- 7 files changed, 556 insertions(+), 178 deletions(-) create mode 100644 CHANGES/11178.bugfix.rst diff --git a/CHANGES/11178.bugfix.rst b/CHANGES/11178.bugfix.rst new file mode 100644 index 00000000000..dc74cddde06 --- /dev/null +++ b/CHANGES/11178.bugfix.rst @@ -0,0 +1 @@ +Fixed ``Cookie`` header parsing to treat attribute names as regular cookies per :rfc:`6265#section-5.4` -- by :user:`bdraco`. 
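As a quick illustration of the RFC 6265 Section 5.4 rule this changelog entry describes — every semicolon-separated pair in a `Cookie` header is an ordinary cookie, even when its name collides with an attribute name such as `path` or `secure` — here is a minimal splitting sketch. It is illustrative only: the patch's actual `parse_cookie_header` is regex-based and additionally handles quoted values containing semicolons.

```python
from typing import List, Tuple


def split_cookie_header(header: str) -> List[Tuple[str, str]]:
    """Naive RFC 6265 Section 5.4 split: every name=value pair is a cookie."""
    pairs: List[Tuple[str, str]] = []
    for part in header.split(";"):
        part = part.strip()
        if not part:
            continue
        name, sep, value = part.partition("=")
        # Per the RFC algorithm, pieces without "=" are dropped entirely.
        if sep:
            pairs.append((name.strip(), value.strip()))
    return pairs
```

Note that `path` and `secure` come back as regular cookies here, which is exactly the behavior the fix restores for `Cookie` (request) headers.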
diff --git a/aiohttp/_cookie_helpers.py b/aiohttp/_cookie_helpers.py index a5b4f81c78f..4e9fc968814 100644 --- a/aiohttp/_cookie_helpers.py +++ b/aiohttp/_cookie_helpers.py @@ -12,7 +12,11 @@ from .log import internal_logger -__all__ = ("parse_cookie_headers", "preserve_morsel_with_coded_value") +__all__ = ( + "parse_set_cookie_headers", + "parse_cookie_header", + "preserve_morsel_with_coded_value", +) # Cookie parsing constants # Allow more characters in cookie names to handle real-world cookies @@ -153,7 +157,62 @@ def _unquote(value: str) -> str: return _unquote_sub(_unquote_replace, value) -def parse_cookie_headers(headers: Sequence[str]) -> List[Tuple[str, Morsel[str]]]: +def parse_cookie_header(header: str) -> List[Tuple[str, Morsel[str]]]: + """ + Parse a Cookie header according to RFC 6265 Section 5.4. + + Cookie headers contain only name-value pairs separated by semicolons. + There are no attributes in Cookie headers - even names that match + attribute names (like 'path' or 'secure') should be treated as cookies. + + This parser uses the same regex-based approach as parse_set_cookie_headers + to properly handle quoted values that may contain semicolons. 
+ + Args: + header: The Cookie header value to parse + + Returns: + List of (name, Morsel) tuples for compatibility with SimpleCookie.update() + """ + if not header: + return [] + + cookies: List[Tuple[str, Morsel[str]]] = [] + i = 0 + n = len(header) + + while i < n: + # Use the same pattern as parse_set_cookie_headers to find cookies + match = _COOKIE_PATTERN.match(header, i) + if not match: + break + + key = match.group("key") + value = match.group("val") or "" + i = match.end(0) + + # Validate the name + if not key or not _COOKIE_NAME_RE.match(key): + internal_logger.warning("Can not load cookie: Illegal cookie name %r", key) + continue + + # Create new morsel + morsel: Morsel[str] = Morsel() + # Preserve the original value as coded_value (with quotes if present) + # We use __setstate__ instead of the public set() API because it allows us to + # bypass validation and set already validated state. This is more stable than + # setting protected attributes directly and unlikely to change since it would + # break pickling. + morsel.__setstate__( # type: ignore[attr-defined] + {"key": key, "value": _unquote(value), "coded_value": value} + ) + + cookies.append((key, morsel)) + + return cookies + + +def parse_set_cookie_headers(headers: Sequence[str]) -> List[Tuple[str, Morsel[str]]]: """ Parse cookie headers using a vendored version of SimpleCookie parsing. 
diff --git a/aiohttp/abc.py b/aiohttp/abc.py index 0f396414a8e..f8a8442a7b4 100644 --- a/aiohttp/abc.py +++ b/aiohttp/abc.py @@ -22,7 +22,7 @@ from multidict import CIMultiDict from yarl import URL -from ._cookie_helpers import parse_cookie_headers +from ._cookie_helpers import parse_set_cookie_headers from .typedefs import LooseCookies if TYPE_CHECKING: @@ -194,7 +194,7 @@ def update_cookies_from_headers( self, headers: Sequence[str], response_url: URL ) -> None: """Update cookies from raw Set-Cookie headers.""" - if headers and (cookies_to_update := parse_cookie_headers(headers)): + if headers and (cookies_to_update := parse_set_cookie_headers(headers)): self.update_cookies(cookies_to_update, response_url) @abstractmethod diff --git a/aiohttp/client_reqrep.py b/aiohttp/client_reqrep.py index 88f81326215..aa5b220fe48 100644 --- a/aiohttp/client_reqrep.py +++ b/aiohttp/client_reqrep.py @@ -30,7 +30,11 @@ from yarl import URL from . import hdrs, helpers, http, multipart, payload -from ._cookie_helpers import parse_cookie_headers, preserve_morsel_with_coded_value +from ._cookie_helpers import ( + parse_cookie_header, + parse_set_cookie_headers, + preserve_morsel_with_coded_value, +) from .abc import AbstractStreamWriter from .client_exceptions import ( ClientConnectionError, @@ -313,9 +317,9 @@ def cookies(self) -> SimpleCookie: if self._raw_cookie_headers is not None: # Parse cookies for response.cookies (SimpleCookie for backward compatibility) cookies = SimpleCookie() - # Use parse_cookie_headers for more lenient parsing that handles + # Use parse_set_cookie_headers for more lenient parsing that handles # malformed cookies better than SimpleCookie.load - cookies.update(parse_cookie_headers(self._raw_cookie_headers)) + cookies.update(parse_set_cookie_headers(self._raw_cookie_headers)) self._cookies = cookies else: self._cookies = SimpleCookie() @@ -1014,8 +1018,8 @@ def update_cookies(self, cookies: Optional[LooseCookies]) -> None: c = SimpleCookie() if hdrs.COOKIE 
in self.headers: - # parse_cookie_headers already preserves coded values - c.update(parse_cookie_headers((self.headers.get(hdrs.COOKIE, ""),))) + # parse_cookie_header for RFC 6265 compliant Cookie header parsing + c.update(parse_cookie_header(self.headers.get(hdrs.COOKIE, ""))) del self.headers[hdrs.COOKIE] if isinstance(cookies, Mapping): diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py index dfd5a530e3b..5f0317954d5 100644 --- a/aiohttp/web_request.py +++ b/aiohttp/web_request.py @@ -28,7 +28,7 @@ from yarl import URL from . import hdrs -from ._cookie_helpers import parse_cookie_headers +from ._cookie_helpers import parse_cookie_header from .abc import AbstractStreamWriter from .helpers import ( _SENTINEL, @@ -556,9 +556,10 @@ def cookies(self) -> Mapping[str, str]: A read-only dictionary-like object. """ - # Use parse_cookie_headers for more lenient parsing that accepts - # special characters in cookie names (fixes #2683) - parsed = parse_cookie_headers((self.headers.get(hdrs.COOKIE, ""),)) + # Use parse_cookie_header for RFC 6265 compliant Cookie header parsing + # that accepts special characters in cookie names (fixes #2683) + parsed = parse_cookie_header(self.headers.get(hdrs.COOKIE, "")) + # Extract values from Morsel objects return MappingProxyType({name: morsel.value for name, morsel in parsed}) @reify diff --git a/tests/test_cookie_helpers.py b/tests/test_cookie_helpers.py index 41e3eed8085..6deef6544c2 100644 --- a/tests/test_cookie_helpers.py +++ b/tests/test_cookie_helpers.py @@ -12,7 +12,8 @@ from aiohttp import _cookie_helpers as helpers from aiohttp._cookie_helpers import ( _unquote, - parse_cookie_headers, + parse_cookie_header, + parse_set_cookie_headers, preserve_morsel_with_coded_value, ) @@ -69,11 +70,11 @@ def test_preserve_morsel_with_coded_value_no_coded_value() -> None: assert result.coded_value == "simple_value" -def test_parse_cookie_headers_simple() -> None: - """Test parse_cookie_headers with simple cookies.""" +def 
test_parse_set_cookie_headers_simple() -> None: + """Test parse_set_cookie_headers with simple cookies.""" headers = ["name=value", "session=abc123"] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 2 assert result[0][0] == "name" @@ -84,14 +85,14 @@ def test_parse_cookie_headers_simple() -> None: assert result[1][1].value == "abc123" -def test_parse_cookie_headers_with_attributes() -> None: - """Test parse_cookie_headers with cookie attributes.""" +def test_parse_set_cookie_headers_with_attributes() -> None: + """Test parse_set_cookie_headers with cookie attributes.""" headers = [ "sessionid=value123; Path=/; HttpOnly; Secure", "user=john; Domain=.example.com; Max-Age=3600", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 2 @@ -111,8 +112,8 @@ def test_parse_cookie_headers_with_attributes() -> None: assert morsel2["max-age"] == "3600" -def test_parse_cookie_headers_special_chars_in_names() -> None: - """Test parse_cookie_headers accepts special characters in names (#2683).""" +def test_parse_set_cookie_headers_special_chars_in_names() -> None: + """Test parse_set_cookie_headers accepts special characters in names (#2683).""" # These should be accepted with relaxed validation headers = [ "ISAWPLB{A7F52349-3531-4DA9-8776-F74BC6F4F1BB}=value1", @@ -122,7 +123,7 @@ def test_parse_cookie_headers_special_chars_in_names() -> None: "cookie@domain=value5", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 5 expected_names = [ @@ -139,8 +140,8 @@ def test_parse_cookie_headers_special_chars_in_names() -> None: assert morsel.value == f"value{i+1}" -def test_parse_cookie_headers_invalid_names() -> None: - """Test parse_cookie_headers rejects truly invalid cookie names.""" +def test_parse_set_cookie_headers_invalid_names() -> None: + """Test parse_set_cookie_headers rejects truly invalid cookie 
names.""" # These should be rejected even with relaxed validation headers = [ "invalid\tcookie=value", # Tab character @@ -150,14 +151,14 @@ def test_parse_cookie_headers_invalid_names() -> None: "name with spaces=value", # Spaces in name ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) # All should be skipped assert len(result) == 0 -def test_parse_cookie_headers_empty_and_invalid() -> None: - """Test parse_cookie_headers handles empty and invalid formats.""" +def test_parse_set_cookie_headers_empty_and_invalid() -> None: + """Test parse_set_cookie_headers handles empty and invalid formats.""" headers = [ "", # Empty header " ", # Whitespace only @@ -168,7 +169,7 @@ def test_parse_cookie_headers_empty_and_invalid() -> None: "Domain=.com", # Reserved attribute as name (should be skipped) ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) # Only "name=" should be accepted assert len(result) == 1 @@ -176,15 +177,15 @@ def test_parse_cookie_headers_empty_and_invalid() -> None: assert result[0][1].value == "" -def test_parse_cookie_headers_quoted_values() -> None: - """Test parse_cookie_headers handles quoted values correctly.""" +def test_parse_set_cookie_headers_quoted_values() -> None: + """Test parse_set_cookie_headers handles quoted values correctly.""" headers = [ 'name="quoted value"', 'session="with;semicolon"', 'data="with\\"escaped\\""', ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 3 assert result[0][1].value == "quoted value" @@ -200,7 +201,7 @@ def test_parse_cookie_headers_quoted_values() -> None: 'complex="a=b;c=d"; simple=value', ], ) -def test_parse_cookie_headers_semicolon_in_quoted_values(header: str) -> None: +def test_parse_set_cookie_headers_semicolon_in_quoted_values(header: str) -> None: """ Test that semicolons inside properly quoted values are handled correctly. 
@@ -212,7 +213,7 @@ def test_parse_cookie_headers_semicolon_in_quoted_values(header: str) -> None: sc.load(header) # Test with our parser - result = parse_cookie_headers([header]) + result = parse_set_cookie_headers([header]) # Should parse the same number of cookies assert len(result) == len(sc) @@ -223,12 +224,12 @@ def test_parse_cookie_headers_semicolon_in_quoted_values(header: str) -> None: assert morsel.value == sc_morsel.value -def test_parse_cookie_headers_multiple_cookies_same_header() -> None: - """Test parse_cookie_headers with multiple cookies in one header.""" +def test_parse_set_cookie_headers_multiple_cookies_same_header() -> None: + """Test parse_set_cookie_headers with multiple cookies in one header.""" # Note: SimpleCookie includes the comma as part of the first cookie's value headers = ["cookie1=value1, cookie2=value2"] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) # Should parse as two separate cookies assert len(result) == 2 @@ -253,14 +254,14 @@ def test_parse_cookie_headers_multiple_cookies_same_header() -> None: "complex=value; Domain=.example.com; Path=/app; Max-Age=3600", ], ) -def test_parse_cookie_headers_compatibility_with_simple_cookie(header: str) -> None: - """Test parse_cookie_headers is bug-for-bug compatible with SimpleCookie.load.""" +def test_parse_set_cookie_headers_compatibility_with_simple_cookie(header: str) -> None: + """Test parse_set_cookie_headers is bug-for-bug compatible with SimpleCookie.load.""" # Parse with SimpleCookie sc = SimpleCookie() sc.load(header) # Parse with our function - result = parse_cookie_headers([header]) + result = parse_set_cookie_headers([header]) # Should have same number of cookies assert len(result) == len(sc) @@ -286,8 +287,8 @@ def test_parse_cookie_headers_compatibility_with_simple_cookie(header: str) -> N assert morsel.get(bool_attr) is True -def test_parse_cookie_headers_relaxed_validation_differences() -> None: - """Test where parse_cookie_headers 
differs from SimpleCookie (relaxed validation).""" +def test_parse_set_cookie_headers_relaxed_validation_differences() -> None: + """Test where parse_set_cookie_headers differs from SimpleCookie (relaxed validation).""" # Test cookies that SimpleCookie rejects with CookieError rejected_by_simplecookie = [ ("cookie{with}braces=value1", "cookie{with}braces", "value1"), @@ -302,7 +303,7 @@ def test_parse_cookie_headers_relaxed_validation_differences() -> None: sc.load(header) # Our parser should accept them - result = parse_cookie_headers([header]) + result = parse_set_cookie_headers([header]) assert len(result) == 1 # We accept assert result[0][0] == expected_name assert result[0][1].value == expected_value @@ -320,20 +321,20 @@ def test_parse_cookie_headers_relaxed_validation_differences() -> None: # May or may not parse correctly in SimpleCookie # Our parser should accept them consistently - result = parse_cookie_headers([header]) + result = parse_set_cookie_headers([header]) assert len(result) == 1 assert result[0][0] == expected_name assert result[0][1].value == expected_value -def test_parse_cookie_headers_case_insensitive_attrs() -> None: +def test_parse_set_cookie_headers_case_insensitive_attrs() -> None: """Test that known attributes are handled case-insensitively.""" headers = [ "cookie1=value1; PATH=/test; DOMAIN=example.com", "cookie2=value2; Secure; HTTPONLY; max-AGE=60", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 2 @@ -347,13 +348,13 @@ def test_parse_cookie_headers_case_insensitive_attrs() -> None: assert result[1][1]["max-age"] == "60" -def test_parse_cookie_headers_unknown_attrs_ignored() -> None: +def test_parse_set_cookie_headers_unknown_attrs_ignored() -> None: """Test that unknown attributes are treated as new cookies (same as SimpleCookie).""" headers = [ "cookie=value; Path=/; unknownattr=ignored; HttpOnly", ] - result = parse_cookie_headers(headers) + result = 
parse_set_cookie_headers(headers) # SimpleCookie treats unknown attributes with values as new cookies assert len(result) == 2 @@ -369,8 +370,8 @@ def test_parse_cookie_headers_unknown_attrs_ignored() -> None: assert result[1][1]["httponly"] is True # HttpOnly applies to this cookie -def test_parse_cookie_headers_complex_real_world() -> None: - """Test parse_cookie_headers with complex real-world examples.""" +def test_parse_set_cookie_headers_complex_real_world() -> None: + """Test parse_set_cookie_headers with complex real-world examples.""" headers = [ # AWS ELB cookie "AWSELB=ABCDEF1234567890ABCDEF1234567890ABCDEF1234567890; Path=/", @@ -380,7 +381,7 @@ def test_parse_cookie_headers_complex_real_world() -> None: "session_id=s%3AabcXYZ123.signature123; Path=/; Secure; HttpOnly; SameSite=Strict", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 3 @@ -396,7 +397,7 @@ def test_parse_cookie_headers_complex_real_world() -> None: assert session_morsel.get("samesite") == "Strict" -def test_parse_cookie_headers_boolean_attrs() -> None: +def test_parse_set_cookie_headers_boolean_attrs() -> None: """Test that boolean attributes (secure, httponly) work correctly.""" # Test secure attribute variations headers = [ @@ -405,7 +406,7 @@ def test_parse_cookie_headers_boolean_attrs() -> None: "cookie3=value3; Secure=true", # Non-standard but might occur ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 3 # All should have secure=True @@ -418,7 +419,7 @@ def test_parse_cookie_headers_boolean_attrs() -> None: "cookie5=value5; HttpOnly=", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 2 # All should have httponly=True @@ -426,7 +427,7 @@ def test_parse_cookie_headers_boolean_attrs() -> None: assert morsel.get("httponly") is True, f"{name} should have httponly=True" -def 
test_parse_cookie_headers_boolean_attrs_with_partitioned() -> None: +def test_parse_set_cookie_headers_boolean_attrs_with_partitioned() -> None: """Test that boolean attributes including partitioned work correctly.""" # Test secure attribute variations secure_headers = [ @@ -435,7 +436,7 @@ def test_parse_cookie_headers_boolean_attrs_with_partitioned() -> None: "cookie3=value3; Secure=true", # Non-standard but might occur ] - result = parse_cookie_headers(secure_headers) + result = parse_set_cookie_headers(secure_headers) assert len(result) == 3 for name, morsel in result: assert morsel.get("secure") is True, f"{name} should have secure=True" @@ -446,7 +447,7 @@ def test_parse_cookie_headers_boolean_attrs_with_partitioned() -> None: "cookie5=value5; HttpOnly=", ] - result = parse_cookie_headers(httponly_headers) + result = parse_set_cookie_headers(httponly_headers) assert len(result) == 2 for name, morsel in result: assert morsel.get("httponly") is True, f"{name} should have httponly=True" @@ -458,21 +459,21 @@ def test_parse_cookie_headers_boolean_attrs_with_partitioned() -> None: "cookie8=value8; Partitioned=yes", # Non-standard but might occur ] - result = parse_cookie_headers(partitioned_headers) + result = parse_set_cookie_headers(partitioned_headers) assert len(result) == 3 for name, morsel in result: assert morsel.get("partitioned") is True, f"{name} should have partitioned=True" -def test_parse_cookie_headers_encoded_values() -> None: - """Test that parse_cookie_headers preserves encoded values.""" +def test_parse_set_cookie_headers_encoded_values() -> None: + """Test that parse_set_cookie_headers preserves encoded values.""" headers = [ "encoded=hello%20world", "url=https%3A%2F%2Fexample.com%2Fpath", "special=%21%40%23%24%25%5E%26*%28%29", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 3 # Values should be preserved as-is (not decoded) @@ -481,9 +482,9 @@ def 
test_parse_cookie_headers_encoded_values() -> None: assert result[2][1].value == "%21%40%23%24%25%5E%26*%28%29" -def test_parse_cookie_headers_partitioned() -> None: +def test_parse_set_cookie_headers_partitioned() -> None: """ - Test that parse_cookie_headers handles partitioned attribute correctly. + Test that parse_set_cookie_headers handles partitioned attribute correctly. This tests the fix for issue #10380 - partitioned cookies support. The partitioned attribute is a boolean flag like secure and httponly. @@ -496,7 +497,7 @@ def test_parse_cookie_headers_partitioned() -> None: "cookie5=value5; Domain=.example.com; Path=/; Partitioned", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 5 @@ -517,7 +518,7 @@ def test_parse_cookie_headers_partitioned() -> None: assert result[4][1].get("path") == "/" -def test_parse_cookie_headers_partitioned_case_insensitive() -> None: +def test_parse_set_cookie_headers_partitioned_case_insensitive() -> None: """Test that partitioned attribute is recognized case-insensitively.""" headers = [ "cookie1=value1; partitioned", # lowercase @@ -526,7 +527,7 @@ def test_parse_cookie_headers_partitioned_case_insensitive() -> None: "cookie4=value4; PaRtItIoNeD", # mixed case ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 4 @@ -537,14 +538,14 @@ def test_parse_cookie_headers_partitioned_case_insensitive() -> None: ), f"Cookie {i+1} should have partitioned=True" -def test_parse_cookie_headers_partitioned_not_set() -> None: +def test_parse_set_cookie_headers_partitioned_not_set() -> None: """Test that cookies without partitioned attribute don't have it set.""" headers = [ "normal=value; Secure; HttpOnly", "regular=cookie; Path=/", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 2 @@ -554,7 +555,7 @@ def test_parse_cookie_headers_partitioned_not_set() -> None: 
# Tests that don't require partitioned support in SimpleCookie -def test_parse_cookie_headers_partitioned_with_other_attrs_manual() -> None: +def test_parse_set_cookie_headers_partitioned_with_other_attrs_manual() -> None: """ Test parsing logic for partitioned cookies combined with all other attributes. @@ -567,7 +568,7 @@ def test_parse_cookie_headers_partitioned_with_other_attrs_manual() -> None: # Test a simple case that won't trigger SimpleCookie errors headers = ["session=abc123; Secure; HttpOnly"] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 1 assert result[0][0] == "session" @@ -601,7 +602,7 @@ def test_cookie_pattern_matches_partitioned_attribute(test_string: str) -> None: assert match.group("key").lower() == "partitioned" -def test_parse_cookie_headers_issue_7993_double_quotes() -> None: +def test_parse_set_cookie_headers_issue_7993_double_quotes() -> None: """ Test that cookies with unmatched opening quotes don't break parsing of subsequent cookies. 
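Issue #7993 concerns a value with an unmatched opening quote (`baz="qux`) aborting parsing of every cookie after it. The recovery behavior these tests assert is plain pair-wise splitting that keeps the dangling quote as part of the value. A simplified sketch of that splitting (not aiohttp's actual parser, which is regex-based):

```python
def split_cookie_pairs(header):
    """Split a cookie string into (name, value) pairs, tolerating an
    unmatched opening quote instead of aborting at the bad value."""
    pairs = []
    for chunk in header.split(";"):
        chunk = chunk.strip()
        if not chunk:
            continue
        name, _, value = chunk.partition("=")
        pairs.append((name.strip(), value.strip()))
    return pairs


# The header from issue #7993: baz's value has an unmatched quote.
print(split_cookie_pairs('foo=bar; baz="qux; foo2=bar2'))
```

Note the sketch's limitation: a properly quoted value containing a semicolon (e.g. `session="abc;xyz"`) would be split in two here, whereas the real parser's quote-aware regex keeps it intact, as `test_parse_cookie_header_complex_quoted` below verifies.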
@@ -614,7 +615,7 @@ def test_parse_cookie_headers_issue_7993_double_quotes() -> None: # Test case from the issue headers = ['foo=bar; baz="qux; foo2=bar2'] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) # Should parse all cookies correctly assert len(result) == 3 @@ -626,41 +627,41 @@ def test_parse_cookie_headers_issue_7993_double_quotes() -> None: assert result[2][1].value == "bar2" -def test_parse_cookie_headers_empty_headers() -> None: +def test_parse_set_cookie_headers_empty_headers() -> None: """Test handling of empty headers in the sequence.""" # Empty header should be skipped - result = parse_cookie_headers(["", "name=value"]) + result = parse_set_cookie_headers(["", "name=value"]) assert len(result) == 1 assert result[0][0] == "name" assert result[0][1].value == "value" # Multiple empty headers - result = parse_cookie_headers(["", "", ""]) + result = parse_set_cookie_headers(["", "", ""]) assert result == [] # Empty headers mixed with valid cookies - result = parse_cookie_headers(["", "a=1", "", "b=2", ""]) + result = parse_set_cookie_headers(["", "a=1", "", "b=2", ""]) assert len(result) == 2 assert result[0][0] == "a" assert result[1][0] == "b" -def test_parse_cookie_headers_invalid_cookie_syntax() -> None: +def test_parse_set_cookie_headers_invalid_cookie_syntax() -> None: """Test handling of invalid cookie syntax.""" # No valid cookie pattern - result = parse_cookie_headers(["@#$%^&*()"]) + result = parse_set_cookie_headers(["@#$%^&*()"]) assert result == [] # Cookie name without value - result = parse_cookie_headers(["name"]) + result = parse_set_cookie_headers(["name"]) assert result == [] # Multiple invalid patterns - result = parse_cookie_headers(["!!!!", "????", "name", "@@@"]) + result = parse_set_cookie_headers(["!!!!", "????", "name", "@@@"]) assert result == [] -def test_parse_cookie_headers_illegal_cookie_names( +def test_parse_set_cookie_headers_illegal_cookie_names( caplog: pytest.LogCaptureFixture, ) 
-> None: """ @@ -671,103 +672,105 @@ def test_parse_cookie_headers_illegal_cookie_names( logged when illegal names appear after a valid cookie. """ # Cookie name that is a known attribute (illegal) - parsing stops early - result = parse_cookie_headers(["path=value; domain=test"]) + result = parse_set_cookie_headers(["path=value; domain=test"]) assert result == [] # Cookie name that doesn't match the pattern - result = parse_cookie_headers(["=value"]) + result = parse_set_cookie_headers(["=value"]) assert result == [] # Valid cookie after illegal one - parsing stops at illegal - result = parse_cookie_headers(["domain=bad; good=value"]) + result = parse_set_cookie_headers(["domain=bad; good=value"]) assert result == [] # Illegal cookie name that appears after a valid cookie triggers warning - result = parse_cookie_headers(["good=value; Path=/; invalid,cookie=value;"]) + result = parse_set_cookie_headers(["good=value; Path=/; invalid,cookie=value;"]) assert len(result) == 1 assert result[0][0] == "good" assert "Illegal cookie name 'invalid,cookie'" in caplog.text -def test_parse_cookie_headers_attributes_before_cookie() -> None: +def test_parse_set_cookie_headers_attributes_before_cookie() -> None: """Test that attributes before any cookie are invalid.""" # Path attribute before cookie - result = parse_cookie_headers(["Path=/; name=value"]) + result = parse_set_cookie_headers(["Path=/; name=value"]) assert result == [] # Domain attribute before cookie - result = parse_cookie_headers(["Domain=.example.com; name=value"]) + result = parse_set_cookie_headers(["Domain=.example.com; name=value"]) assert result == [] # Multiple attributes before cookie - result = parse_cookie_headers(["Path=/; Domain=.example.com; Secure; name=value"]) + result = parse_set_cookie_headers( + ["Path=/; Domain=.example.com; Secure; name=value"] + ) assert result == [] -def test_parse_cookie_headers_attributes_without_values() -> None: +def 
test_parse_set_cookie_headers_attributes_without_values() -> None: """Test handling of attributes with missing values.""" # Boolean attribute without value (valid) - result = parse_cookie_headers(["name=value; Secure"]) + result = parse_set_cookie_headers(["name=value; Secure"]) assert len(result) == 1 assert result[0][1]["secure"] is True # Non-boolean attribute without value (invalid, stops parsing) - result = parse_cookie_headers(["name=value; Path"]) + result = parse_set_cookie_headers(["name=value; Path"]) assert len(result) == 1 # Path without value stops further attribute parsing # Multiple cookies, invalid attribute in middle - result = parse_cookie_headers(["name=value; Path; Secure"]) + result = parse_set_cookie_headers(["name=value; Path; Secure"]) assert len(result) == 1 # Secure is not parsed because Path without value stops parsing -def test_parse_cookie_headers_dollar_prefixed_names() -> None: +def test_parse_set_cookie_headers_dollar_prefixed_names() -> None: """Test handling of cookie names starting with $.""" # $Version without preceding cookie (ignored) - result = parse_cookie_headers(["$Version=1; name=value"]) + result = parse_set_cookie_headers(["$Version=1; name=value"]) assert len(result) == 1 assert result[0][0] == "name" # Multiple $ prefixed without cookie (all ignored) - result = parse_cookie_headers(["$Version=1; $Path=/; $Domain=.com; name=value"]) + result = parse_set_cookie_headers(["$Version=1; $Path=/; $Domain=.com; name=value"]) assert len(result) == 1 assert result[0][0] == "name" # $ prefix at start is ignored, cookie follows - result = parse_cookie_headers(["$Unknown=123; valid=cookie"]) + result = parse_set_cookie_headers(["$Unknown=123; valid=cookie"]) assert len(result) == 1 assert result[0][0] == "valid" -def test_parse_cookie_headers_dollar_attributes() -> None: +def test_parse_set_cookie_headers_dollar_attributes() -> None: """Test handling of $ prefixed attributes after cookies.""" # Test multiple $ attributes with 
cookie (case-insensitive like SimpleCookie) - result = parse_cookie_headers(["name=value; $Path=/test; $Domain=.example.com"]) + result = parse_set_cookie_headers(["name=value; $Path=/test; $Domain=.example.com"]) assert len(result) == 1 assert result[0][0] == "name" assert result[0][1]["path"] == "/test" assert result[0][1]["domain"] == ".example.com" # Test unknown $ attribute (should be ignored) - result = parse_cookie_headers(["name=value; $Unknown=test"]) + result = parse_set_cookie_headers(["name=value; $Unknown=test"]) assert len(result) == 1 assert result[0][0] == "name" # $Unknown should not be set # Test $ attribute with empty value - result = parse_cookie_headers(["name=value; $Path="]) + result = parse_set_cookie_headers(["name=value; $Path="]) assert len(result) == 1 assert result[0][1]["path"] == "" # Test case sensitivity compatibility with SimpleCookie - result = parse_cookie_headers(["test=value; $path=/lower; $PATH=/upper"]) + result = parse_set_cookie_headers(["test=value; $path=/lower; $PATH=/upper"]) assert len(result) == 1 # Last one wins, and it's case-insensitive assert result[0][1]["path"] == "/upper" -def test_parse_cookie_headers_attributes_after_illegal_cookie() -> None: +def test_parse_set_cookie_headers_attributes_after_illegal_cookie() -> None: """ Test that attributes after an illegal cookie name are handled correctly. @@ -775,25 +778,25 @@ def test_parse_cookie_headers_attributes_after_illegal_cookie() -> None: cookie name was encountered. 
""" # Illegal cookie followed by $ attribute - result = parse_cookie_headers(["good=value; invalid,cookie=bad; $Path=/test"]) + result = parse_set_cookie_headers(["good=value; invalid,cookie=bad; $Path=/test"]) assert len(result) == 1 assert result[0][0] == "good" # $Path should be ignored since current_morsel is None after illegal cookie # Illegal cookie followed by boolean attribute - result = parse_cookie_headers(["good=value; invalid,cookie=bad; HttpOnly"]) + result = parse_set_cookie_headers(["good=value; invalid,cookie=bad; HttpOnly"]) assert len(result) == 1 assert result[0][0] == "good" # HttpOnly should be ignored since current_morsel is None # Illegal cookie followed by regular attribute with value - result = parse_cookie_headers(["good=value; invalid,cookie=bad; Max-Age=3600"]) + result = parse_set_cookie_headers(["good=value; invalid,cookie=bad; Max-Age=3600"]) assert len(result) == 1 assert result[0][0] == "good" # Max-Age should be ignored since current_morsel is None # Multiple attributes after illegal cookie - result = parse_cookie_headers( + result = parse_set_cookie_headers( ["good=value; invalid,cookie=bad; $Path=/; HttpOnly; Max-Age=60; Domain=.com"] ) assert len(result) == 1 @@ -801,7 +804,7 @@ def test_parse_cookie_headers_attributes_after_illegal_cookie() -> None: # All attributes should be ignored after illegal cookie -def test_parse_cookie_headers_unmatched_quotes_compatibility() -> None: +def test_parse_set_cookie_headers_unmatched_quotes_compatibility() -> None: """ Test that most unmatched quote scenarios behave like SimpleCookie. 
@@ -823,7 +826,7 @@ def test_parse_cookie_headers_unmatched_quotes_compatibility() -> None: sc_cookies = list(sc.items()) # Test our parser behavior - result = parse_cookie_headers([header]) + result = parse_set_cookie_headers([header]) # Both should parse the same cookies (partial parsing) assert len(result) == len(sc_cookies), ( @@ -841,7 +844,7 @@ def test_parse_cookie_headers_unmatched_quotes_compatibility() -> None: assert len(sc) == 1 # Only cookie1 # Our parser handles it better - result = parse_cookie_headers([fixed_case]) + result = parse_set_cookie_headers([fixed_case]) assert len(result) == 3 # All three cookies assert result[0][0] == "cookie1" assert result[0][1].value == "value1" @@ -851,15 +854,15 @@ def test_parse_cookie_headers_unmatched_quotes_compatibility() -> None: assert result[2][1].value == "value3" -def test_parse_cookie_headers_expires_attribute() -> None: - """Test parse_cookie_headers handles expires attribute with date formats.""" +def test_parse_set_cookie_headers_expires_attribute() -> None: + """Test parse_set_cookie_headers handles expires attribute with date formats.""" headers = [ "session=abc; Expires=Wed, 09 Jun 2021 10:18:14 GMT", "user=xyz; expires=Wednesday, 09-Jun-21 10:18:14 GMT", "token=123; EXPIRES=Wed, 09 Jun 2021 10:18:14 GMT", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 3 for _, morsel in result: @@ -867,18 +870,18 @@ def test_parse_cookie_headers_expires_attribute() -> None: assert "GMT" in morsel["expires"] -def test_parse_cookie_headers_edge_cases() -> None: +def test_parse_set_cookie_headers_edge_cases() -> None: """Test various edge cases.""" # Very long cookie values long_value = "x" * 4096 - result = parse_cookie_headers([f"name={long_value}"]) + result = parse_set_cookie_headers([f"name={long_value}"]) assert len(result) == 1 assert result[0][1].value == long_value -def test_parse_cookie_headers_various_date_formats_issue_4327() -> None: +def 
test_parse_set_cookie_headers_various_date_formats_issue_4327() -> None: """ - Test that parse_cookie_headers handles various date formats per RFC 6265. + Test that parse_set_cookie_headers handles various date formats per RFC 6265. This tests the fix for issue #4327 - support for RFC 822, RFC 850, and ANSI C asctime() date formats in cookie expiration. @@ -899,7 +902,7 @@ def test_parse_cookie_headers_various_date_formats_issue_4327() -> None: "cookie7=value7; Expires=Tue, 01-Jan-30 00:00:00 GMT", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) # All cookies should be parsed assert len(result) == 7 @@ -923,7 +926,7 @@ def test_parse_cookie_headers_various_date_formats_issue_4327() -> None: assert morsel.get("expires") == exp_expires -def test_parse_cookie_headers_ansi_c_asctime_format() -> None: +def test_parse_set_cookie_headers_ansi_c_asctime_format() -> None: """ Test parsing of ANSI C asctime() format. @@ -932,7 +935,7 @@ def test_parse_cookie_headers_ansi_c_asctime_format() -> None: """ headers = ["cookie1=value1; Expires=Wed Jun 9 10:18:14 2021"] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) # Should parse correctly with the expires attribute preserved assert len(result) == 1 @@ -941,9 +944,9 @@ def test_parse_cookie_headers_ansi_c_asctime_format() -> None: assert result[0][1]["expires"] == "Wed Jun 9 10:18:14 2021" -def test_parse_cookie_headers_rfc2822_timezone_issue_4493() -> None: +def test_parse_set_cookie_headers_rfc2822_timezone_issue_4493() -> None: """ - Test that parse_cookie_headers handles RFC 2822 timezone formats. + Test that parse_set_cookie_headers handles RFC 2822 timezone formats. This tests the fix for issue #4493 - support for RFC 2822-compliant dates with timezone offsets like -0000, +0100, etc. 
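The date-format tests above cover the three classic shapes (RFC 822/1123, RFC 850, ANSI C asctime()) plus the RFC 2822 timezone offsets from issue #4493. A hedged stdlib-only sketch of multi-format `Expires` parsing (illustrative, not aiohttp's implementation):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# Date shapes named in RFC 2616 section 3.3.1 / RFC 6265 section 5.1.1.
_COOKIE_DATE_FORMATS = (
    "%a, %d %b %Y %H:%M:%S GMT",  # RFC 822 / RFC 1123
    "%A, %d-%b-%y %H:%M:%S GMT",  # RFC 850
    "%a %b %d %H:%M:%S %Y",       # ANSI C asctime()
)


def parse_expires(value):
    """Try the three classic cookie date formats, then fall back to
    RFC 2822 parsing for offsets like +0100; return None on failure."""
    for fmt in _COOKIE_DATE_FORMATS:
        try:
            return datetime.strptime(value, fmt).replace(tzinfo=timezone.utc)
        except ValueError:
            continue
    try:
        return parsedate_to_datetime(value)
    except (TypeError, ValueError):
        return None
```

One subtlety the fallback inherits from `email.utils`: an offset of `-0000` means "unknown zone" in RFC 2822 terms, so `parsedate_to_datetime` returns a naive datetime for it.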
@@ -960,7 +963,7 @@ def test_parse_cookie_headers_rfc2822_timezone_issue_4493() -> None: "classic=cookie; expires=Sat, 03 Apr 2026 12:00:00 GMT", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) # All cookies should be parsed assert len(result) == 4 @@ -983,14 +986,14 @@ def test_parse_cookie_headers_rfc2822_timezone_issue_4493() -> None: assert result[3][1]["expires"] == "Sat, 03 Apr 2026 12:00:00 GMT" -def test_parse_cookie_headers_rfc2822_with_attributes() -> None: +def test_parse_set_cookie_headers_rfc2822_with_attributes() -> None: """Test that RFC 2822 dates work correctly with other cookie attributes.""" headers = [ "session=abc123; expires=Wed, 15 Jan 2020 09:45:07 -0000; Path=/; HttpOnly; Secure", "token=xyz789; expires=Thu, 01 Feb 2024 14:30:00 +0100; Domain=.example.com; SameSite=Strict", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 2 @@ -1010,14 +1013,14 @@ def test_parse_cookie_headers_rfc2822_with_attributes() -> None: assert result[1][1]["samesite"] == "Strict" -def test_parse_cookie_headers_date_formats_with_attributes() -> None: +def test_parse_set_cookie_headers_date_formats_with_attributes() -> None: """Test that date formats work correctly with other cookie attributes.""" headers = [ "session=abc123; Expires=Wed, 09 Jun 2030 10:18:14 GMT; Path=/; HttpOnly; Secure", "token=xyz789; Expires=Wednesday, 09-Jun-30 10:18:14 GMT; Domain=.example.com; SameSite=Strict", ] - result = parse_cookie_headers(headers) + result = parse_set_cookie_headers(headers) assert len(result) == 2 @@ -1037,6 +1040,350 @@ def test_parse_cookie_headers_date_formats_with_attributes() -> None: assert result[1][1]["samesite"] == "Strict" +@pytest.mark.parametrize( + ("header", "expected_name", "expected_value", "expected_coded"), + [ + # Test cookie values with octal escape sequences + (r'name="\012newline\012"', "name", "\nnewline\n", r'"\012newline\012"'), + ( + 
r'tab="\011separated\011values"', + "tab", + "\tseparated\tvalues", + r'"\011separated\011values"', + ), + ( + r'mixed="hello\040world\041"', + "mixed", + "hello world!", + r'"hello\040world\041"', + ), + ( + r'complex="\042quoted\042 text with \012 newline"', + "complex", + '"quoted" text with \n newline', + r'"\042quoted\042 text with \012 newline"', + ), + ], +) +def test_parse_set_cookie_headers_uses_unquote_with_octal( + header: str, expected_name: str, expected_value: str, expected_coded: str +) -> None: + """Test that parse_set_cookie_headers correctly unquotes values with octal sequences and preserves coded_value.""" + result = parse_set_cookie_headers([header]) + + assert len(result) == 1 + name, morsel = result[0] + + # Check that octal sequences were properly decoded in the value + assert name == expected_name + assert morsel.value == expected_value + + # Check that coded_value preserves the original quoted string + assert morsel.coded_value == expected_coded + + +# Tests for parse_cookie_header (RFC 6265 compliant Cookie header parser) + + +def test_parse_cookie_header_simple() -> None: + """Test parse_cookie_header with simple cookies.""" + header = "name=value; session=abc123" + + result = parse_cookie_header(header) + + assert len(result) == 2 + assert result[0][0] == "name" + assert result[0][1].value == "value" + assert result[1][0] == "session" + assert result[1][1].value == "abc123" + + +def test_parse_cookie_header_empty() -> None: + """Test parse_cookie_header with empty header.""" + assert parse_cookie_header("") == [] + assert parse_cookie_header(" ") == [] + + +def test_parse_cookie_header_quoted_values() -> None: + """Test parse_cookie_header handles quoted values correctly.""" + header = 'name="quoted value"; session="with;semicolon"; data="with\\"escaped\\""' + + result = parse_cookie_header(header) + + assert len(result) == 3 + assert result[0][0] == "name" + assert result[0][1].value == "quoted value" + assert result[1][0] == "session" 
+ assert result[1][1].value == "with;semicolon" + assert result[2][0] == "data" + assert result[2][1].value == 'with"escaped"' + + +def test_parse_cookie_header_special_chars() -> None: + """Test parse_cookie_header accepts special characters in names.""" + header = ( + "ISAWPLB{A7F52349-3531-4DA9-8776-F74BC6F4F1BB}=value1; cookie[index]=value2" + ) + + result = parse_cookie_header(header) + + assert len(result) == 2 + assert result[0][0] == "ISAWPLB{A7F52349-3531-4DA9-8776-F74BC6F4F1BB}" + assert result[0][1].value == "value1" + assert result[1][0] == "cookie[index]" + assert result[1][1].value == "value2" + + +def test_parse_cookie_header_invalid_names() -> None: + """Test parse_cookie_header rejects invalid cookie names.""" + # Invalid names with control characters + header = "invalid\tcookie=value; valid=cookie; invalid\ncookie=bad" + + result = parse_cookie_header(header) + + # Parse_cookie_header uses same regex as parse_set_cookie_headers + # Tab and newline are treated as separators, not part of names + assert len(result) == 5 + assert result[0][0] == "invalid" + assert result[0][1].value == "" + assert result[1][0] == "cookie" + assert result[1][1].value == "value" + assert result[2][0] == "valid" + assert result[2][1].value == "cookie" + assert result[3][0] == "invalid" + assert result[3][1].value == "" + assert result[4][0] == "cookie" + assert result[4][1].value == "bad" + + +def test_parse_cookie_header_no_attributes() -> None: + """Test parse_cookie_header treats all pairs as cookies (no attributes).""" + # In Cookie headers, even reserved attribute names are treated as cookies + header = ( + "session=abc123; path=/test; domain=.example.com; secure=yes; httponly=true" + ) + + result = parse_cookie_header(header) + + assert len(result) == 5 + assert result[0][0] == "session" + assert result[0][1].value == "abc123" + assert result[1][0] == "path" + assert result[1][1].value == "/test" + assert result[2][0] == "domain" + assert result[2][1].value == 
".example.com" + assert result[3][0] == "secure" + assert result[3][1].value == "yes" + assert result[4][0] == "httponly" + assert result[4][1].value == "true" + + +def test_parse_cookie_header_empty_value() -> None: + """Test parse_cookie_header with empty cookie values.""" + header = "empty=; name=value; also_empty=" + + result = parse_cookie_header(header) + + assert len(result) == 3 + assert result[0][0] == "empty" + assert result[0][1].value == "" + assert result[1][0] == "name" + assert result[1][1].value == "value" + assert result[2][0] == "also_empty" + assert result[2][1].value == "" + + +def test_parse_cookie_header_spaces() -> None: + """Test parse_cookie_header handles spaces correctly.""" + header = "name1=value1 ; name2=value2 ; name3=value3" + + result = parse_cookie_header(header) + + assert len(result) == 3 + assert result[0][0] == "name1" + assert result[0][1].value == "value1" + assert result[1][0] == "name2" + assert result[1][1].value == "value2" + assert result[2][0] == "name3" + assert result[2][1].value == "value3" + + +def test_parse_cookie_header_encoded_values() -> None: + """Test parse_cookie_header preserves encoded values.""" + header = "encoded=hello%20world; url=https%3A%2F%2Fexample.com" + + result = parse_cookie_header(header) + + assert len(result) == 2 + assert result[0][0] == "encoded" + assert result[0][1].value == "hello%20world" + assert result[1][0] == "url" + assert result[1][1].value == "https%3A%2F%2Fexample.com" + + +def test_parse_cookie_header_malformed() -> None: + """Test parse_cookie_header handles malformed input.""" + # Missing value + header = "name1=value1; justname; name2=value2" + + result = parse_cookie_header(header) + + # Parser accepts cookies without values (empty value) + assert len(result) == 3 + assert result[0][0] == "name1" + assert result[0][1].value == "value1" + assert result[1][0] == "justname" + assert result[1][1].value == "" + assert result[2][0] == "name2" + assert result[2][1].value == 
"value2" + + # Missing name + header = "=value; name=value2" + result = parse_cookie_header(header) + assert len(result) == 2 + assert result[0][0] == "=value" + assert result[0][1].value == "" + assert result[1][0] == "name" + assert result[1][1].value == "value2" + + +def test_parse_cookie_header_complex_quoted() -> None: + """Test parse_cookie_header with complex quoted values.""" + header = 'session="abc;xyz"; data="value;with;multiple;semicolons"; simple=unquoted' + + result = parse_cookie_header(header) + + assert len(result) == 3 + assert result[0][0] == "session" + assert result[0][1].value == "abc;xyz" + assert result[1][0] == "data" + assert result[1][1].value == "value;with;multiple;semicolons" + assert result[2][0] == "simple" + assert result[2][1].value == "unquoted" + + +def test_parse_cookie_header_unmatched_quotes() -> None: + """Test parse_cookie_header handles unmatched quotes.""" + header = 'cookie1=value1; cookie2="unmatched; cookie3=value3' + + result = parse_cookie_header(header) + + # Should parse all cookies correctly + assert len(result) == 3 + assert result[0][0] == "cookie1" + assert result[0][1].value == "value1" + assert result[1][0] == "cookie2" + assert result[1][1].value == '"unmatched' + assert result[2][0] == "cookie3" + assert result[2][1].value == "value3" + + +def test_parse_cookie_header_vs_parse_set_cookie_headers() -> None: + """Test difference between parse_cookie_header and parse_set_cookie_headers.""" + # Cookie header with attribute-like pairs + cookie_header = "session=abc123; path=/test; secure=yes" + + # parse_cookie_header treats all as cookies + cookie_result = parse_cookie_header(cookie_header) + assert len(cookie_result) == 3 + assert cookie_result[0][0] == "session" + assert cookie_result[0][1].value == "abc123" + assert cookie_result[1][0] == "path" + assert cookie_result[1][1].value == "/test" + assert cookie_result[2][0] == "secure" + assert cookie_result[2][1].value == "yes" + + # parse_set_cookie_headers 
would treat path and secure as attributes + set_cookie_result = parse_set_cookie_headers([cookie_header]) + assert len(set_cookie_result) == 1 + assert set_cookie_result[0][0] == "session" + assert set_cookie_result[0][1].value == "abc123" + assert set_cookie_result[0][1]["path"] == "/test" + # secure with any value is treated as boolean True + assert set_cookie_result[0][1]["secure"] is True + + +def test_parse_cookie_header_compatibility_with_simple_cookie() -> None: + """Test parse_cookie_header output works with SimpleCookie.""" + header = "session=abc123; user=john; token=xyz789" + + # Parse with our function + parsed = parse_cookie_header(header) + + # Create SimpleCookie and update with our results + sc = SimpleCookie() + sc.update(parsed) + + # Verify all cookies are present + assert len(sc) == 3 + assert sc["session"].value == "abc123" + assert sc["user"].value == "john" + assert sc["token"].value == "xyz789" + + +def test_parse_cookie_header_real_world_examples() -> None: + """Test parse_cookie_header with real-world Cookie headers.""" + # Google Analytics style + header = "_ga=GA1.2.1234567890.1234567890; _gid=GA1.2.0987654321.0987654321" + result = parse_cookie_header(header) + assert len(result) == 2 + assert result[0][0] == "_ga" + assert result[0][1].value == "GA1.2.1234567890.1234567890" + assert result[1][0] == "_gid" + assert result[1][1].value == "GA1.2.0987654321.0987654321" + + # Session cookies + header = "PHPSESSID=abc123def456; csrf_token=xyz789; logged_in=true" + result = parse_cookie_header(header) + assert len(result) == 3 + assert result[0][0] == "PHPSESSID" + assert result[0][1].value == "abc123def456" + assert result[1][0] == "csrf_token" + assert result[1][1].value == "xyz789" + assert result[2][0] == "logged_in" + assert result[2][1].value == "true" + + # Complex values with proper quoting + header = r'preferences="{\"theme\":\"dark\",\"lang\":\"en\"}"; session_data=eyJhbGciOiJIUzI1NiJ9' + result = parse_cookie_header(header) + 
assert len(result) == 2 + assert result[0][0] == "preferences" + assert result[0][1].value == '{"theme":"dark","lang":"en"}' + assert result[1][0] == "session_data" + assert result[1][1].value == "eyJhbGciOiJIUzI1NiJ9" + + +def test_parse_cookie_header_issue_7993() -> None: + """Test parse_cookie_header handles issue #7993 correctly.""" + # This specific case from issue #7993 + header = 'foo=bar; baz="qux; foo2=bar2' + + result = parse_cookie_header(header) + + # All cookies should be parsed + assert len(result) == 3 + assert result[0][0] == "foo" + assert result[0][1].value == "bar" + assert result[1][0] == "baz" + assert result[1][1].value == '"qux' + assert result[2][0] == "foo2" + assert result[2][1].value == "bar2" + + +def test_parse_cookie_header_illegal_names(caplog: pytest.LogCaptureFixture) -> None: + """Test parse_cookie_header warns about illegal cookie names.""" + # Cookie name with comma (not allowed in _COOKIE_NAME_RE) + header = "good=value; invalid,cookie=bad; another=test" + result = parse_cookie_header(header) + # Should skip the invalid cookie but continue parsing + assert len(result) == 2 + assert result[0][0] == "good" + assert result[0][1].value == "value" + assert result[1][0] == "another" + assert result[1][1].value == "test" + assert "Can not load cookie: Illegal cookie name 'invalid,cookie'" in caplog.text + + @pytest.mark.parametrize( ("input_str", "expected"), [ @@ -1225,45 +1572,3 @@ def test_unquote_compatibility_with_simplecookie(test_value: str) -> None: f"our={_unquote(test_value)!r}, " f"SimpleCookie={simplecookie_unquote(test_value)!r}" ) - - -@pytest.mark.parametrize( - ("header", "expected_name", "expected_value", "expected_coded"), - [ - # Test cookie values with octal escape sequences - (r'name="\012newline\012"', "name", "\nnewline\n", r'"\012newline\012"'), - ( - r'tab="\011separated\011values"', - "tab", - "\tseparated\tvalues", - r'"\011separated\011values"', - ), - ( - r'mixed="hello\040world\041"', - "mixed", - "hello 
world!", - r'"hello\040world\041"', - ), - ( - r'complex="\042quoted\042 text with \012 newline"', - "complex", - '"quoted" text with \n newline', - r'"\042quoted\042 text with \012 newline"', - ), - ], -) -def test_parse_cookie_headers_uses_unquote_with_octal( - header: str, expected_name: str, expected_value: str, expected_coded: str -) -> None: - """Test that parse_cookie_headers correctly unquotes values with octal sequences and preserves coded_value.""" - result = parse_cookie_headers([header]) - - assert len(result) == 1 - name, morsel = result[0] - - # Check that octal sequences were properly decoded in the value - assert name == expected_name - assert morsel.value == expected_value - - # Check that coded_value preserves the original quoted string - assert morsel.coded_value == expected_coded diff --git a/tests/test_web_request.py b/tests/test_web_request.py index bac910ac0af..51d6e1b108b 100644 --- a/tests/test_web_request.py +++ b/tests/test_web_request.py @@ -407,27 +407,35 @@ def test_request_cookies_quoted_values() -> None: def test_request_cookies_with_attributes() -> None: - """Test that cookie attributes don't affect value parsing. + """Test that cookie attributes are parsed as cookies per RFC 6265. - Related to issue #5397 - ensures that the presence of domain or other - attributes doesn't change how cookie values are parsed. + Per RFC 6265 Section 5.4, Cookie headers contain only name-value pairs. + Names that match attribute names (Domain, Path, etc.) should be treated + as regular cookies, not as attributes. 
""" - # Cookie with domain attribute - quotes should still be removed + # Cookie with domain - both should be parsed as cookies headers = CIMultiDict(COOKIE='sess="quoted_value"; Domain=.example.com') req = make_mocked_request("GET", "/", headers=headers) - assert req.cookies == {"sess": "quoted_value"} + assert req.cookies == {"sess": "quoted_value", "Domain": ".example.com"} - # Cookie with multiple attributes + # Cookie with multiple attribute names - all parsed as cookies headers = CIMultiDict(COOKIE='token="abc123"; Path=/; Secure; HttpOnly') req = make_mocked_request("GET", "/", headers=headers) - assert req.cookies == {"token": "abc123"} + assert req.cookies == {"token": "abc123", "Path": "/", "Secure": "", "HttpOnly": ""} - # Multiple cookies with different attributes + # Multiple cookies with attribute names mixed in headers = CIMultiDict( COOKIE='c1="v1"; Domain=.example.com; c2="v2"; Path=/api; c3=v3; Secure' ) req = make_mocked_request("GET", "/", headers=headers) - assert req.cookies == {"c1": "v1", "c2": "v2", "c3": "v3"} + assert req.cookies == { + "c1": "v1", + "Domain": ".example.com", + "c2": "v2", + "Path": "/api", + "c3": "v3", + "Secure": "", + } def test_match_info() -> None: