
Bump the pip group across 1 directory with 5 updates#1

Open
dependabot[bot] wants to merge 1 commit into main from dependabot/pip/pip-ac521fad79

Conversation


dependabot[bot] commented on behalf of GitHub on Apr 7, 2025

Bumps the pip group with 5 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| langchain | 0.0.335 | 0.2.5 |
| requests | 2.31.0 | 2.32.2 |
| pillow | 10.0.0 | 10.3.0 |
| torch | 2.1.0 | 2.4.0 |
| transformers | 4.35.0 | 4.48.0 |

Updates langchain from 0.0.335 to 0.2.5

Release notes

Sourced from langchain's releases.

langchain-mistralai==0.2.5

Changes since langchain-mistralai==0.2.4

  • mistral[patch]: release 0.2.5 (#29463)
  • mistralai: support method="json_schema" in structured output (#29461)
  • [Doc] Improve api doc (#29324)
  • Revert "integrations[patch]: remove non-required chat param defaults" (#29048)
  • integrations[patch]: remove non-required chat param defaults (#26730)
  • partners: allow to set Prefix in AIMessage (for MistralAI) (#28846)

langchain-groq==0.2.5

Changes since langchain-groq==0.2.4

  • groq[patch]: release 0.2.5 (#30168)
  • groq[patch]: warn if model is not specified (#30161)
  • core[patch]: update structured output tracing (#30123)
  • core: basemessage.text() (#29078)
  • groq[patch]: remove xfails (#29794)
  • multiple: fix uv path deps (#29790)
  • infra: add UV_FROZEN to makefiles (#29642)
  • infra: migrate to uv (#29566)

langchain-groq==0.2.4

Changes since langchain-groq==0.2.3

  • partners/groq: release 0.2.4 (#29488)
  • multiple: structured output tracing standard metadata (#29421)
  • groq[patch]: update model used in test (#29441)
  • docs: groq api key links (#29402)

langchain-ollama==0.2.3

Changes since langchain-ollama==0.2.2

  • partners/ollama: release 0.2.3 (#29489)
  • multiple: structured output tracing standard metadata (#29421)
  • [feat] Added backwards compatibility for OllamaEmbeddings initialization (migration from langchain_community.embeddings to langchain_ollama.embeddings) (#29296)

langchain-pinecone==0.2.3

Changes since langchain-pinecone==0.2.2

  • pinecone[patch]: release 0.2.3 (#29741)
  • pinecone[patch]: add support for python 3.13 (#29737)
  • infra: add UV_FROZEN to makefiles (#29642)
  • infra: migrate to uv (#29566)

langchain-chroma==0.2.2

Changes since langchain-chroma==0.2.1

  • chroma[patch]: release 0.2.2 (#29769)
  • (Chroma): Small Fix in add_texts when checking for embeddings (#29766)

... (truncated)

Commits

Updates requests from 2.31.0 to 2.32.2

Release notes

Sourced from requests's releases.

v2.32.2

2.32.2 (2024-05-21)

Deprecations

  • To provide a more stable migration for custom HTTPAdapters impacted by the CVE changes in 2.32.0, we've renamed _get_connection to a new public API, get_connection_with_tls_context. Existing custom HTTPAdapters will need to migrate their code to use this new API. get_connection is considered deprecated in all versions of Requests>=2.32.0.

    A minimal (2-line) example has been provided in the linked PR to ease migration, but we strongly urge users to evaluate if their custom adapter is subject to the same issue described in CVE-2024-35195. (#6710)
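    As a rough illustration of the migration path, a custom adapter that previously overrode the private _get_connection hook can override the new public method instead. This is a minimal sketch assuming requests >= 2.32.0; the LoggingAdapter class and its behaviour are hypothetical examples, not code from this PR.

    from requests.adapters import HTTPAdapter

    class LoggingAdapter(HTTPAdapter):
        # Hypothetical adapter: before 2.32.0 this customization point was the
        # private _get_connection(); overriding that name is now deprecated and
        # bypasses the TLS-context handling added for CVE-2024-35195.
        def get_connection_with_tls_context(self, request, verify, proxies=None, cert=None):
            conn = super().get_connection_with_tls_context(
                request, verify, proxies=proxies, cert=cert
            )
            print(f"connection pool selected for {request.url}")  # illustrative only
            return conn

    A session would then mount it as usual, e.g. session.mount("https://", LoggingAdapter()).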

v2.32.1

2.32.1 (2024-05-20)

Bugfixes

  • Add missing test certs to the sdist distributed on PyPI.

v2.32.0

2.32.0 (2024-05-20)

🐍 PYCON US 2024 EDITION 🐍

Security

  • Fixed an issue where setting verify=False on the first request from a Session will cause subsequent requests to the same origin to also ignore cert verification, regardless of the value of verify. (GHSA-9wx4-h78v-vm56)
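    For context, a rough sketch of the pattern this fix guards against (the host is a placeholder; the described behaviour applies to requests < 2.32.0):

    import requests

    s = requests.Session()
    # First request explicitly disables certificate verification.
    s.get("https://example.org", verify=False)
    # Before 2.32.0 the pooled, unverified connection could be reused here, so
    # this request also skipped verification even though verify defaults to True.
    # From 2.32.0 the adapter derives the pool from the request's TLS context instead.
    s.get("https://example.org")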

Improvements

  • verify=True now reuses a global SSLContext which should improve request time variance between first and subsequent requests. It should also minimize certificate load time on Windows systems when using a Python version built with OpenSSL 3.x. (#6667)
  • Requests now supports optional use of character detection (chardet or charset_normalizer) when repackaged or vendored. This enables pip and other projects to minimize their vendoring surface area. The Response.text() and apparent_encoding APIs will default to utf-8 if neither library is present. (#6702)

Bugfixes

  • Fixed bug in length detection where emoji length was incorrectly calculated in the request content-length. (#6589)
  • Fixed deserialization bug in JSONDecodeError. (#6629)
  • Fixed bug where an extra leading / (path separator) could lead urllib3 to unnecessarily reparse the request URI. (#6644)

... (truncated)

Changelog

Sourced from requests's changelog.

2.32.2 (2024-05-21)

Deprecations

  • To provide a more stable migration for custom HTTPAdapters impacted by the CVE changes in 2.32.0, we've renamed _get_connection to a new public API, get_connection_with_tls_context. Existing custom HTTPAdapters will need to migrate their code to use this new API. get_connection is considered deprecated in all versions of Requests>=2.32.0.

    A minimal (2-line) example has been provided in the linked PR to ease migration, but we strongly urge users to evaluate if their custom adapter is subject to the same issue described in CVE-2024-35195. (#6710)

2.32.1 (2024-05-20)

Bugfixes

  • Add missing test certs to the sdist distributed on PyPI.

2.32.0 (2024-05-20)

Security

  • Fixed an issue where setting verify=False on the first request from a Session will cause subsequent requests to the same origin to also ignore cert verification, regardless of the value of verify. (GHSA-9wx4-h78v-vm56)

Improvements

  • verify=True now reuses a global SSLContext which should improve request time variance between first and subsequent requests. It should also minimize certificate load time on Windows systems when using a Python version built with OpenSSL 3.x. (#6667)
  • Requests now supports optional use of character detection (chardet or charset_normalizer) when repackaged or vendored. This enables pip and other projects to minimize their vendoring surface area. The Response.text() and apparent_encoding APIs will default to utf-8 if neither library is present. (#6702)

Bugfixes

  • Fixed bug in length detection where emoji length was incorrectly calculated in the request content-length. (#6589)
  • Fixed deserialization bug in JSONDecodeError. (#6629)
  • Fixed bug where an extra leading / (path separator) could lead urllib3 to unnecessarily reparse the request URI. (#6644)

Deprecations

... (truncated)

Commits
  • 88dce9d v2.32.2
  • c98e4d1 Merge pull request #6710 from nateprewitt/api_rename
  • 92075b3 Add deprecation warning
  • aa1461b Move _get_connection to get_connection_with_tls_context
  • 970e8ce v2.32.1
  • d6ebc4a v2.32.0
  • 9a40d12 Avoid reloading root certificates to improve concurrent performance (#6667)
  • 0c030f7 Merge pull request #6702 from nateprewitt/no_char_detection
  • 555b870 Allow character detection dependencies to be optional in post-packaging steps
  • d6dded3 Merge pull request #6700 from franekmagiera/update-redirect-to-invalid-uri-test
  • Additional commits viewable in compare view

Updates pillow from 10.0.0 to 10.3.0

Release notes

Sourced from pillow's releases.

10.3.0

https://pillow.readthedocs.io/en/stable/releasenotes/10.3.0.html

Deprecations

  • Deprecate eval(), replacing it with lambda_eval() and unsafe_eval() #7927 [@hugovk] (see the sketch after this list)
  • Deprecate ImageCms constants and versions() function #7702 [@nulano]
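
For reference, a minimal sketch of the replacement API, assuming Pillow >= 10.3.0; the file names are placeholders and the two images are assumed to share the same size and mode.

from PIL import Image, ImageMath

with Image.open("image1.jpg") as im1, Image.open("image2.jpg") as im2:
    # lambda_eval evaluates a Python callable over an environment dict instead of
    # eval()-ing a string expression, so no arbitrary code is executed.
    out = ImageMath.lambda_eval(
        lambda args: args["convert"](args["min"](args["a"], args["b"]), "L"),
        a=im1,
        b=im2,
    )
    out.save("darker.png")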

Changes

... (truncated)

Changelog

Sourced from pillow's changelog.

10.3.0 (2024-04-01)

  • CVE-2024-28219: Use strncpy to avoid buffer overflow #7928 [radarhere, hugovk]

  • Deprecate eval(), replacing it with lambda_eval() and unsafe_eval() #7927 [radarhere, hugovk]

  • Raise ValueError if seeking to greater than offset-sized integer in TIFF #7883 [radarhere]

  • Add --report argument to __main__.py to omit supported formats #7818 [nulano, radarhere, hugovk]

  • Added RGB to I;16, I;16L, I;16B and I;16N conversion #7918, #7920 [radarhere]

  • Fix editable installation with custom build backend and configuration options #7658 [nulano, radarhere]

  • Fix putdata() for I;16N on big-endian #7209 [Yay295, hugovk, radarhere]

  • Determine MPO size from markers, not EXIF data #7884 [radarhere]

  • Improved conversion from RGB to RGBa, LA and La #7888 [radarhere]

  • Support FITS images with GZIP_1 compression #7894 [radarhere]

  • Use I;16 mode for 9-bit JPEG 2000 images #7900 [scaramallion, radarhere]

  • Raise ValueError if kmeans is negative #7891 [radarhere]

  • Remove TIFF tag OSUBFILETYPE when saving using libtiff #7893 [radarhere]

  • Raise ValueError for negative values when loading P1-P3 PPM images #7882 [radarhere]

  • Added reading of JPEG2000 palettes #7870 [radarhere]

  • Added alpha_quality argument when saving WebP images #7872 [radarhere]

... (truncated)

Commits
  • 5c89d88 10.3.0 version bump
  • 63cbfcf Update CHANGES.rst [ci skip]
  • 2776126 Merge pull request #7928 from python-pillow/lcms
  • aeb51cb Merge branch 'main' into lcms
  • 5beb0b6 Update CHANGES.rst [ci skip]
  • cac6ffa Merge pull request #7927 from python-pillow/imagemath
  • f5eeeac Name as 'options' in lambda_eval and unsafe_eval, but '_dict' in deprecated eval
  • facf3af Added release notes
  • 2a93aba Use strncpy to avoid buffer overflow
  • a670597 Update CHANGES.rst [ci skip]
  • Additional commits viewable in compare view

Updates torch from 2.1.0 to 2.4.0

Release notes

Sourced from torch's releases.

PyTorch 2.4: Python 3.12, AOTInductor freezing, libuv backend for TCPStore

PyTorch 2.4 Release Notes

  • Highlights
  • Tracked Regressions
  • Backward incompatible changes
  • Deprecations
  • New features
  • Improvements
  • Bug Fixes
  • Performance
  • Documentation
  • Developers
  • Security

Highlights

We are excited to announce the release of PyTorch® 2.4! PyTorch 2.4 adds support for the latest version of Python (3.12) for torch.compile. AOTInductor freezing gives developers running AOTInductor more performance-based optimizations by allowing the serialization of MKLDNN weights. As well, a new default TCPStore server backend utilizing libuv has been introduced which should significantly reduce initialization times for users running large-scale jobs. Finally, a new Python Custom Operator API makes it easier than before to integrate custom kernels into PyTorch, especially for torch.compile.
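
As a rough illustration of the Python Custom Operator API mentioned above (a sketch assuming torch >= 2.4; the "mylib" namespace and the NumPy kernel are illustrative assumptions, not code from this release):

import numpy as np
import torch

@torch.library.custom_op("mylib::numpy_sin", mutates_args=())
def numpy_sin(x: torch.Tensor) -> torch.Tensor:
    # The body can call arbitrary Python/NumPy; torch.compile treats the op as opaque.
    return torch.from_numpy(np.sin(x.numpy(force=True)))

@numpy_sin.register_fake
def _(x):
    # Shape/dtype propagation so the op can be traced without running the kernel.
    return torch.empty_like(x)

compiled = torch.compile(lambda t: numpy_sin(t) * 2)
print(compiled(torch.linspace(0.0, 3.14, 8)))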

This release is composed of 3661 commits and 475 contributors since PyTorch 2.3. We want to sincerely thank our dedicated community for your contributions. As always, we encourage you to try these out and report any issues as we improve 2.4. More information about how to get started with the PyTorch 2-series can be found at our Getting Started page.

... (truncated)

Commits

Updates transformers from 4.35.0 to 4.48.0

Release notes

Sourced from transformers's releases.

v4.48.0: ModernBERT, Aria, TimmWrapper, ColPali, Falcon3, Bamba, VitPose, DinoV2 w/ Registers, Emu3, Cohere v2, TextNet, DiffLlama, PixtralLarge, Moonshine

New models

ModernBERT

The ModernBert model was proposed in Smarter, Better, Faster, Longer: A Modern Bidirectional Encoder for Fast, Memory Efficient, and Long Context Finetuning and Inference by Benjamin Warner, Antoine Chaffin, Benjamin Clavié, Orion Weller, Oskar Hallström, Said Taghadouini, Alexis Gallagher, Raja Biswas, Faisal Ladhak, Tom Aarsen, Nathan Cooper, Griffin Adams, Jeremy Howard and Iacopo Poli.

It is a refresh of the traditional encoder architecture, as used in previous models such as BERT and RoBERTa.

It builds on BERT and implements many modern architectural improvements which have been developed since its original release, such as:

  • Rotary Positional Embeddings to support sequences of up to 8192 tokens.
  • Unpadding to ensure no compute is wasted on padding tokens, speeding up processing time for batches with mixed-length sequences.
  • GeGLU layers replacing the original MLP layers, shown to improve performance.
  • Alternating Attention where most attention layers employ a sliding window of 128 tokens, with Global Attention only used every 3 layers.
  • Flash Attention to speed up processing.
  • A model designed following the recent paper The Case for Co-Designing Model Architectures with Hardware, ensuring maximum efficiency across inference GPUs.
  • Modern training data scales (2 trillion tokens) and mixtures (including code and math data).
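
A minimal loading sketch for masked-language modelling with transformers >= 4.48.0; the checkpoint id below is the Answer.AI release and is an assumption here, not taken from this PR.

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "answerdotai/ModernBERT-base"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the top prediction for the [MASK] position.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()
print(tokenizer.decode(int(logits[0, mask_pos].argmax())))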


Aria

The Aria model was proposed in Aria: An Open Multimodal Native Mixture-of-Experts Model by Li et al. from the Rhymes.AI team.

Aria is an open multimodal-native model with best-in-class performance across a wide range of multimodal, language, and coding tasks. It has a Mixture-of-Experts architecture, with 3.9B and 3.5B activated parameters per visual token and per text token, respectively.

TimmWrapper

We add a TimmWrapper set of classes so that timm models can be loaded into the library as transformers models.

Here's a general usage example:

import torch
from urllib.request import urlopen
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

checkpoint = "timm/resnet50.a1_in1k"
img = Image.open(urlopen(
    "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png"
))

image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)

# The quoted snippet is cut off at this point; the usual classification
# forward pass follows.
inputs = image_processor(img, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore <dependency name> major version will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
  • @dependabot ignore <dependency name> minor version will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
  • @dependabot ignore <dependency name> will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
  • @dependabot unignore <dependency name> will remove all of the ignore conditions of the specified dependency
  • @dependabot unignore <dependency name> <ignore condition> will remove the ignore condition of the specified dependency and ignore conditions

You can disable automated security fix PRs for this repo from the Security Alerts page.

Bumps the pip group with 5 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| [langchain](https://github.com/langchain-ai/langchain) | `0.0.335` | `0.2.5` |
| [requests](https://github.com/psf/requests) | `2.31.0` | `2.32.2` |
| [pillow](https://github.com/python-pillow/Pillow) | `10.0.0` | `10.3.0` |
| [torch](https://github.com/pytorch/pytorch) | `2.1.0` | `2.4.0` |
| [transformers](https://github.com/huggingface/transformers) | `4.35.0` | `4.48.0` |



Updates `langchain` from 0.0.335 to 0.2.5
- [Release notes](https://github.com/langchain-ai/langchain/releases)
- [Commits](langchain-ai/langchain@v0.0.335...langchain==0.2.5)

Updates `requests` from 2.31.0 to 2.32.2
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](psf/requests@v2.31.0...v2.32.2)

Updates `pillow` from 10.0.0 to 10.3.0
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/main/CHANGES.rst)
- [Commits](python-pillow/Pillow@10.0.0...10.3.0)

Updates `torch` from 2.1.0 to 2.4.0
- [Release notes](https://github.com/pytorch/pytorch/releases)
- [Changelog](https://github.com/pytorch/pytorch/blob/main/RELEASE.md)
- [Commits](pytorch/pytorch@v2.1.0...v2.4.0)

Updates `transformers` from 4.35.0 to 4.48.0
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v4.35.0...v4.48.0)

---
updated-dependencies:
- dependency-name: langchain
  dependency-version: 0.2.5
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: requests
  dependency-version: 2.32.2
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: pillow
  dependency-version: 10.3.0
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: torch
  dependency-version: 2.4.0
  dependency-type: direct:production
  dependency-group: pip
- dependency-name: transformers
  dependency-version: 4.48.0
  dependency-type: direct:production
  dependency-group: pip
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Apr 7, 2025