urllib3's streaming API is designed for the efficient handling of large HTTP responses by reading the content in chunks, rather than loading the entire response body into memory at once.
When streaming a compressed response, urllib3 can perform decoding or decompression based on the HTTP Content-Encoding header (e.g., gzip, deflate, br, or zstd). The library must read compressed data from the network and decompress it until the requested chunk size is met. Any resulting decompressed data that exceeds the requested amount is held in an internal buffer for the next read operation.
The decompression logic could cause urllib3 to fully decode a small amount of highly compressed data in a single operation. This can result in excessive resource consumption (high CPU usage and massive memory allocation for the decompressed data; CWE-409) on the client side, even if the application only requested a small chunk of data.
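To make the buffering behavior concrete, the following is a simplified sketch of the decode-until-amt pattern described above, written against plain zlib. It is not urllib3's actual implementation, and raw_read stands in for a hypothetical network read.

```python
import zlib

def read_decoded(raw_read, decompressor, buffer: bytearray, amt: int) -> bytes:
    # Simplified sketch of the pre-2.6.0 pattern, not urllib3's real code:
    # keep pulling compressed bytes until `amt` decoded bytes are available.
    while len(buffer) < amt:
        compressed = raw_read(8192)
        if not compressed:
            break
        # A few KiB of bomb input can inflate to hundreds of MiB right here;
        # the CPU and memory are spent even though the caller wanted `amt`.
        buffer += decompressor.decompress(compressed)
    out = bytes(buffer[:amt])
    del buffer[:amt]  # surplus stays buffered for the next call
    return out

# Hypothetical wiring for a deflate-encoded body:
# read_decoded(sock.recv, zlib.decompressobj(), bytearray(), 256)
```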
Affected usages
Applications and libraries using urllib3 version 2.5.0 and earlier to stream large compressed responses or content from untrusted sources.
stream(), read(amt=256), read1(amt=256), read_chunked(amt=256), and readinto(b) are examples of urllib3.HTTPResponse method calls that use the affected logic unless decoding is explicitly disabled.
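As a concrete illustration of that last point, the sketch below streams a response with decoding disabled, so urllib3 hands back raw compressed bytes and the affected decode path is never exercised; the URL is hypothetical.

```python
import urllib3

http = urllib3.PoolManager()
resp = http.request(
    "GET",
    "https://example.com/big.json.gz",  # hypothetical endpoint
    preload_content=False,
    decode_content=False,  # opt out of automatic decompression
)
try:
    for chunk in resp.stream(256):
        ...  # chunks are raw, still-compressed bytes; decode them yourself
finally:
    resp.release_conn()
```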
Remediation
Upgrade to at least urllib3 v2.6.0, in which the library avoids decompressing data that exceeds the requested amount.
If your environment contains a package facilitating the Brotli encoding, upgrade to at least Brotli 1.2.0 or brotlicffi 1.2.0.0 too. These versions are enforced by the urllib3[brotli] extra in the patched versions of urllib3.
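If you are unsure which Brotli package, if any, an environment carries, a quick check along these lines may help; the package names and minimum versions come from this advisory, and the parsing assumes plain numeric version strings.

```python
from importlib import metadata

MINIMUMS = {"Brotli": (1, 2, 0), "brotlicffi": (1, 2, 0, 0)}

for package, minimum in MINIMUMS.items():
    try:
        installed = metadata.version(package)
    except metadata.PackageNotFoundError:
        continue  # not installed, nothing to upgrade
    parsed = tuple(int(part) for part in installed.split("."))
    if parsed < minimum:
        print(f"{package} {installed} predates the patched release")
```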
Credits
The issue was reported by @Cycloctane.
Supplemental information was provided by @stamparm during a security audit performed by 7ASecurity and facilitated by OSTIF.
urllib3 supports chained HTTP encoding algorithms for response content according to RFC 9110 (e.g., Content-Encoding: gzip, zstd).
However, the number of links in the decompression chain was unbounded, allowing a malicious server to insert a virtually unlimited number of compression steps and leading to high CPU usage and massive memory allocation for the decompressed data.
Affected usages
Applications and libraries using urllib3 version 2.5.0 and earlier for HTTP requests to untrusted sources unless they disable content decoding explicitly.
Remediation
Upgrade to at least urllib3 v2.6.0, in which the library limits the number of links to 5.
If upgrading is not immediately possible, use preload_content=False and ensure that resp.headers["content-encoding"] contains a safe number of encodings before reading the response content.
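A minimal sketch of that workaround, assuming the same limit of 5 that patched urllib3 enforces; the URL is hypothetical.

```python
import urllib3

MAX_ENCODINGS = 5  # mirrors the limit urllib3 2.6.0 enforces internally

http = urllib3.PoolManager()
resp = http.request("GET", "https://example.com/", preload_content=False)

encodings = [
    enc.strip()
    for enc in resp.headers.get("content-encoding", "").split(",")
    if enc.strip()
]
if len(encodings) > MAX_ENCODINGS:
    resp.release_conn()
    raise ValueError(f"refusing to decode {len(encodings)} chained encodings")

body = resp.read()  # decoding only happens once the header has been vetted
```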
Release Notes: urllib3 v2.6.0

Security
Fixed a security issue where the streaming API could improperly handle highly
compressed HTTP content ("decompression bombs") leading to excessive resource
consumption even when a small amount of data was requested. Reading small
chunks of compressed data is safer and much more efficient now.
(GHSA-2xpw-w6gg-jr37 <https://github.com/urllib3/urllib3/security/advisories/GHSA-2xpw-w6gg-jr37>__)
Fixed a security issue where an attacker could compose an HTTP response with
virtually unlimited links in the Content-Encoding header, potentially
leading to a denial of service (DoS) attack by exhausting system resources
during decoding. The number of allowed chained encodings is now limited to 5.
(GHSA-gm62-xv2j-4w53 <https://github.com/urllib3/urllib3/security/advisories/GHSA-gm62-xv2j-4w53>__)
.. caution::
If urllib3 is not installed with the optional urllib3[brotli] extra, but
your environment contains a Brotli/brotlicffi/brotlipy package anyway, make
sure to upgrade it to at least Brotli 1.2.0 or brotlicffi 1.2.0.0 to
benefit from the security fixes and avoid warnings. Prefer using urllib3[brotli] to install a compatible Brotli package automatically.
If you use custom decompressors, please make sure to update them to
respect the changed API of urllib3.response.ContentDecoder.
Features
Enabled retrieval, deletion, and membership testing in HTTPHeaderDict using bytes keys. (#​3653 <https://github.com/urllib3/urllib3/issues/3653>__)
Added host and port information to string representations of HTTPConnection. (#​3666 <https://github.com/urllib3/urllib3/issues/3666>__)
Added support for Python 3.14 free-threading builds explicitly. (#​3696 <https://github.com/urllib3/urllib3/issues/3696>__)
Removals
Removed the HTTPResponse.getheaders() method in favor of HTTPResponse.headers.
Removed the HTTPResponse.getheader(name, default) method in favor of HTTPResponse.headers.get(name, default). (#​3622 <https://github.com/urllib3/urllib3/issues/3622>__)
Bugfixes
Fixed redirect handling in urllib3.PoolManager when an integer is passed
for the retries parameter. (#​3649 <https://github.com/urllib3/urllib3/issues/3649>__)
Fixed HTTPConnectionPool when used in Emscripten with no explicit port. (#​3664 <https://github.com/urllib3/urllib3/issues/3664>__)
Fixed handling of SSLKEYLOGFILE with expandable variables. (#​3700 <https://github.com/urllib3/urllib3/issues/3700>__)
Misc
Changed the zstd extra to install backports.zstd instead of zstandard on Python 3.13 and before. (#​3693 <https://github.com/urllib3/urllib3/issues/3693>__)
Improved the performance of content decoding by optimizing BytesQueueBuffer class. (#​3710 <https://github.com/urllib3/urllib3/issues/3710>__)
Allowed building the urllib3 package with newer setuptools-scm v9.x. (#​3652 <https://github.com/urllib3/urllib3/issues/3652>__)
Ensured successful urllib3 builds by setting Hatchling requirement to >= 1.27.0. (#​3638 <https://github.com/urllib3/urllib3/issues/3638>__)
Configuration
📅 Schedule: Branch creation - "" (UTC), Automerge - At any time (no schedule defined).
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
Within HostnameError.Error(), when constructing an error string, there is no limit to the number of hosts that will be printed out. Furthermore, the error string is constructed by repeated string concatenation, leading to quadratic runtime. Therefore, a certificate provided by a malicious actor can result in excessive resource consumption.
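The quadratic blow-up is language-agnostic. The sketch below illustrates it in Python rather than the Go code in question, contrasting repeated concatenation with a single join; the same pattern underlies the ParseAddress and Reader.ReadResponse entries later in this list.

```python
import time

names = ["host%d.example.com" % i for i in range(20_000)]

def build_quadratic(parts):
    out = ""
    for part in parts:
        keep_alive = out   # second reference defeats CPython's in-place
        out = out + ", " + part  # append trick, so every step copies: O(n^2)
    return out

def build_linear(parts):
    return ", ".join(parts)  # sizes the result once, copies each part once

t0 = time.perf_counter()
build_quadratic(names)
t1 = time.perf_counter()
build_linear(names)
t2 = time.perf_counter()
print(f"concatenation: {t1 - t0:.3f}s, join: {t2 - t1:.3f}s")
```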
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.026%
EPSS Percentile
6th percentile
Description
The ParseAddress function constructs domain-literal address components through repeated string concatenation. When parsing large domain-literal components, this can cause excessive CPU consumption.
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.026%
EPSS Percentile
6th percentile
Description
The processing time for parsing some invalid inputs scales non-linearly with respect to the size of the input.
This affects programs which parse untrusted PEM inputs.
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.014%
EPSS Percentile
2nd percentile
Description
Validating certificate chains which contain DSA public keys can cause programs to panic, due to an interface cast that assumes they implement the Equal method.
This affects programs which validate arbitrary certificate chains.
Affected range
<1.24.9
Fixed version
1.24.9
EPSS Score
0.014%
EPSS Percentile
2nd percentile
Description
Due to the design of the name constraint checking algorithm, the processing time of some inputs scales non-linearly with respect to the size of the certificate.
This affects programs which validate arbitrary certificate chains.
Affected range
<1.24.11
Fixed version
1.24.11
EPSS Score
0.016%
EPSS Percentile
3rd percentile
Description
An excluded subdomain constraint in a certificate chain does not restrict the usage of wildcard SANs in the leaf certificate. For example a constraint that excludes the subdomain test.example.com does not prevent a leaf certificate from claiming the SAN *.example.com.
Affected range
>=1.24.0 <1.24.6
Fixed version
1.24.6
EPSS Score
0.020%
EPSS Percentile
4th percentile
Description
If the PATH environment variable contains paths which are executables (rather than just directories), passing certain strings ("", ".", and "..") to LookPath can result in the binaries listed in the PATH being unexpectedly returned.
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.025%
EPSS Percentile
6th percentile
Description
The Reader.ReadResponse function constructs a response string through repeated string concatenation of lines. When the number of lines in a response is large, this can cause excessive CPU consumption.
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.019%
EPSS Percentile
4th percentile
Description
When Conn.Handshake fails during ALPN negotiation, the error contains attacker-controlled information (the ALPN protocols sent by the client) which is not escaped.
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.025%
EPSS Percentile
6th percentile
Description
Despite HTTP headers having a default limit of 1MB, the number of cookies that can be parsed does not have a limit. By sending a lot of very small cookies such as "a=;", an attacker can make an HTTP server allocate a large number of structs, causing high memory consumption.
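Back-of-the-envelope arithmetic (ours, not the advisory's) shows the leverage: at the default 1 MiB header limit, the 3-byte fragment yields hundreds of thousands of parsed cookies from a single request.

```python
HEADER_LIMIT = 1 << 20   # assumed 1 MiB default header size limit
FRAGMENT = "a=;"         # the minimal cookie from the description above
print(HEADER_LIMIT // len(FRAGMENT), "cookie structs from one request")
# -> 349525
```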
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.033%
EPSS Percentile
9th percentile
Description
Parsing a maliciously crafted DER payload could allocate large amounts of memory, causing memory exhaustion.
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.025%
EPSS Percentile
6th percentile
Description
The Parse function permits values other than IPv6 addresses to be included in square brackets within the host component of a URL. RFC 3986 permits IPv6 addresses to be included within the host component, enclosed within square brackets. For example: "http://[::1]/". IPv4 addresses and hostnames must not appear within square brackets. Parse did not enforce this requirement.
Affected range
<1.24.8
Fixed version
1.24.8
EPSS Score
0.014%
EPSS Percentile
2nd percentile
Description
tar.Reader does not set a maximum size on the number of sparse region data blocks in GNU tar pax 1.0 sparse files. A maliciously-crafted archive containing a large number of sparse regions can cause a Reader to read an unbounded amount of data from the archive into memory. When reading from a compressed source, a small compressed input can result in large allocations.
Improper Handling of Length Parameter Inconsistency
Affected range
<1.21
Fixed version
1.21
CVSS Score
7.5
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
EPSS Score
0.802%
EPSS Percentile
73rd percentile
Description
When reading a specially crafted ZIP archive, Compress can be made to allocate large amounts of memory that finally leads to an out of memory error even for very small inputs. This could be used to mount a denial of service attack against services that use Compress' zip package.
Improper Handling of Length Parameter Inconsistency
Affected range
<1.21
Fixed version
1.21
CVSS Score
7.5
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
EPSS Score
1.437%
EPSS Percentile
80th percentile
Description
When reading a specially crafted TAR archive, Compress can be made to allocate large amounts of memory that finally leads to an out of memory error even for very small inputs. This could be used to mount a denial of service attack against services that use Compress' tar package.
Improper Handling of Length Parameter Inconsistency
Affected range
<1.21
Fixed version
1.21
CVSS Score
7.5
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
EPSS Score
1.893%
EPSS Percentile
83rd percentile
Description
When reading a specially crafted 7Z archive, Compress can be made to allocate large amounts of memory that finally leads to an out of memory error even for very small inputs. This could be used to mount a denial of service attack against services that use Compress' sevenz package.
Excessive Iteration
Affected range
<1.21
Fixed version
1.21
CVSS Score
7.5
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
EPSS Score
0.843%
EPSS Percentile
74th percentile
Description
When reading a specially crafted 7Z archive, the construction of the list of codecs that decompress an entry can result in an infinite loop. This could be used to mount a denial of service attack against services that use Compress' sevenz package.
Loop with Unreachable Exit Condition ('Infinite Loop')
Affected range
>=1.3 <1.26.0
Fixed version
1.26.0
CVSS Score
5.9
CVSS Vector
CVSS:3.1/AV:L/AC:H/PR:N/UI:N/S:C/C:N/I:N/A:H
EPSS Score
0.019%
EPSS Percentile
4th percentile
Description
Loop with Unreachable Exit Condition ('Infinite Loop') vulnerability in Apache Commons Compress. This issue affects Apache Commons Compress: from 1.3 through 1.25.0.
Users are recommended to upgrade to version 1.26.0 which fixes the issue.
When parsing unknown fields in the Protobuf Java Lite and Full library, a maliciously crafted message can cause a StackOverflow error and lead to a program crash.
Reporter: Alexis Challande, Trail of Bits Ecosystem Security Team [email protected]
Affected versions: This issue affects all versions of both the Java full and lite Protobuf runtimes, as well as Protobuf for Kotlin and JRuby, which themselves use the Java Protobuf runtime.
Severity
CVE-2024-7254, High, CVSS 4.0 score 8.7 (NOTE: there may be a delay in publication)
This is a potential Denial of Service. Parsing nested groups as unknown fields with DiscardUnknownFieldsParser or Java Protobuf Lite parser, or against Protobuf map fields, creates unbounded recursions that can be abused by an attacker.
Proof of Concept
For reproduction details, please refer to the unit tests (Protobuf Java LiteTest and CodedInputStreamTest) that identify the specific inputs that exercise this parsing weakness.
Remediation and Mitigation
We have been working diligently to address this issue and have released a mitigation that is available now. Please update to the latest available versions of the following packages:
The tokenizer incorrectly interprets tags with unquoted attribute values that end with a solidus character (/) as self-closing. When directly using Tokenizer, this can result in such tags incorrectly being marked as self-closing, and when using the Parse functions, this can result in content following such tags being placed in the wrong scope during DOM construction, but only when tags are in foreign content (e.g. <math>, <svg>, etc. contexts).
Misinterpretation of Input
Affected range
<0.36.0
Fixed version
0.36.0
CVSS Score
4.4
CVSS Vector
CVSS:3.1/AV:L/AC:L/PR:L/UI:N/S:U/C:L/I:N/A:L
EPSS Score
0.020%
EPSS Percentile
5th percentile
Description
Matching of hosts against proxy patterns can improperly treat an IPv6 zone ID as a hostname component. For example, when the NO_PROXY environment variable is set to "*.example.com", a request to "[::1%25.example.com]:80" will incorrectly match and not be proxied.
Uncontrolled Recursion vulnerability in Apache Commons Lang.
This issue affects Apache Commons Lang: commons-lang:commons-lang versions 2.0 through 2.6, and org.apache.commons:commons-lang3 versions 3.0 before 3.18.0.
The methods ClassUtils.getClass(...) can throw StackOverflowError on very long inputs. Because an Error is usually not handled by applications and libraries, a StackOverflowError could cause an application to stop.
Users are recommended to upgrade to version 3.18.0, which fixes the issue.
This PR contains the following updates:
urllib3 ==2.5.0 → ==2.6.0

urllib3 streaming API improperly handles highly compressed data
CVE-2025-66471 / GHSA-2xpw-w6gg-jr37
Severity
CVSS:4.0/AV:N/AC:L/AT:P/PR:N/UI:N/VC:N/VI:N/VA:H/SC:N/SI:N/SA:H
References
This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).
urllib3 allows an unbounded number of links in the decompression chain
CVE-2025-66418 / GHSA-gm62-xv2j-4w53
Severity
CVSS:4.0/AV:N/AC:L/AT:P/PR:N/UI:N/VC:N/VI:N/VA:H/SC:N/SI:N/SA:H
References
This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).
This PR was generated by Mend Renovate. View the repository job log.