[PR #8948/cc6d7632 backport][3.10] Make n argument clearer (#8949)
Co-authored-by: Sam Bull <git@sambull.org>
patchback[bot] and Dreamsorcerer authored Aug 30, 2024
1 parent 181c042 commit a40dbad
Showing 1 changed file with 15 additions and 3 deletions: docs/streams.rst
@@ -26,13 +26,17 @@ Reading Methods
.. method:: StreamReader.read(n=-1)
:async:

- Read up to *n* bytes. If *n* is not provided, or set to ``-1``, read until
- EOF and return all read bytes.
+ Read up to a maximum of *n* bytes. If *n* is not provided, or set to ``-1``,
+ read until EOF and return all read bytes.

+ When *n* is provided, data will be returned as soon as it is available.
+ Therefore it will return less than *n* bytes if there are less than *n*
+ bytes in the buffer.

If the EOF was received and the internal buffer is empty, return an
empty bytes object.

- :param int n: how many bytes to read, ``-1`` for the whole stream.
+ :param int n: maximum number of bytes to read, ``-1`` for the whole stream.

:return bytes: the given data
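
The return-early behaviour this hunk documents can be sketched with the
stdlib's ``asyncio.StreamReader``, whose ``read(n)`` has analogous
semantics; this is an illustration under that assumption, not aiohttp's
own ``StreamReader``:

```python
import asyncio

async def demo():
    # asyncio.StreamReader.read(n) mirrors the semantics documented
    # above: it returns as soon as data is available, so it may return
    # fewer than n bytes; at EOF with an empty buffer it returns b"".
    reader = asyncio.StreamReader()
    reader.feed_data(b"abc")       # only 3 bytes are buffered
    reader.feed_eof()

    first = await reader.read(10)  # up to 10 requested, 3 available
    rest = await reader.read(10)   # EOF + empty buffer
    return first, rest

first, rest = asyncio.run(demo())
print(first, rest)  # b'abc' b''
```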

@@ -127,6 +131,14 @@ size limit and over any available data.
async for data in response.content.iter_chunked(1024):
print(data)

+ To get chunks that are exactly *n* bytes, you could use the
+ `asyncstdlib.itertools <https://asyncstdlib.readthedocs.io/en/stable/source/api/itertools.html>`_
+ module::
+
+     chunks = batched(chain.from_iterable(response.content.iter_chunked(n)), n)
+     async for data in chunks:
+         print(data)
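
The effect of that recipe can also be sketched without ``asyncstdlib``,
using a plain async generator; ``rebatch`` and ``uneven`` below are
illustrative names, not part of aiohttp:

```python
import asyncio

async def rebatch(chunks, n):
    # Same effect as batched(chain.from_iterable(chunks), n): accumulate
    # bytes and re-slice into exactly-n-byte chunks (the final chunk may
    # be shorter).
    buf = bytearray()
    async for chunk in chunks:
        buf += chunk
        while len(buf) >= n:
            yield bytes(buf[:n])
            del buf[:n]
    if buf:
        yield bytes(buf)

async def uneven():
    # Stand-in for response.content.iter_chunked(n), which may yield
    # chunks smaller than n.
    for piece in (b"ab", b"cdef", b"g"):
        yield piece

async def main():
    return [c async for c in rebatch(uneven(), 3)]

result = asyncio.run(main())
print(result)  # [b'abc', b'def', b'g']
```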

.. method:: StreamReader.iter_any()
:async:
