
Trailer support in the API #981

Open
annevk opened this issue Dec 11, 2019 · 29 comments
Labels
addition/proposal (New features or enhancements) · needs implementer interest (Moving the issue forward requires implementers to express interest) · topic: api · topic: http

Comments

@annevk
Member

annevk commented Dec 11, 2019

I still think this is something we should do: it's part of HTTP, and with newer iterations of H/2 it's a feature that's a lot easier to make use of thanks to overall improved infrastructure.

My current thinking is that building this on top of #980 and #607 (as you can have multiple trailer header lists per request/response) is the way to go. FetchObserver could have a sendTrailer(Headers trailer) method and an ontrailer event handler, or some such. Details are probably best sorted out once there's firmer implementer interest.
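To make the shape concrete, here is a hypothetical IDL sketch of what such a surface might look like. Neither FetchObserver (#980) nor these members are specified anywhere; the names are taken from the comment above and are purely illustrative:

```webidl
// Purely illustrative; FetchObserver (#980) was never specified in detail.
partial interface FetchObserver {
  undefined sendTrailer(Headers trailer);  // send one trailer header list
  attribute EventHandler ontrailer;        // fired per received trailer list
};
```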

@annevk annevk added the addition/proposal, needs implementer interest, topic: http, and topic: api labels Dec 11, 2019
annevk added a commit that referenced this issue Dec 12, 2019
As noted in #772 (comment), the current way it is exposed subsets what HTTP supports and therefore does not feel like a good starting point, both for the internal and the public API.

Additionally, the public-facing API has not seen interest from implementers, at least for the past year and a half.

Tests: web-platform-tests/wpt#20720.

Closes #772. Follow-ups: #980 and #981.
@mlasak

mlasak commented Jun 11, 2021

We are in an unsatisfying situation. Using the fetch API with chunked transfer encoding, there is currently no way to get the time when reception of a chunk started; see whatwg/streams#1126.

When I stumbled upon the HTTP Trailer [1], I briefly believed we could solve our issue with it. The idea is to send a list of timestamps as a trailer, with information on when each individual chunk was put into the "pipe" on the server side. With some calculation we would then be able to compute the exact e2e network throughput.

But unfortunately, the fetch API doesn't support trailers.
If you need a use case: we need exact throughput calculation for low-latency video streaming.

Is there any chance of trailer support in the fetch API in the future?

[1] https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Trailer

@annevk
Member Author

annevk commented Jun 14, 2021

There's a chance; otherwise this would be closed. It does, however, require folks willing to implement it and to drive the associated work, which thus far has not materialized.

@mnot
Member

mnot commented Jun 14, 2021

Note that HTTP provides no guarantee that chunks are preserved end-to-end; furthermore, many (most?) HTTP APIs don't guarantee that chunk delineation is exposed accurately.

@davidben
Collaborator

I don't think that scenario is a good justification for adding trailers to the platform. It seems much better addressed by looking at timing and when bytes come out of the stream. Additionally, with the various layers in between JavaScript and HTTP-level chunking (TLS, QUIC, HTTP/2 and HTTP/3 frames, the browser's own mechanisms), guarantees about chunk boundaries wouldn't be coherent anyway.

@mlasak

mlasak commented Jun 15, 2021

Note that HTTP provides no guarantee that chunks are preserved end-to-end; furthermore, many (most?) HTTP APIs don't guarantee that chunk delineation is exposed accurately.

Yes, that's true, and exactly for this reason it is so incredibly difficult to calculate throughput on the receiver side (in JavaScript app logic). Surprisingly, accurate e2e chunk preservation is not needed here (and not what we are asking for); we just suggest that one should be able to add meta information at the end of a running chunked transfer, as the HTTP spec allows.

Please let me provide a simple example to clarify why trailers would be helpful.
Video is very often streamed at 30 frames per second. So ideally one frame is produced every 33 milliseconds and sent to the client in the form of, e.g., 2-second-long video segments. The major problem is that the production (aka idle) time and the transmission time are perceived jointly at the client.

On the server we know the timestamp when the frame was sent.
On the client we know the timestamp when the frame was received.
So, to calculate the e2e throughput correctly, we need this distributed timestamp information somehow, while keeping the requests stateless(!). Throughput information is needed for adaptive bitrate algorithms to work properly and to improve the viewer experience.

Regarding the comment by @davidben

looking at timing and when bytes come out of the stream

this is the way we currently go, and we mostly fail to calculate throughput correctly. What you get is, more or less, a network throughput that equals the (current) streamed video bitrate. Please elaborate a bit more if we missed your point, but please keep in mind the existence of idle times: chunks are NOT sent immediately one after another; there are production/idle times between them.

@acbegen

acbegen commented Jun 20, 2021

At 25 frames per second, you generate a single video frame every 40 ms and, let's assume, ship each frame as it is encoded. Large frames may take longer than 40 ms to transmit; smaller ones may take a millisecond or so, depending on the server's network connection. But that is not that important. What is important is the interarrival time of these chunks as they are received by the client. Each chunk has two associated times (t1 and t2). We know the chunk's size, and if we know delta t (t2 - t1), we can compute the chunk's throughput. But we know only t2, not t1 - we can only estimate it [1]. If that estimate is not good, the computed chunk throughput will be quite wrong, leading to incorrect bandwidth measurements. As a result, your adaptive streaming client will adapt incorrectly (mostly getting stuck at a bitrate lower than what you could get otherwise).
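As an illustrative sketch (not part of any proposal): if a trailer could carry per-chunk send timestamps (t1), the client could pair them with its own receive timestamps (t2) and compute per-chunk throughput that excludes server-side idle time. All field names and the data shape below are invented for illustration:

```javascript
// Hypothetical: `sentMs` (t1) would arrive in a trailer field sent by the
// server; `receivedMs` (t2) is recorded client-side as chunks come out of
// the body stream. Throughput is computed over transmission time only.
function chunkThroughputs(chunks) {
  return chunks.map(({ bytes, sentMs, receivedMs }) => {
    const deltaSec = (receivedMs - sentMs) / 1000; // t2 - t1, no idle time
    return (bytes * 8) / deltaSec;                 // bits per second
  });
}

const sample = [
  { bytes: 125000, sentMs: 0,  receivedMs: 100 },
  { bytes: 250000, sentMs: 40, receivedMs: 240 },
];
console.log(chunkThroughputs(sample)); // both chunks: 10,000,000 bits/s
```

Note that averaging total bytes over total wall-clock time would instead converge on the video bitrate, which is exactly the failure mode described above.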

What @mlasak is asking is whether there is a way to expose t1 in the API. The info is in there somewhere; it just needs to be exposed.

[1] Bandwidth Prediction in Low-Latency Chunked Streaming talk at https://mile-high.video/files/mhv2019/index.html

@vlovich

vlovich commented Sep 15, 2022

FWIW, S3 now provides checksums in HTTP trailers: https://docs.aws.amazon.com/AmazonS3/latest/userguide/checking-object-integrity.html

This way you can upload an object together with a checksum that is validated before the upload is published.
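For reference, an HTTP/1.1 chunked upload with a trailing checksum looks roughly like this on the wire (host, key, and checksum value are placeholders; over HTTP/2 the trailers travel in a trailing HEADERS frame instead):

```http
PUT /bucket/key HTTP/1.1
Host: example.s3.amazonaws.com
Transfer-Encoding: chunked
Trailer: x-amz-checksum-crc32

b
hello world
0
x-amz-checksum-crc32: <base64 CRC32 of the body>

```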

@JakeChampion

JakeChampion commented Jul 18, 2023

Howdy all,

I work at Fastly on their JavaScript runtime and SDK 👋
Fastly is interested in implementing HTTP Trailer support in order to support gRPC over HTTP/2.
I would like to implement HTTP Trailer support in a standards-compatible way, and would be happy to work with folks on bringing this functionality into the Fetch Standard.

What I am thinking as a potential solution is:

Updating the Response interface to include a new trailers method which returns a Promise<Headers> instance:

[SameObject] Promise<Headers> trailers();

Updating the ResponseInit interface to include a new trailers field which implements the HeadersInit interface

HeadersInit trailers;

And the same for both Request and RequestInit.

Which in an application context would look like this:

// Service-Worker API based example of reading a request trailer
// and responding with a response which contains the same trailer

addEventListener('fetch', event => event.respondWith(app(event)));

async function app(event) {
	const request = event.request;
	// Resolves when all trailer fields have been received and returns an instance of Headers
	const incomingTrailers = await request.trailers();
	const animal = incomingTrailers.get('animal');
	const response = new Response('I am a body', {
		headers: {
			'Trailer': 'animal'
		},
		trailers: {
			animal: animal
		}
	});
	return response;
}

@jasnell

jasnell commented Jul 19, 2023

I would definitely like to see the conversation around this progress, but I'm not sure the proposal is adequate here. In some cases, when sending trailers we do not actually want to calculate the value of a trailer until after the body payload has been fully sent (largely because we won't know what the full body is until it has been fully streamed). In these cases, timing exactly when and how to get the content of the trailing header can be tricky. For instance, in the Workers runtime (https://github.com/cloudflare/workerd), in many cases when streaming the body does not actually require JavaScript to run, we will move the entire stream out of the runtime and stop executing JavaScript altogether after the Response has been provided. In these cases it would be impossible for us to calculate a trailing header value based on the body payload content, which severely limits the use cases. If we know that we're going to send a trailer, we could avoid using this optimization, but at the cost of increased cost to the user from keeping the JS execution context around longer.

We use a similar optimization for certain Request objects that are received and are forwarded on to fetch subrequests. If the code attaches a continuation to the trailers promise as proposed here, it's not clear what should happen if the user code does, in fact, forward the request on, moving all remaining processing of that request out of JavaScript.

Now, I'm not suggesting that this be designed around the specific optimizations of the workers runtime; I'm just using those as an example. But if the content of the trailing header cannot be calculated in advance, and we have to provide a means of calculating it at the end of processing the payload, then we could end up surfacing implementation and timing details that are very inconsistent from one runtime to the next, or forcing such optimizations to be disabled entirely, which is a problem.

@jasnell

jasnell commented Jul 19, 2023

@JakeChampion:

const incomingTrailers = await request.trailers();

I assume that awaiting this would force the entire body payload to be cached in memory?

@kentonv

kentonv commented Jul 19, 2023

I would think the point of trailers is that they can be computed after streaming the response body, so specifying them to the response constructor wouldn't be what we want. If you know the trailer values at constructor time, you might as well make them headers instead.

Also, on the receiving side, if response.trailers() is invoked before the response body has been read, what happens to the body? Does it get buffered in RAM? I think it's important to design the API to avoid accidental buffering of entire responses.

I'm less concerned about the impact on deferred proxying in workerd -- this would just be a case where deferred proxying optimization can't take effect, similar to what happens when using the streams API to generate a response body.

@JakeChampion

@JakeChampion:

const incomingTrailers = await request.trailers();

I assume that awaiting this would force the entire body payload to be cached in memory?

I'd propose that it does not buffer, but instead consumes and discards the body if the body has not already been consumed. If the application wants the body, it can read the body before reading the trailers.

@kentonv

kentonv commented Jul 19, 2023

Maybe when calling the constructor, you specify a callback for trailers, which is invoked after the stream is done, and returns the trailer map at that point:

	const response = new Response('I am a body', {
		headers: {
			'Trailer': 'animal'
		},
		trailers: () => { return { animal: animal } }
	});

On the receiving end, invoking response.trailers() before reading the entire body is an error.

@jasnell

jasnell commented Jul 19, 2023

Keep in mind that these are reversed on client and server sides.

On the client side, a trailers callback would need to be provided in the RequestInit, while on the server-side, it needs to be on the ResponseInit.

On the receiving side, setting up an "on trailers" callback would avoid the issue of ordering when consuming the body.

// client side fetch api
const resp = await fetch('http://example.org', {
  headers: { 'trailers': 'foo' },
  // Called when sending the headers...
  trailers(headers) {
    headers.set('foo', 'bar');
  }
});

resp.ontrailers = (headers) => {
  // Called when trailers are received.
};
// server side fetch api
export default {
  async fetch(req) {
    req.ontrailers = (headers) => {
     // Called when trailers are received
    };
    // ...
   return new Response(stream, {
    headers: { 'trailers': 'foo' },
    trailers(headers) {
      headers.set('foo', 'bar');
    }
   });
  }
}

It feels rather clunky, though.

@vlovich

vlovich commented Jul 20, 2023 via email

@annevk
Member Author

annevk commented Jul 21, 2023

Thoughts:

  • Reading trailers: readonly attribute Promise<Headers> trailers; (no method; see below) makes sense for both Request and Response. I don't think we should reject if you call it before consuming the body. However, I don't think it should actively consume the body either. That seems very bad. Is it impractical to make them available before the body is consumed? If so, I guess we need to require body consumption before it will start returning something useful. Except perhaps for synthetic requests/responses.
  • Setting trailers: Promise<HeadersInit> trailers; makes sense for both Request and Response. I don't think we should reject if you resolve the promise early. Headers is not an expensive object to keep around.
  • Caching: not sure. Perhaps cached entries never have trailers?
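A hypothetical IDL sketch of the shape these bullets describe (nothing here is specified; note also that actual WebIDL disallows Promise-typed dictionary members, so real spec text would need a different formulation for the init side):

```webidl
// Receiving side: a promise that resolves once trailers arrive.
partial interface Request {
  readonly attribute Promise<Headers> trailers;
};
partial interface Response {
  readonly attribute Promise<Headers> trailers;
};
// Sending side (illustrative only; see the WebIDL caveat above):
partial dictionary RequestInit {
  Promise<HeadersInit> trailers;
};
partial dictionary ResponseInit {
  Promise<HeadersInit> trailers;
};
```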

@kentonv

kentonv commented Jul 21, 2023

Is it impractical to make them available before the body is consumed?

I think it is inherently impossible, yes -- unless the system buffers the body somewhere, but we don't want that, especially in edge proxies.

Promise<Headers> trailers() makes sense for both Request and Response

Would it make sense for this to be a property rather than a function? That provides nice symmetry between Request/Response and RequestInit/ResponseInit. Note that it's very common to use an instance of Request as a RequestInit, relying on the fact that all the same-named properties have the same type and semantics, so this might be more than a superficial difference.

@mnot
Member

mnot commented Jul 21, 2023

Caching: not sure. Perhaps cached entries never have trailers?

RFC9111:

Caches MAY either store trailer fields separate from header fields or discard them. Caches MUST NOT combine trailer fields with header fields.

@jasnell

jasnell commented Aug 3, 2023

I've started to outline a proposal that covers both trailer support and early hints support here: https://docs.google.com/document/d/1P4MskkFd3HHFPGDr01O3wdFXukmX9i0jAb5uh9v9x8Q/edit ... comments welcome in the doc

@jasnell

jasnell commented Oct 2, 2023

Just keeping the conversation going. Based on feedback on the doc I referenced above, I've iterated a bit more on an approach for trailers that should be workable.

Specifically, to send trailers, we can add a new trailers option to RequestInit and ResponseInit whose value is a promise resolving to Headers.

const resp = await connect('https://...', { method: 'POST', body: payload, trailers: Promise.resolve(trailingHeaders) });

When the underlying implementation has completely processed the payload and is ready to send trailing headers, it would await the trailers promise.

To receive trailers, we would add a new trailers attribute on both Request and Response.

resp.trailers.then((headers) => {
  // `headers` is a `Headers` object
});

@annevk
Member Author

annevk commented Oct 2, 2023

You mean that the sender would resolve the trailers promise with a HeadersInit? If so, this matches #981 (comment), right?

@MattMenke2
Contributor

MattMenke2 commented Oct 2, 2023

Caching: not sure. Perhaps cached entries never have trailers?

RFC9111:

Caches MAY either store trailer fields separate from header fields or discard them. Caches MUST NOT combine trailer fields with header fields.

Not having consistent behavior between browsers there sounds like a recipe for incompatibilities. If support for trailers is added to the web platform, I think the fetch spec needs to be clear on how browsers should handle them, even if caching proxies can behave differently (which also sounds like a major problem to me, as do caching proxies that never had to support trailers before potentially ignoring them, though I guess the increased prevalence of HTTPS should help mitigate that).

The Chrome networking team has historically pushed back pretty strongly on adding trailers because it's a pretty massive change to network APIs: a ton of stuff interposes on network requests in browsers, and much of it would need to be updated. (Edit: Also due to concerns about real-world utility, and the expectation that the disk cache should respect cache-related trailers.) Has that changed? That obviously doesn't mean work here shouldn't proceed, but it's a consideration if the goal here is broad adoption.

@jasnell

jasnell commented Oct 2, 2023

You mean that the sender would resolve the trailers promise with a HeadersInit? If so, this matches #981 (comment), right?

If I understand the comment correctly, yes, the change adopts your feedback. On the sending side, the trailers field in either RequestInit or ResponseInit resolves to a HeadersInit. On the receiving side, the trailers attribute on both Response and Request is a promise for a Headers.

@jasnell

jasnell commented Oct 3, 2023

Has that changed? That obviously doesn't mean work here shouldn't proceed, but it's a consideration if the goal here is broad adoption.

I definitely cannot speak for any of the browsers. What I do know is that for Node.js and Workers, we've had a number of requests to support trailers across a variety of cases; the most common are to support content-digest and server-timing as trailing headers.

And yeah, I think the concerns about it being difficult to add trailers, given how it impacts the underlying implementation, are absolutely valid. Supporting trailers in Workers is going to be a significant effort across multiple layers, so I can definitely understand the reluctance. Still, it would be worthwhile, I think.

@anonghuser

anonghuser commented Oct 23, 2023

Given the mutable nature of the Headers object, it may be possible to support trailers without any new APIs. Considering trailers are just delayed headers, would it not make sense to add them to Response.headers right before ending the body stream? This way there is no need for a separate event or promise; the done flag returned by the body stream reader when reading, or the close() method of the controller when creating a Request or Response, should be sufficient.

Applications that insist on making a distinction may compare the headers before and after the body stream is consumed, or just assume that every header listed in the Trailer header is a trailer. Aside from the implementation of trailers itself in systems that need to generate requests or responses (i.e. browser fetch or FetchEvent.respondWith), I'm not aware of consumer use cases where this distinction is necessary. That's also why I'm surprised by the above-quoted caching rule in RFC 9111.
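A minimal sketch of that idea, simulating the body stream with a hand-rolled reader so it runs anywhere (the trailer fields and values are invented for illustration; a real implementation would merge fields received on the wire):

```javascript
// Simulate "trailers are merged into the same mutable Headers object once the
// body reports done", using a plain reader instead of a real fetch body stream.
function simulateResponse(chunks, trailerFields) {
  const headers = new Headers({ Trailer: Object.keys(trailerFields).join(', ') });
  let i = 0;
  let merged = false;
  return {
    headers,
    read() {
      if (i < chunks.length) return { done: false, value: chunks[i++] };
      if (!merged) {
        merged = true;
        // Body exhausted: late fields become visible on the same Headers object.
        for (const [name, value] of Object.entries(trailerFields)) {
          headers.append(name, value);
        }
      }
      return { done: true, value: undefined };
    },
  };
}

const res = simulateResponse(['part1', 'part2'], { 'server-timing': 'total;dur=12' });
console.log(res.headers.has('server-timing')); // false: body not consumed yet
while (!res.read().done) {} // drain the body
console.log(res.headers.has('server-timing')); // true: trailer merged at done
```

The "compare before and after" consumer strategy then amounts to snapshotting the header names before draining and diffing afterwards.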

@annevk
Member Author

annevk commented Oct 24, 2023

No, combining trailers with headers goes against the latest HTTP specification.

@mmastrac

mmastrac commented Apr 30, 2024

Hey all, there's been some interest in supporting trailers in Deno for both the fetch side of the API, as well as the Deno.serve HTTP server. The feature is mostly useful for gRPC support in JS server runtimes. Any interest in picking this up?

I believe the Promise<...> field for {Request,Response}{Init,} is the most reasonable approach. We'd be happy to push this forward with others interested in it.

@jasnell

jasnell commented Apr 30, 2024

I'm definitely still very interested in moving things forward here. The approach outlined in the first half of https://docs.google.com/document/d/1P4MskkFd3HHFPGDr01O3wdFXukmX9i0jAb5uh9v9x8Q/edit#heading=h.tst1r01yr7a appears to make the most sense at the moment.

For sending trailers, being able to specify a trailers option with a Promise<HeadersInit> value... e.g.

async function getTrailers() {
  // await stuff
  return new Headers([['a','b']]);
}

const req = new Request('https://example.org', { /* ... */ trailers: getTrailers() });

// likewise for `Response`

For receiving trailers, something like...

await request.trailers;
await response.trailers;

@jhudsoncedaron

Got here with the following situation:

We have a long-running job on the server triggered by a POST request. The job runs for minutes and generates output bit by bit as it runs (that is, it reports its steps and progress indicators as it runs); the job not being my code, I can't change it to make error messages follow a regular pattern. As the job never reads input, and because websocket security was designed wrong from the beginning, requiring much more code at the server endpoint to make it secure, a websocket is not appropriate.

At the very end I can check the exit code of the process and report it back to the JavaScript caller; the natural implementation of this is a trailer. Searching for how to get trailers turned up an older version of the fetch() specification that had a trailers property, but it does not actually exist.

The discussion points:

  • Caching: caching of POST requests is undesirable anyway.
  • Trailers can't be read until the body is fully materialized: correct; the point where my JavaScript would look for them has already finished materializing the body (got the EOF indicator from read).
  • Promise: I fail to see how making trailers yet another promise does any good; it's just another awkward call at that point.

So I actually think the original spec got it right: trailers is just another property of the same type as headers, which returns undefined until the body is fully read, at which point it returns the collection.
