
I'm trying to send data in chunked mode. All headers are set properly and the data is encoded accordingly. Browsers recognize my response as chunked, accept the headers, and start receiving data.

I was expecting the browser to update the page as each chunk arrived; instead it waits until all chunks are received and then displays them all at once. Is this the expected behavior?

I was expecting to see each chunk displayed right after it was received. With curl, each chunk is shown as soon as it arrives. Why doesn't the same happen with GUI browsers? Are they using some sort of buffering/cache?

I set the Cache-Control header to no-cache, so I'm not sure it's a caching issue.
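For reference, the sending side looks roughly like this (a minimal sketch in Python; the helper names `frame_chunk` and `chunked_response` are my own, not from any library):

```python
def frame_chunk(data: bytes) -> bytes:
    """Frame one piece of data per the chunked transfer coding:
    size in hex, CRLF, payload, CRLF (RFC 7230, section 4.1)."""
    return b"%X\r\n%s\r\n" % (len(data), data)

def chunked_response(chunks) -> bytes:
    """Build a complete chunked HTTP/1.1 response, ending with the
    zero-length terminator chunk."""
    head = (b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: text/html\r\n"
            b"Transfer-Encoding: chunked\r\n"
            b"Cache-Control: no-cache\r\n\r\n")
    body = b"".join(frame_chunk(c) for c in chunks)
    return head + body + b"0\r\n\r\n"
```

In the real handler each chunk is written to the socket as it is produced, rather than joined up front.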

  • Which browsers are you looking in? Generally browsers will do incremental rendering, but they can internally buffer things up for a bit because relayouts are expensive... Commented Nov 26, 2012 at 2:57
  • What type of data are you sending in the chunks? Is it just HTML or are you sending script data?
    – qqx
    Commented Nov 26, 2012 at 6:56
  • I'm sending text/html. Tried in Firefox and Chrome. Both wait for all chunks to be received.
    – Dani El
    Commented Nov 26, 2012 at 8:40
  • See also (the newer) stackoverflow.com/q/16909227/179081 Commented Feb 21, 2018 at 1:37

3 Answers

22

AFAIK browsers need some initial payload before they start rendering chunks as they are received.
curl is of course an exception.

Try to send about 1KB of arbitrary data before your first chunk.

If you are doing everything correctly, browsers should render chunks as they are received.
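A sketch of the workaround (Python; the 1 KiB figure comes from the discussion here, and wrapping the filler in an HTML comment so it doesn't show up on the page is my own choice):

```python
def padding_chunk(size: int = 1024) -> bytes:
    """Build an initial chunk of `size` bytes of throwaway data inside
    an HTML comment, to nudge browsers into incremental rendering."""
    # 5 bytes of "<!-- " + filler spaces + 4 bytes of " -->" == size bytes
    filler = b"<!-- " + b" " * (size - 9) + b" -->"
    return b"%X\r\n%s\r\n" % (len(filler), filler)
```

Send this as the very first chunk, before any real content.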

  • Yay!!! That was it! Works perfectly in Firefox, Chrome, Safari, even Opera! Thank you a lot.
    – Dani El
    Commented Nov 26, 2012 at 9:27
  • 1
    1KiB is indeed a good general value, for more details look here: stackoverflow.com/q/16909227/1534459
    – bodo
    Commented May 26, 2015 at 8:13
  • 1
    AFAIK browsers only gather the mentioned 1KB of data if they didn't receive a Content-Type header. They need the data then to make an educated guess about what they are about to receive. Besides, anti-virus software may also be causing this problem, as I described here: stackoverflow.com/a/41760573/1004651
    – Matthias
    Commented Jan 20, 2017 at 10:11
11

Fix your headers.


  1. As of 2019, if you use Content-type: text/html, no buffering occurs in Chrome.

  2. If you just want to stream text, similar to text/plain, then using Content-type: text/event-stream will also disable buffering.

  3. If you use Content-type: text/plain, then Chrome will still buffer 1 KiB, unless you additionally specify X-Content-Type-Options: nosniff.

RFC 2045 specifies that if no Content-Type is specified, Content-type: text/plain; charset=us-ascii should be assumed:

5.2. Content-Type Defaults

Default RFC 822 messages without a MIME Content-Type header are taken by this protocol to be plain text in the US-ASCII character set, which can be explicitly specified as:

Content-type: text/plain; charset=us-ascii

This default is assumed if no Content-Type header field is specified. It is also recommend that this default be assumed when a syntactically invalid Content-Type header field is encountered. In the presence of a MIME-Version header field and the absence of any Content-Type header field, a receiving User Agent can also assume that plain US-ASCII text was the sender's intent. Plain US-ASCII text may still be assumed in the absence of a MIME-Version or the presence of an syntactically invalid Content-Type header field, but the sender's intent might have been otherwise.

Browsers will buffer a certain amount of text/plain content in order to check whether what was sent is really plain text or some other media type such as an image, since an omitted Content-Type defaults to text/plain. This is called MIME type sniffing.

MIME type sniffing is defined by Mozilla as:

In the absence of a MIME type, or in certain cases where browsers believe they are incorrect, browsers may perform MIME sniffing — guessing the correct MIME type by looking at the bytes of the resource.

Each browser performs MIME sniffing differently and under different circumstances. (For example, Safari will look at the file extension in the URL if the sent MIME type is unsuitable.) There are security concerns as some MIME types represent executable content. Servers can prevent MIME sniffing by sending the X-Content-Type-Options header.

According to Mozilla's documentation:

The X-Content-Type-Options response HTTP header is a marker used by the server to indicate that the MIME types advertised in the Content-Type headers should not be changed and be followed. This allows to opt-out of MIME type sniffing, or, in other words, it is a way to say that the webmasters knew what they were doing.

Therefore adding X-Content-Type-Options: nosniff makes it work.
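The three cases above can be summarized in code (a sketch; `streaming_headers` is a hypothetical helper, and the buffering behaviour it encodes is Chrome's as described above):

```python
def streaming_headers(content_type: str) -> list[tuple[str, str]]:
    """Headers that should disable response buffering in Chrome for a
    chunked/streamed body. text/html and text/event-stream stream as-is;
    text/plain is MIME-sniffed (~1 KiB buffered) unless nosniff is set."""
    headers = [("Content-Type", content_type)]
    if content_type.startswith("text/plain"):
        # Tell the browser to trust the declared type instead of sniffing.
        headers.append(("X-Content-Type-Options", "nosniff"))
    return headers
```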

  • For me, the charset=xxxx part was the key. With just Content-type: text/plain (in Firefox 60.0.9esr) the output was buffered and only displayed all at once after all the data was received. When changed to Content-type: text/plain; charset=us-ascii (or Content-type: text/html; charset=utf8) the chunked progressive rendering suddenly worked as expected. Commented Sep 23, 2019 at 1:09
  • 2
    @MatijaNalis, that should be Content-type: text/html; charset=utf-8 (or UTF-8 if case matters)
    – Tesseract
    Commented Apr 1, 2020 at 4:01
1

The browser can process and render the data as it comes in, whether the data is sent chunked or not. Whether a browser renders response data as it arrives is a function of the content's structure and the buffering the browser employs. E.g. before the browser can render an image, it needs the document (or enough of the document), the style sheet, etc.

Chunking is mostly useful when the length of a resource is unknown at the time the resource response is generated (a "Content-Length" can't be included in the response headers) and the server doesn't want to close the connection after the resource is transferred.
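That decision can be sketched as follows (Python; `start_headers` is a hypothetical helper, not a real API, and a generator stands in for "length unknown at response time"):

```python
def start_headers(body) -> list[tuple[str, str]]:
    """Pick framing headers: Content-Length when the full body is in
    hand, Transfer-Encoding: chunked when it is produced on the fly
    and the server still wants to keep the connection open."""
    if isinstance(body, (bytes, bytearray)):
        return [("Content-Length", str(len(body)))]
    return [("Transfer-Encoding", "chunked")]
```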
