Hi Kevin,

Thanks for spending the time figuring this out - it is interesting what you did - I never would have imagined it was 'Brotli and a misconfigured server'. It never came through uncompressed here, even after a few attempts. I wonder what someone looking at the logs thought about so many loadings of a URL about securing sshd, lol. Brotli appears to be a Google invention - I wonder how it got into a misconfigured server :) If it ever comes up for a vote, I vote no to including Brotli in Dillo ;)

On Fri, 21 Jun 2024 09:52:06 +1000 "Kevin Koster" <dillo@ombertech.com> wrote:
pastebin-KK0ffGbhmjU@public.gmane.org wrote:
Hi
Someone please tell me what is wrong when I do
dillo https://ubuntu.com/tutorials/configure-ssh-2fa#2-installing-and-configuring-...
it seems to be encrypted or compressed?
It failed the first time for me, then trying a little later it loaded. In between I tried using "wget -S --spider", and the first time the headers showed the response was compressed with Brotli: "content-encoding: br"
On subsequent tries it returned uncompressed data to Wget and Dillo, which is correct since neither supports Brotli compression. It appears that the originating web server is misconfigured to use Brotli compression even when the client doesn't announce support for it.
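That negotiation rule is easy to state in code. The sketch below is a hypothetical helper (not anything from Dillo or Wget), simplified from RFC 9110's Accept-Encoding semantics - it compares coding names only and ignores q-values:

```python
def encoding_acceptable(accept_encoding, content_encoding):
    """Simplified RFC 9110 check: was this Content-Encoding acceptable
    given the request's Accept-Encoding header? Ignores q-values."""
    coding = content_encoding.strip().lower()
    if coding in ("", "identity"):
        return True   # uncompressed responses are always usable
    if accept_encoding is None:
        return True   # no Accept-Encoding header: any coding is allowed
    offered = {item.split(";")[0].strip().lower()
               for item in accept_encoding.split(",")}
    return coding in offered or "*" in offered

# The situation Dillo and Wget hit: client offered gzip only,
# but the server answered with Brotli anyway.
print(encoding_acceptable("gzip", "br"))      # False: server bug
print(encoding_acceptable("gzip, br", "br"))  # True: negotiated correctly
```

By this rule the responses below, with "content-encoding: br" sent to clients that never offered br, are the server's fault, not the browser's.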
I found another page on the Ubuntu site where the problem still happens for me:
----
$ wget --spider -S https://ubuntu.com/navigation
Spider mode enabled. Check if remote file exists.
--2024-06-21 09:41:41--  https://ubuntu.com/navigation
Resolving ubuntu.com... 185.125.190.20, 185.125.190.29, 185.125.190.21, ...
Connecting to ubuntu.com|185.125.190.20|:443... connected.
HTTP request sent, awaiting response...
  HTTP/1.1 200
  server: nginx/1.14.0 (Ubuntu)
  date: Thu, 20 Jun 2024 23:41:14 GMT
  content-type: text/html; charset=utf-8
  x-view-name: webapp.views.navigation_nojs
  x-clacks-overhead: GNU Terry Pratchett
  permissions-policy: interest-cohort=()
  cache-control: max-age=60, stale-while-revalidate=86400, stale-if-error=300
  x-frame-options: SAMEORIGIN
  x-content-type-options: NOSNIFF
  x-vcs-revision: 1718884808-1dd74d9
  x-request-id: c211f00fd2212b0b8771952d7fa04558
  strict-transport-security: max-age=15724800
  link: <https://assets.ubuntu.com>; rel=preconnect; crossorigin, <https://assets.ubuntu.com>; rel=preconnect, <https://res.cloudinary.com>; rel=preconnect
  content-encoding: br
  x-cache-status: STALE from content-cache-il3/0
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
----
After retrying a few times I got an uncompressed response:
----
$ wget --spider -S https://ubuntu.com/navigation
Spider mode enabled. Check if remote file exists.
--2024-06-21 09:46:12--  https://ubuntu.com/navigation
Resolving ubuntu.com... 185.125.190.21, 185.125.190.29, 185.125.190.20, ...
Connecting to ubuntu.com|185.125.190.21|:443... connected.
HTTP request sent, awaiting response...
  HTTP/1.1 200
  server: nginx/1.14.0 (Ubuntu)
  date: Thu, 20 Jun 2024 23:45:45 GMT
  content-type: text/html; charset=utf-8
  content-length: 202990
  x-view-name: webapp.views.navigation_nojs
  x-clacks-overhead: GNU Terry Pratchett
  permissions-policy: interest-cohort=()
  cache-control: max-age=60, stale-while-revalidate=86400, stale-if-error=300
  x-frame-options: SAMEORIGIN
  x-content-type-options: NOSNIFF
  x-vcs-revision: 1718884808-1dd74d9
  x-request-id: 67baf729cf4344e46b689b28c5efb56f
  strict-transport-security: max-age=15724800
  link: <https://assets.ubuntu.com>; rel=preconnect; crossorigin, <https://assets.ubuntu.com>; rel=preconnect, <https://res.cloudinary.com>; rel=preconnect
  x-cache-status: HIT from content-cache-il3/1
  accept-ranges: bytes
Length: 202990 (198K) [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
----
The difference appears to be whether the page is served from a cache server. On later requests the cache server does notice the browser's Accept-Encoding header and doesn't use Brotli compression.
While checking docs for the headers, I found this page has the same problem (again in both Dillo and Wget):
----
$ wget -S --spider --compression=gzip https://http.dev/compression
Spider mode enabled. Check if remote file exists.
--2024-06-21 09:30:10--  https://http.dev/compression
Resolving http.dev... 34.120.39.70
Connecting to http.dev|34.120.39.70|:443... connected.
HTTP request sent, awaiting response...
  HTTP/1.1 200 OK
  content-type: text/html; charset=utf-8
  strict-transport-security: max-age=63072000; includeSubDomains; preload
  pragma: public
  cache-control: public, max-age=86400
  x-content-type-options: nosniff
  x-frame-options: SAMEORIGIN
  x-ua-compatible: IE=Edge,chrome=1
  x-xss-protection: 1; mode=block
  vary: Accept-Encoding
  x-versionid: 8Bp0x4p9
  x-production: True
  link: <https://http.dev/css/app.min.css?v=8Bp0x4p9>; rel="preload"; as="style"
  x-request-id: fc33d19c-441a-4fab-ae1b-113e5e9028f5
  content-encoding: br
  x-cloud-trace-context: a7d7fda8062ff6c0a9a86187afcacecb;o=1
  Content-Length: 8989
  date: Thu, 20 Jun 2024 23:29:42 GMT
  server: Google Frontend
  via: 1.1 google, 1.1 google
  Alt-Svc: h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Length: 8989 (8.8K) [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
----

_______________________________________________
Dillo-dev mailing list -- dillo-dev@mailman3.com
To unsubscribe send an email to dillo-dev-leave@mailman3.com