nanog mailing list archives

Re: Proving Gig Speed


From: Tei <oscar.vives () gmail com>
Date: Thu, 19 Jul 2018 17:08:26 +0200

On 19 July 2018 at 07:06, Mark Tinka <mark.tinka () seacom mu> wrote:


On 18/Jul/18 17:20, Julien Goodwin wrote:

Living in Australia this is an every day experience, especially for
content served out of Europe (or for that matter, Africa).

TCP & below are rarely the biggest problem these days (at least with
TCP-BBR & friends), far too often applications, web services etc. are
simply never tested in an environment with any significant latency.

While some issues may exist for static content loading for which a CDN
can be helpful, that's not helpful for application traffic.

Yip.

Mark.

Sorry about that.

I feel bad as a webmaster.  Most of us on the web are creating
websites that are not documents to be downloaded and viewed, but
applications that need many small parts to be fetched and executed
together in order to work.

Most VRML examples from 1997 are unavailable because hosts moved,
directories changed names, and whole websites were redone with new
technologies. Only about 1% of that still exists in a readable format.
But the current web is much more delicate, and will break more, and
sooner, than that.

Perhaps something can be done about it.  Chrome already includes an
option to test websites while emulating "Slow 3G", which webmasters
can and should use.

I suggest an HTTP header or HTML meta tag with which a document
disables external JS scripts, or limits them to a whitelist of hosts.

 <meta http-equiv="script-whitelist" content="None">
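
A whitelist form of the same tag could list the allowed script hosts
instead. The header name and the example.com hosts below are only a
sketch of the idea, not an existing standard:

 <meta http-equiv="script-whitelist" content="https://static.example.com https://cdn.example.org">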

So if you are a Vodafone customer reading a political document,
Vodafone can inject a JavaScript script into the page, but it will not
run because of the presence of <meta http-equiv="script-whitelist"
content="None">.  Vodafone can still go further and alter the HTML of
the page to remove this meta tag and inject their script.
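
For comparison, the existing Content-Security-Policy mechanism can
already express roughly this policy today (the host below is a
placeholder), though it has the same limitation against a party that
can rewrite the HTML itself:

 <meta http-equiv="Content-Security-Policy" content="script-src 'none'">
 <meta http-equiv="Content-Security-Policy" content="script-src https://static.example.com">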

Get webmasters into the idea of making websites that are documents,
that require no execution of scripts, so they will still work in 2048,
and will work in poor network conditions where a website that loads 47
different JS files may break.

tl;dr: the web is evolving into a network of applications instead of
documents.  Documents can't "break" easily; programs may break
completely from even tiny changes. Maybe getting webmasters on board
with biasing in favor of documents could do us all a favour.

-- 
End of the message.

