On 2025-08-13, Paul Rubin wrote:
> John Ames <commodorejohn@gmail.com> writes:
>> No, I think I'll stick with active scorn and spite towards web
>> designers who can't be bothered to do their job properly. The
>> attitude that it should be considered acceptable for web designers
>> to dictate people's choice of browser was contemptible in the
>> '90s-'00s and it's contemptible now.
> There's a web standard (HTML5)
It's been WHATWG's "HTML Living Standard" for a while now, to
the point that https://www.w3.org/TR/html/ (to which
https://www.w3.org/TR/html53/ and such redirect) now redirects
to https://html.spec.whatwg.org/multipage/ .
Consider:
$ lynx --dump --nolist -- html.spec.whatwg.org/index.html
...
For a number of years, both groups then worked together. In 2011,
however, the groups came to the conclusion that they had different
goals: the W3C wanted to publish a "finished" version of "HTML5", while
the WHATWG wanted to continue working on a Living Standard for HTML,
continuously maintaining the specification rather than freezing it in a
state with known problems, and adding new features as needed to evolve
the platform.
In 2019, the WHATWG and W3C signed an agreement to collaborate on a
single version of HTML going forward: this document.
...
The old "HTML5" specifications can be found on Wayback Machine:
http://web.archive.org/web/20181229172047/https://www.w3.org/TR/html53/ (WD)
http://web.archive.org/web/20181230024229/https://www.w3.org/TR/html52/ http://web.archive.org/web/20190102083213/https://www.w3.org/TR/html51/ http://web.archive.org/web/20181231135142/http://www.w3.org/TR/html5/
and it includes all those features that you (and I) dislike.
> We could agree that it's a BAD standard. Some people feel the
> same about ANS Forth.
I'd be cautious comparing standards for data formats, such as
HTML, with standards for programming languages. The latter tend
to document some "common subset" that different implementations
have all agreed upon supporting.
The HTML standard, OTOH, /must/ document such things as the
<marquee /> element and the bgcolor= attribute - simply because
those are used on the web at large. (Whether because the HTML
document in question has never been updated, or because its
author has never learned any newer standards.)
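For illustration, here's a made-up fragment of the sort of legacy
markup in question - discouraged for new documents, yet still
described by the standard so that browsers keep rendering it
consistently:

  <body bgcolor="#000080" text="#ffffff">
    <marquee>Welcome to my home page!</marquee>
  </body>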
> But it's there, and the big browsers implement it, and web
> developers for the most part follow it.
The problem is not with the HTML standard itself, but rather
that by using the JavaScript programming language (the support
for which, AFAICT, the standard does /not/ require), it's
possible for the site operator to deliver a /web application/
in place of a /web page/ - which is inconvenient at best, and
can be seen as outright abuse of "modern web standards" at worst.
Consider that, e. g., MS-DOS didn't (AIUI) provide an adequate
text file viewer until EDIT.COM in version 5.0. Thus software
tended to include one in the distribution - so as to allow
viewing its README.TXT file, if there was one. (See README.EXE
in [1] and [2] for example.)
[1] http://www.classicdosgames.com/files/games/capstone/14zorro.zip
[2] http://www.classicdosgames.com/files/games/virgin/cc1demo.zip
Some distributions, however, /embedded/ the text file in said
viewer, thus /only/ allowing it to be viewed through that
program - which, too, is inconvenient at best; and is not
dissimilar to what /some/ "modern websites" do.
And let me quote section 4.12 "Scripting" of the standard
itself (emphasis mine):
HTML> Scripts allow authors to add interactivity to their documents.
HTML> Authors are encouraged to use declarative alternatives to
HTML> scripting where possible, as declarative mechanisms are often
HTML> more maintainable, *and many users disable scripting.*
HTML> For example, instead of using a script to show or hide a section
HTML> to show more details, the details element could be used.
HTML> Authors are also encouraged to make their applications degrade
HTML> gracefully in the absence of scripting support.
HTML> For example, if an author provides a link in a table header to
HTML> dynamically resort the table, the link could also be made to function
HTML> without scripts by requesting the sorted table from the server.
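For instance, a minimal sketch of the first suggestion - the
show/hide case handled by the details element alone, with no
script involved:

  <details>
    <summary>More details</summary>
    <p>Collapsed by default, expanded by the browser itself;
       it keeps working with scripting disabled.</p>
  </details>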
> I write C++ code sometimes. C++11 introduced a lot of new features
> that weren't in earlier versions. They were refined further in C++14
> and later. Am I irresponsible or not doing my job if I use those
> features, instead of writing C++98 code in 2025?
The short answer is: maybe.
Now, I don't recall writing anything of substance in C++ since
c. 2002, so I can't really comment on C++ standards. I do have,
however, certain rules regarding what features of the underlying
platform to use in my programs.
Most often, I write my software for the users of Debian and its
derivatives. The requirement for new software entering Debian
is that it works in "testing" / "unstable," so if the language
implementation it needs hasn't been updated /in Debian/ for a
decade, then a decade-old language is what you need to target.
(Or you can negotiate adopting the implementation's package and
updating it to a newer upstream version, if there's one.)
That said, I rarely care for my software to become /part/ of
Debian proper, merely for it to be usable by Debian users - who
typically run its "stable" branch; some, however, may delay that
upgrade for months, so it's more practical to target "oldstable"
instead.
With Debian's release cycle of two years, that introduces a lag
of up to four years between the time a standard is adopted by an
implementation in Debian and the time I can rely on it: up to
two years for the change to reach "stable," and up to two more
before that release becomes "oldstable." So, for example, I'd
be wary of relying on C++14 before 2018.
Were I to write software to be included in Debian, I'd try to
make it also run on "stable" and not just "testing." Even
though no new software is admitted into "stable," there's a
separate "backports" repository for software from "testing"
rebuilt to work on "stable."
With regard to JavaScript, I tend to stick to ES 5.1 [3] from
2011. On the one hand, it has a bunch of independent
implementations: http://duktape.org/ , http://mujs.com/ , and
likely QuickJS as well. On the other, I find said version much
more manageable than the newest ones. (My copy of [3] is
1 416 907 bytes, and of [4], 6 598 417 bytes.)
[3] http://262.ecma-international.org/5.1/
[4] http://262.ecma-international.org/11.0/
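As a contrived illustration of what staying within ES 5.1 looks
like: only var, function expressions and Object.keys below - no
let, no arrow functions, no template literals. (The print()
global is assumed here; it's provided by the duktape and mujs
command-line shells, not by the language standard itself.)

  // Count word occurrences using ES 5.1 constructs only.
  var counts = {};
  "the quick brown fox the lazy dog the end".split(" ")
      .forEach(function (word) {
          counts[word] = (counts[word] || 0) + 1;
      });
  Object.keys(counts).forEach(function (word) {
      print(word + ": " + counts[word]);
  });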
I believe that aside from certain special cases, it /does/ make
sense to test one's software on 20-year-old hardware - though
it's not something I practice often myself, alas. Aside from
possibly helping with the global e-waste disposal problem, such
testing might reveal issues simply not noticeable on newer,
faster hardware.
Consider, e. g., the http://t3x.org/t3x/ compilers - those, it
is my understanding, are tested to work on hardware running CP/M.
Overall, a degree of "minimalism" in software development is
advisable. I believe "The Philosophy of Forth" in [5] argues
much in favor of it, and there's also [6] for a more recent
perspective.
[5] http://forth.com/wp-content/uploads/2018/11/thinking-forth-color.pdf
[6] http://spectrum.ieee.org/lean-software-development
Ultimately, though, the question is: what are you trying to
achieve, and for /whom/?
HTH.