otoh, the C and POSIX books were as valid as ever.
Once I got network programming figured out, I wrote my own library of functions that set everything up the way I want it (including SSH and
TLS). Sure, you have to watch your step sometimes, but it's not quite
as bad as people make it out to be.
On Tue, 1 Oct 2024 00:25 +0100 (BST), John Dallman wrote:
In article <vdf71h$2cn51$17@dont-email.me>, ldo@nz.invalid
(Lawrence D'Oliveiro) wrote:
It's not being ignored. The Linux kernel added the option to
build a 32-bit kernel with time_t having 64 bits
The same has happened for Windows and Apple's operating systems.
A lot of the work for 2038 is already done.
They're not supporting 32-bit code any more. Linux is.
In article <vdf71h$2cn51$17@dont-email.me>, ldo@nz.invalid (Lawrence D'Oliveiro) wrote:
On Mon, 30 Sep 2024 19:40:06 -0000 (UTC), candycanearter07 wrote:
For instance, the 2038 problem I'm betting will be ignored until
2035.
It's not being ignored. The Linux kernel added the option to build
a 32-bit kernel with time_t having 64 bits
The same has happened for Windows and Apple's operating systems. A lot of
the work for 2038 is already done.
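For what it's worth, the user-space side of that kernel work is already visible in glibc: from glibc 2.34 on, a 32-bit build can opt into a 64-bit time_t at compile time. A minimal check, as a sketch (the -m32 runs assume 32-bit libraries are installed):

/* time_width.c
 * On a 32-bit glibc 2.34+ build, defining _TIME_BITS=64 (which also
 * requires _FILE_OFFSET_BITS=64) switches time_t to 64 bits:
 *
 *   gcc -m32 time_width.c                                          -> 4 bytes
 *   gcc -m32 -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 time_width.c   -> 8 bytes
 */
#include <stdio.h>
#include <time.h>

int main(void)
{
    printf("sizeof(time_t) = %zu bytes (%zu bits)\n",
           sizeof(time_t), sizeof(time_t) * 8);
    return 0;
}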
On Mon, 30 Sep 2024 19:40:06 -0000 (UTC), candycanearter07 wrote:
For instance, the 2038 problem I'm betting will be ignored until
2035.
It's not being ignored. The Linux kernel added the option to build
a 32-bit kernel with time_t having 64 bits
In article <vdfdcj$2dsh2$7@dont-email.me>, ldo@nz.invalid (Lawrence D'Oliveiro) wrote:
On Tue, 1 Oct 2024 00:25 +0100 (BST), John Dallman wrote:
In article <vdf71h$2cn51$17@dont-email.me>, ldo@nz.invalid
(Lawrence D'Oliveiro) wrote:
It's not being ignored. The Linux kernel added the option to
build a 32-bit kernel with time_t having 64 bits
The same has happened for Windows and Apple's operating systems.
A lot of the work for 2038 is already done.
They're not supporting 32-bit code any more. Linux is.
Apple only run 64-bit code on recent OSes, yes, but Windows 11 still runs 32-bit applications, even though the OS is only available in 64-bit form. There's no sign of Windows dropping 32-bit applications, and I check regularly: I'd like to stop supporting them.
On Tue, 1 Oct 2024 00:56 +0100 (BST), John Dallman wrote:
Apple only run 64-bit code on recent OSes, yes, but Windows 11
still runs 32-bit applications, even though the OS is only
available in 64-bit form.
But those 32-bit Windows apps are not being rebuilt for 64-bit
time_t. The option isn't there.
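For anyone who wants to see what the 2038 cliff actually looks like, here is a small sketch: it prints the last second a signed 32-bit time_t can hold, and where a wrapped counter lands one second later.

/* y2038.c - illustrate the 32-bit time_t rollover */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Render a seconds-since-epoch value as a UTC date string. */
static void show(int64_t secs)
{
    time_t t = (time_t)secs;
    struct tm *utc = gmtime(&t);
    if (utc == NULL) {
        printf("%lld -> (not representable on this build)\n", (long long)secs);
        return;
    }
    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
    printf("%lld -> %s\n", (long long)secs, buf);
}

int main(void)
{
    show(INT32_MAX);   /* 2038-01-19 03:14:07 UTC, the last good second */
    show(INT32_MIN);   /* where a wrapped 32-bit counter ends up: late 1901 */
    return 0;
}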
I think 32-bit is (already) kinda obsolete whether
anybody wants to admit it or not.
Didn't take long, did it ?
LONG back some guy at a computer store (remember
those ?) asked me why anybody would WANT 16-bit
chips/vars. This was in the c64/Atari-800 days.
I told him "graphics !" - and was right.
NOW I wonder if 128-bit should be the aim of
all future standards. It'd be HARD to use up
128 bits for almost anything.
On 2024-10-01, 186282@ud0s4.net <186283@ud0s4.net> wrote:
I think 32-bit is (already) kinda obsolete whether
anybody wants to admit it or not.
Didn't take long, did it ?
LONG back some guy at a computer store (remember
those ?) asked me why anybody would WANT 16-bit
chips/vars. This was in the c64/Atari-800 days.
I told him "graphics !" - and was right.
NOW I wonder if 128-bit should be the aim of
all future standards. It'd be HARD to use up
128 bits for almost anything.
In other words, "128 bits ought to be enough for anybody." :-)
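To put rough numbers on "hard to use up", a back-of-the-envelope sketch of how long 64-bit and 128-bit counters last when counting seconds and nanoseconds (illustrative arithmetic only):

/* bits_vs_time.c - how long do 64- and 128-bit counters last? */
#include <stdio.h>

int main(void)
{
    const double year  = 365.25 * 24 * 3600;          /* seconds per year */
    const double max64  = 9.223372036854776e18;       /* 2^63 - 1 */
    const double max128 = 1.7014118346046923e38;      /* 2^127 - 1 */

    printf("64-bit seconds:      %.3g years\n", max64 / year);          /* ~2.9e11 */
    printf("64-bit nanoseconds:  %.3g years\n", max64 / (year * 1e9));  /* ~292    */
    printf("128-bit nanoseconds: %.3g years\n", max128 / (year * 1e9)); /* ~5.4e21 */
    return 0;
}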
It's a really BIG number ...
But there SHOULD be a few 256/512-bit types in ye
olde library :-)
Now CPUs ... maybe 128-bit IS what future-lookers
need to immediately switch to. Haven't heard many
complaints about 64-bit chips, yet, but doesn't
hurt to plan ahead. Circuitry can be made SO small
now that the extra stuff for 128 all through may
not be such a burden.
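On the library side, 128-bit integers are already there as a compiler extension: GCC and Clang provide __int128 on 64-bit targets (it is not a standard C type). A small sketch:

/* wide.c - 128-bit arithmetic via the GCC/Clang __int128 extension */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Multiply two full 64-bit values without overflow by widening to 128 bits. */
    uint64_t a = 0xFFFFFFFFFFFFFFFFull;            /* 2^64 - 1 */
    uint64_t b = 0xFFFFFFFFFFFFFFFFull;
    unsigned __int128 p = (unsigned __int128)a * b;

    /* printf has no conversion specifier for __int128, so print two halves. */
    printf("high 64 bits: %016llx\n", (unsigned long long)(p >> 64));
    printf("low  64 bits: %016llx\n", (unsigned long long)(uint64_t)p);
    return 0;
}

256- and 512-bit types would still have to come from a multi-precision library such as GMP, or be built by hand out of 64-bit limbs.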
OR ... are 'CPUs' even the bulk of The Future ? Somehow I see "AI" -
implemented on large distributed systems of diverse composition,
likely even some 'quantum' thrown in - being the coming thing. They
can emulate old CPUs.
My libraries call OpenSSL, but let me establish a connection (with or
without TLS) with a single call.
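Something in that spirit, reduced to a sketch: a single helper that returns a connected channel, plain TCP or TLS, via OpenSSL's BIO API. The function name and interface here are made up for illustration, not taken from the library described above.

/* net_connect.c - one call, one connection (plain or TLS).
 * Build: gcc net_connect.c -lssl -lcrypto   (OpenSSL 1.1.0 or later assumed)
 */
#include <stdio.h>
#include <openssl/bio.h>
#include <openssl/ssl.h>
#include <openssl/err.h>

/* Connect to host:port; wrap the socket in TLS when use_tls is nonzero.
 * Returns a connected BIO, or NULL on failure. */
static BIO *net_connect(const char *host, const char *port, int use_tls)
{
    char target[256];
    snprintf(target, sizeof target, "%s:%s", host, port);

    if (!use_tls)
        return BIO_new_connect(target);            /* plain TCP */

    SSL_CTX *ctx = SSL_CTX_new(TLS_client_method());
    if (ctx == NULL)
        return NULL;
    SSL_CTX_set_default_verify_paths(ctx);         /* system trust store */
    SSL_CTX_set_verify(ctx, SSL_VERIFY_PEER, NULL);

    BIO *bio = BIO_new_ssl_connect(ctx);           /* TCP + TLS in one chain */
    if (bio == NULL) {
        SSL_CTX_free(ctx);
        return NULL;
    }
    BIO_set_conn_hostname(bio, target);

    SSL *ssl = NULL;
    BIO_get_ssl(bio, &ssl);
    SSL_set_tlsext_host_name(ssl, host);           /* SNI */
    SSL_set1_host(ssl, host);                      /* hostname verification */

    if (BIO_do_connect(bio) != 1) {                /* connect + handshake */
        ERR_print_errors_fp(stderr);
        BIO_free_all(bio);
        SSL_CTX_free(ctx);
        return NULL;
    }
    return bio;    /* ctx leaks here; a real library would keep track of it */
}

int main(void)
{
    BIO *bio = net_connect("example.com", "443", 1);
    if (bio == NULL)
        return 1;
    BIO_puts(bio, "HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n");
    char buf[512];
    int n = BIO_read(bio, buf, (int)sizeof buf - 1);
    if (n > 0) {
        buf[n] = '\0';
        printf("%s", buf);
    }
    BIO_free_all(bio);
    return 0;
}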