Elon Musk, the super dumb-fuck retard with a face that resembles
a horse's ass, wants to rewrite the entire software base of the
Social Security Administration:
https://www.wired.com/story/doge-rebuild-social-security-administration-cobol-benefits/
Could this be a magnificent opportunity for GNU/Linux and FOSS?
Naw. That ugly dumb-fuck wants to use Java.
What? No Rust? I already hear the beatniks moaning.
Heck. I would have recommended Python. It would cut down the
programming time, and programmers, by 10,000 per cent.
Who cares if it takes 20 years for the cost-of-living increase
to process.
Ha, ha, ha, ha, ha, ha, ha, ha, ha, ha, ha!
As for the SSA (and IRS) ... reports are that they're STILL using
1960s computers for certain operations and thus really can't
integrate anything.
On Sun, 30 Mar 2025 03:42:18 -0400, c186282 wrote:
As for the SSA (and IRS) ... reports are that they're STILL using
1960s computers for certain operations and thus really can't
integrate anything.
https://arstechnica.com/tech-policy/2025/03/what-could-possibly-go-wrong-doge-to-rapidly-rebuild-social-security-codebase/
I've never dealt with SSA other than as a consumer. However I have had to deal with the DOI's IMARS (Incident Management Analysis and Reporting
System) that was deployed around 2012. Great fun!
Then there is the FBI's NIBRS (National Incident Based Reporting System)
that is meant to replace the UCR (Uniform Crime Reports) system that went back to the 1920s. The format for submitting data is a moving target and crime in the US is under-reported since many agencies can't manage to
submit their data successfully.
Based on those two, I give the chance of updating the SSA system in less
than 10 years a very low probability. The US government's attention span
is more like 10 days than 10 years so good luck. Unlike the other two CFs
I have a personal interest in continuing to receive my checks every month.
On 2025-03-30, rbowman <bowman@montana.com> wrote:
On Sun, 30 Mar 2025 03:42:18 -0400, c186282 wrote:
As for the SSA (and IRS) ... reports are that they're STILL using
1960s computers for certain operations and thus really can't
integrate anything.
https://arstechnica.com/tech-policy/2025/03/what-could-possibly-go-wrong-
doge-to-rapidly-rebuild-social-security-codebase/
I've never dealt with SSA other than as a consumer. However I have had to
deal with the DOI's IMARS (Incident Management Analysis and Reporting
System) that was deployed around 2012. Great fun!
Then there is the FBI's NIBRS (National Incident Based Reporting System)
that is meant to replace the UCR (Uniform Crime Reports) system that went
back to the 1920s. The format for submitting data is a moving target and
crime in the US is under-reported since many agencies can't manage to
submit their data successfully.
Based on those two, I give the chance of updating the SSA system in less
than 10 years a very low probability. The US government's attention span
is more like 10 days than 10 years so good luck. Unlike the other two CFs
I have a personal interest in continuing to receive my checks every month.
Financial institutions are still running COBOL as well as other legacy
systems like TPF.
They also run Linux on mainframes in logical partitions.
More recent is the adoption of the MongoDB platform, which also runs on
'old iron' mainframes.
The once-dead mainframe has made a huge comeback, thanks to virtualization
and to the reliability of being N+1 at a minimum.
However the big market is in storage devices and programming to support AI.
The more things change, the more they seem to stay the same.
On 4/1/25 11:08 AM, -hh wrote:
On 4/1/25 10:40, Robert Heller wrote:
It should be noted that GnuCOBOL actually translates COBOL to C, and
then compiles the C code with GnuC. In *theory* one could just run the
whole code base through GnuCOBOL and create a C code base, but good
luck making much sense of the generated C code...
I had to listen to a PMP rant all weekend about what a profoundly
stupid plan this is from DOGE,
Musk's "plan" isn't bad ... per-se. As noted in a
variety of news, some of those important federal
agencies are STILL using 60s hardware & programming.
The TRICK is getting any newer stuff RIGHT before it
goes mainline. That old COBOL code was GREAT, never
diss those narrow-tie Dilberts from the day.
BUT ... as those COBOLs were kinda tweaked for the
particular old boxes, no modern translation tool is
gonna work worth a damn. It'll have to be HAND DONE,
line by line, and done RIGHT. Forget Gen-Z/A2 and AI.
As I said to be hated - PYTHON ! :-)
So, to the complexity of handling old code, you add the complexities of translating to another language.
Keep it simple: use today's COBOL. Less translation effort. Fewer
errors.
On Wed, 2 Apr 2025 22:46:00 +0200, Carlos E.R. wrote:
So, to the complexity of handling old code, you add the complexities of
translating to another language.
Keep it simple: use today's COBOL. Less translation effort. Fewer
errors.
The problem may be finding competent COBOL programmers. That leads me to another question. Presumably the SSA and other government agencies
currently employ COBOL programmers. What have they been doing the last
fifty years?
A more important question might be what have their managers been doing?
There is no question a re-write would be very painful and expensive. For a government agency there isn't a market force to improve so what would
trigger them doing anything to rock the boat?
On Wed, 02 Apr 2025 18:36:02 -0400, rbowman <bowman@montana.com> wrote:
On Wed, 2 Apr 2025 22:46:00 +0200, Carlos E.R. wrote:
So, to the complexity of handling old code, you add the complexities of
translating to another language.
Keep it simple: use today's COBOL. Less translation effort. Fewer
errors.
The problem may be finding competent COBOL programmers. That leads me to
another question. Presumably the SSA and other government agencies
currently employ COBOL programmers. What have they been doing the last
fifty years?
A more important question might be what have their managers been doing?
There is no question a re-write would be very painful and expensive. For a
government agency there isn't a market force to improve so what would
trigger them doing anything to rock the boat?
There is a lot more to those systems than just the COBOL programs.
The data is in EBCDIC, not ASCII. Sequential files can be fixed or
variable length. For fixed, the length of each record is specified in
the file declaration, while for variable, the length is in the first
bytes of each record. There is no character or combination of
characters used to represent the end of a record.
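As a rough sketch of what "the length is in the first bytes" looks
like in practice, here is a minimal Python reader assuming the common
IBM RDW convention (a 4-byte prefix: a 2-byte big-endian length that
counts the prefix itself, then 2 reserved bytes). The file name and
layout are illustrative, not SSA's actual format.

  import struct

  # Assumes the common IBM RDW layout: a 2-byte big-endian length
  # (which includes the 4-byte prefix itself) plus 2 reserved bytes.
  def read_rdw_records(path):
      """Yield the payload of each RDW-prefixed variable-length record."""
      with open(path, "rb") as f:
          while True:
              rdw = f.read(4)
              if len(rdw) < 4:          # clean end of file
                  break
              (length,) = struct.unpack(">H", rdw[:2])
              yield f.read(length - 4)  # length counts the prefix too

  # Hypothetical usage; "master.dat" and code page 37 are assumptions.
  for record in read_rdw_records("master.dat"):
      print(record.decode("cp037"))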
There can be ISAM and VSAM (indexed direct access) files, IMS
(hierarchical) databases, and data communication with MFS for the
3270-style screen handling, as well as DB2 databases, all used by
those COBOL programs.
The COBOL programs may have ASM subroutines for the parts that had to
be highly optimized.
The systems are constantly under maintenance due to changes imposed by
government regulations or due to interfaces with other systems such as
PC or cloud usage.
Regards, Dave Hodgins
On Wed, 2 Apr 2025 22:46:00 +0200, Carlos E.R. wrote:
So, to the complexity of handling old code, you add the complexities of
translating to another language.
Keep it simple: use today's COBOL. Less translation effort. Fewer
errors.
The problem may be finding competent COBOL programmers. That leads me to another question. Presumably the SSA and other government agencies
currently employ COBOL programmers. What have they been doing the last
fifty years?
A more important question might be what have their managers been doing?
There is no question a re-write would be very painful and expensive. For a government agency there isn't a market force to improve so what would
trigger them doing anything to rock the boat?
You're seeing the True Picture - It's *not* "easy" to
re-write at all.
And if you get it wrong there's all hell to pay.
Which is why bcrats are hyper-conservative in these
regards. They have good jobs/pensions to protect.
IMHO, any re-write will HAVE to involve a 'parallel system'
for awhile ... give it the same data, the same tasks, and
eval if it's always doing the same thing as the old stuff.
THEN, in a few years, quietly switch.
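A toy sketch of that parallel-run idea in Python; legacy_calc and
new_calc are hypothetical stand-ins for the old and rewritten payment
pipelines, not real SSA interfaces.

  def parallel_run(records, legacy_calc, new_calc, log=print):
      """Replay the same inputs through both pipelines, count divergences."""
      mismatches = 0
      for rec in records:
          old, new = legacy_calc(rec), new_calc(rec)
          if old != new:
              mismatches += 1
              log(f"divergence for {rec!r}: legacy={old!r} new={new!r}")
      return mismatches

Only after a long stretch of zero mismatches on live data would the new
system quietly be promoted, as described above.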
On Thu, 03 Apr 2025 02:00:30 -0400, David W. Hodgins wrote:
The data is in EBCDIC, not ASCII. Sequential files can be fixed or
variable length. For fixed, the length of each record is specified in
the file declaration, while for variable, the length is in the first
bytes of each record. There is no character or combination of characters
used to represent the end of a record.
I don't see any of that to be a problem. Until quite recently a mid-
western US state's criminal justice system used EBCDIC and the IBM
protocol where the record lengths are specified in a known-size header.
When I wrote the interface it converted from ASCII to EBCDIC and vice
versa on the fly. Lookup tables were necessary. I would hope all the
data uses the same code page.
All this is business as usual when dealing with data. Again, hopefully
they have maintained uniformity over the decades and it is not the
situation of the Obamacare roll-out, which had to deal with many
companies and agencies that march to their own drummer.
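The lookup-table trick is easy to reproduce today: Python ships an
EBCDIC codec, so the two 256-byte translation tables can be derived
from it rather than typed in by hand. Code page 37 here is an
assumption; the right one depends on the agency.

  # cp037 (EBCDIC, US/Canada) is an assumption. Each table is 256
  # bytes long: table[b] is the byte that byte value b translates to.
  EBCDIC = "cp037"
  ebcdic_to_ascii = bytes(range(256)).decode(EBCDIC).encode("latin-1")
  ascii_to_ebcdic = bytes(range(256)).decode("latin-1").encode(EBCDIC)

  def to_ascii(record: bytes) -> bytes:
      return record.translate(ebcdic_to_ascii)

  def to_ebcdic(record: bytes) -> bytes:
      return record.translate(ascii_to_ebcdic)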
On Thu, 03 Apr 2025 06:11:37 -0400, c186282 <c186282@nnada.net> wrote:
<snip>
You're seeing the True Picture - It's *not* "easy" to
re-write at all.
And if you get it wrong there's all hell to pay.
Which is why bcrats are hyper-conservative in these
regards. They have good jobs/pensions to protect.
IMHO, any re-write will HAVE to involve a 'parallel system'
for awhile ... give it the same data, the same tasks, and
eval if it's always doing the same thing as the old stuff.
THEN, in a few years, quietly switch.
The problem is people come in who do not understand how many parts
there are, or are not willing to spend the time to learn. They look at
one small part, such as the code for the main module, and think it's
easy to convert.
They rush the conversion, and only after they start using the new
versions do they learn that they missed or misunderstood many of the
edge cases.
They then get wrong results, but insist their results are correct!
There are many languages used, not just COBOL. Some of the ones I
worked with include Fortran, PL/I, RPG III, Mark IV, ADF, and ASM
(360/370).
Even simple-looking things like MFS code for screen definitions have
quirks that are not going to be obvious to anyone who has not
encountered them.
For example, in a 3270-style terminal where an input field is
restricted to numeric, uppercase letters are still allowed.
It's done that way to allow for signed numeric fields. In EBCDIC zoned
decimal the sign is carried in the zone nibble of the last digit: C for
positive, D for negative. That makes plus one the same hexadecimal
value as the capital letter "A" (X'C1'), and minus one the same value
as "J" (X'D1').
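A tiny sketch of decoding that case in Python, assuming a plain
zoned-decimal field with the sign carried in the final zone nibble
(real copybooks vary):

  # Assumes a plain zoned-decimal field: the low nibble of each byte
  # is the digit, and the zone (high) nibble of the last byte carries
  # the sign: 0xD negative, 0xC (or 0xF for unsigned) positive.
  def zoned_to_int(field: bytes) -> int:
      value = 0
      for b in field:
          value = value * 10 + (b & 0x0F)
      return -value if (field[-1] >> 4) == 0xD else value

  assert zoned_to_int(b"\xF1\xF2\xC3") == 123   # "12C" in EBCDIC -> +123
  assert zoned_to_int(b"\xF1\xF2\xD3") == -123  # "12L" in EBCDIC -> -123

Which is exactly why a "numeric" 3270 field still has to accept
uppercase letters.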
With edge cases, the problem is that the person doing the conversion
doesn't understand that they exist, so they don't include them in test
data, and don't encounter them during any parallel testing. Later, the
system fails to handle the edge cases.
Regards, Dave Hodgins
Dealing with decades of the old RECORDS ... there's
a pain regardless.
So yea, I can perfectly believe they're keeping 360s and such alive
and working ...
Think long term, train them, and offer them a long career, with a
binding contract, so that it is worth their while. Something the current
USA administration can not offer.
On Thu, 3 Apr 2025 13:44:06 +0200, Carlos E.R. wrote:
Think long term, train them, and offer them a long career, with a
binding contract, so that it is worth their while. Something the current
USA administration can not offer.
Maybe. I doubt you would get the best and brightest but that isn't what
you would want anyway.
Note a significant SOCIAL shift in the USA as well.
By the 80s people stopped being so interested in "careers" - they'd
jump companies often, even jump job types.
The 360/370 boxes WERE really popular, so I'm gonna GUESS there's at
least one or two still chugging away. Maint cost would be insane
these days ... but you can kinda bury that in the budget while new
hardware stands out more and in more places.
On Thu, 3 Apr 2025 17:12:53 -0400, c186282 wrote:
Note a significant SOCIAL shift in the USA as well.
By the 80s people stopped being so interested in "careers" - they'd
jump companies often, even jump job types.
I never went looking for a career. My eyes would glaze over when potential employers would start talking about pension plans and so forth. It was
always about the next challenge.
I slowed down in my 50s, found a place I wanted to stay, and was lucky
enough to find a job that required frequent new interfaces and gave me the latitude for my skunk works projects.
It's been interesting but I guess I'm not going to get a gold watch.
The latest boss wasn't interested in 'research' or DIY and hated the
word "Linux" .... so it was time to retire. Just Office365 and Cloud
... like ALL the good ordinary places. Then you can't be "criticized"
........
And look REALLY cool in sci-fi movies !
On 04/04/2025 01:16, rbowman wrote:
On Thu, 3 Apr 2025 16:38:57 -0400, c186282 wrote:
The 360/370 boxes WERE really popular, so I'm gonna GUESS there's at
least one or two still chugging away. Maint cost would be insane
these days ... but you can kinda bury that in the budget while new
hardware stands out more and in more places.
I'm not sure there are any little old ladies left to knit magnetic core.
Last time I looked they were little Asian ladies with teeny nimble
fingers. They did all the coil winding in that factory.
On 03-04-2025, c186282 <c186282@nnada.net> wrote:
It's even worse now, seriously worse. Means nobody
becomes "experts" in the usual sense of the word.
I don't know why it's like that in the entire world, but in France the
reason is obvious. Companies refuse to take technical skills into
account. If you want to increase your salary, you have to switch to
management. So, as nobody wants to be the most important guy in the
company with the lowest salary, there are no more experts.
And most importantly, things evolve very fast, so if you become an
expert in something which disappears, you switch very fast from
much-needed guy to useless guy. So, before becoming an expert, you
need to be sure your skills will stay useful until you retire. Which
is difficult if you are young.
Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
On 03-04-2025, c186282 <c186282@nnada.net> wrote:
It's even worse now, seriously worse. Means nobody
becomes "experts" in the usual sense of the word.
I don't know why it's like that in the entire world, but in France the
reason is obvious. Companies refuse to take technical skills into
account. If you want to increase your salary, you have to switch to
management. So, as nobody wants to be the most important guy in the
company with the lowest salary, there are no more experts.
I think the issue is general, and so are the exceptions to it. On the
one hand I’ve heard similar complaints about UK and US employers.
And most importantly, things evolve very fast, so if you become an
expert in something which disappears, you switch very fast from
much-needed guy to useless guy. So, before becoming an expert, you
need to be sure your skills will stay useful until you retire. Which
is difficult if you are young.
Specializing in the wrong thing is certainly a risk. For any given
technology it’s useful to be able to look past what its boosters (and detractors) say about it to whether it does anything useful in reality.
On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:
No, but we can move to quantum computing, which may become
a reality before too long.
I heard about that before I was born.
In the US, the NIST is already researching algorithms for "post-quantum cryptography:"
https://csrc.nist.gov/projects/post-quantum-cryptography
Quantum computing is definitely going to happen.
On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:
No, but we can move to quantum computing, which may become
a reality before too long.
I heard about that before I was born.
In the US, the NIST is already researching algorithms for "post-quantum
cryptography:"
https://csrc.nist.gov/projects/post-quantum-cryptography
Yes, the algorithms are far ahead of the computers. Doesn't that ring
a bell?
Quantum computing is definitely going to happen.
Yes, I know. Soon. Very soon. It's almost there. I heard that before I
was born.
Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:
No, but we can move to quantum computing, which may become
a reality before too long.
I heard about that before I was born.
In the US, the NIST is already researching algorithms for "post-quantum
cryptography:"
https://csrc.nist.gov/projects/post-quantum-cryptography
Yes, the algorithms are far ahead of the computers. Doesn't that ring
a bell?
Not quite sure what the argument is here,
but “already researching” is
severely behind the times. Multiple PQC algorithms are well past the
research stage, with finalized standards published in August and a
couple more on the way. Adaptation of higher-level standards (APIs, PKI,
etc) and adoption leading to deployment is well underway.
Quantum computing is definitely going to happen.
Yes, I know. Soon. Very soon. It's almost there. I heard that before I
was born.
I’m not sure anyone thinks quantum computing is “almost there” in the sense of e.g. a quantum computer big enough to break RSA existing this
year.
However the risk is real enough to be worth actively mitigating,
both because we may not know when the first one is deployed and due to
the “harvest now, decrypt later” strategy.
On 06-04-2025, Richard Kettlewell <invalid@invalid.invalid> wrote:
Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:
No, but we can move to quantum computing, which may become
a reality before too long.
I heard about that before I was born.
In the US, the NIST is already researching algorithms for "post-quantum
cryptography:"
https://csrc.nist.gov/projects/post-quantum-cryptography
Yes, the algorithms are far ahead of the computers. Doesn't that ring
a bell?
Not quite sure what the argument is here,
I mean the algorithms are very well advanced. They just need a computer
to switch from ready to usable.
but “already researching” is severely behind the times. Multiple PQC
algorithms are well past the research stage, with finalized standards
published in August and a couple more on the way. Adaptation of
higher-level standards (APIs, PKI, etc) and adoption leading to
deployment is well underway.
Yep. The algorithms. For the big computers, it's another story.
Ummm ... given option ... I'd rewrite SSA/IRS using
one of the BSDs (maybe a commercial version) as the
c186282 <c186282@nnada.net> wrote:
Ummm ... given option ... I'd rewrite SSA/IRS using
one of the BSDs (maybe a commercial version) as the
The problem that will be encountered in "rewriting" SSA or IRS is not
the software.
The problem area is the 'rule book' defining what the software is to
do. The first problem is, there is no single "rule book" to which to
refer. It is all spread over thousands of statutes that themselves
have been patched plural (millions?) of times throughout the years both
SSA and IRS have been around. If one could collect all the 'rules' of
what should happen given specific inputs together into a single 'rule
book' and print it out the result would likely be a 6 foot high stack
of double sided US letter sheets of paper.
And the rules will be things like (made up, but the actual rules are
just as arcane):
Person X receives 4.75% of their total SSA payments over their lifetime
as pension, unless they are also a veteran, in which case they receive
6.25%, but if they served in the Airborne rangers from 1975 to 1982
they get an additional 1.27%, however if they also worked for the NSA
from 1987 to 1993 they receive 1.87% less. However, for payments from
1957 to 1962, they receive a 3.2% bonus, but for payments made from
1967 to 1974 they take a 1.4% penalty. Further, if the payments were
for self employment income from 1975 to 1986 they get a 3.2% bonus.
Etc.
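To make the combinatorial point concrete, here is a deliberately toy
Python sketch of the made-up rule above; every flag, name, and number
is hypothetical. Each statute change bolts another predicate/adjustment
pair onto the list, and the order and interactions of those pairs are
exactly what a rewrite has to reverse-engineer.

  from dataclasses import dataclass

  # All fields and rates are hypothetical, echoing the made-up rule.
  @dataclass
  class Person:
      veteran: bool = False
      ranger_1975_1982: bool = False
      nsa_1987_1993: bool = False

  RULES = [  # (applies?, adjustment) -- order matters, like the statutes
      (lambda p: True,               lambda rate, p: 4.75),  # base rate
      (lambda p: p.veteran,          lambda rate, p: 6.25),  # replaces base
      (lambda p: p.ranger_1975_1982, lambda rate, p: rate + 1.27),
      (lambda p: p.nsa_1987_1993,    lambda rate, p: rate - 1.87),
  ]

  def pension_rate(person: Person) -> float:
      rate = 0.0
      for applies, adjust in RULES:
          if applies(person):
              rate = adjust(rate, person)
      return rate

And that is before the per-year payment bonuses and penalties, which
multiply against all of the above.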
Think about the arcane tax rules for what numbers to put where on the
tax forms every year, the SSA rules are very much like the tax rules
(because both have been created, piecemeal, over the course of decades,
by different politicians getting patches to the statutes through
congress).
The problem that will be encountered is that the existing code base has
been built up over the decades in concert with the politicians making
changes, so both evolved in concert, and each change was incremental at
the time. But trying to rewrite it all from the ground up is going to
quickly hit the quagmire of exponential complexity just to understand
all the rules about what to do when for some payment Y (or for some tax
filing Z). The result will be something that either screws up royally
at every result, or simply omits 95+% of all the arcane, interdependent
things the congress folk have added to the statutes over the decades
(and someone loses their SS payment the statutes say they should
receive).
A commercial Unix is the better OS base.
That's what Apple did, and what Big G should do.
On 07/04/2025 03:49, c186282 wrote:
A commercial Unix is the better OS base.
That's what Apple did, and what Big G should do.
IIRC Apple OS X is based on FreeBSD
On 4/7/25 6:56 AM, The Natural Philosopher wrote:
On 07/04/2025 03:49, c186282 wrote:
A commercial Unix is the better OS base.
That's what Apple did, and what Big G should do.
IIRC Apple OS X is based on FreeBSD
But a 'commercial' variant - kinda like RHEL is
a commercial Linux variant. Apple PAID for it.
Indeed ! However ... probably COULD be done, it's
a bunch of shifting values - input to some accts,
calx ops, shift to other accts ....... lots and
lots of rheostats ........
On 30/05/2025 20:21, % wrote:
Joel wrote:
John Ames <commodorejohn@gmail.com> wrote:
On Fri, 30 May 2025 07:22:51 -0000 (UTC)
Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
Well, I guess that’s over, now that Elon Musk has left the building.
That’s the end of DOGE, without “saving” anywhere near the trillion
dollars he originally promised.
Comes as a *total* shock, lemmetellya.
Let's say they had cut that much with this DOGE BS (not that it's 100%
a stupid idea, of course, but they were not approaching it very
rationally), wouldn't the proposed tax breaks offset it? Wouldn't we
still be spending a large fortune every year on the damn military?
no because there would be tariffs that go to trump's pocket
Military spending is pretty low in reality.
And it is a *pragmatic* program, whereas so much is spent on purely
*moral* initiatives to employ people who think they can therefore tell
you how to run your life better than you can yourself.
On 5/30/25 16:33, The Natural Philosopher wrote:
On 30/05/2025 20:21, % wrote:
Joel wrote:
John Ames <commodorejohn@gmail.com> wrote:
On Fri, 30 May 2025 07:22:51 -0000 (UTC) Lawrence D'Oliveiro
<ldo@nz.invalid> wrote:
Well, I guess that’s over, now that Elon Musk has left the building.
That’s the end of DOGE, without “saving” anywhere near the trillion
dollars he originally promised.
Comes as a *total* shock, lemmetellya.
Let's say they had cut that much with this DOGE BS (not that it's 100%
a stupid idea, of course, but they were not approaching it very
rationally), wouldn't the proposed tax breaks offset it? Wouldn't we
still be spending a large fortune every year on the damn military?
no because there would be tariffs that go to trump's pocket
Military spending is pretty low in reality.
It is not low, but the latest tools are fabulously expensive, even as
unpiloted aircraft take the load off bombers and fighters, whose pilots
will move to observation planes to check the accuracy of our unmanned
devices.
And it is a *pragmatic* program, whereas so much is spent on purely
*moral* initiatives to employ people who think they can therefore tell
you how to run your life better than you can yourself.
So spending millions on ineffective equipment was worth it, to find
out that unarmored personnel carriers, for example, waste the lives of
trained and equipped soldiers when they meet home-built explosive
devices.
Pragmatically we need to return to a world in which civilian
contractors are extraneous to the missions. Civilian contractors who
not only build less effective weapons and vehicles but enrich
themselves by selling the next "great thing" to the generals. Civilian
contractors who are empowered to ignore the military rules of
engagement and are uncivil to the indigenous populations and persons.
On 31/05/2025 22:04, Bobbie Sellers wrote:
about 90% of *all* government spending is wasted.
On 5/30/25 16:33, The Natural Philosopher wrote:
On 30/05/2025 20:21, % wrote:
Joel wrote:
John Ames <commodorejohn@gmail.com> wrote:
On Fri, 30 May 2025 07:22:51 -0000 (UTC) Lawrence D'Oliveiro
<ldo@nz.invalid> wrote:
Well, I guess that’s over, now that Elon Musk has left the building.
That’s the end of DOGE, without “saving” anywhere near the trillion
dollars he originally promised.
Comes as a *total* shock, lemmetellya.
Let's say they had cut that much with this DOGE BS (not that it's 100%
a stupid idea, of course, but they were not approaching it very
rationally), wouldn't the proposed tax breaks offset it? Wouldn't we
still be spending a large fortune every year on the damn military?
no because there would be tariffs that go to trump's pocket
Military spending is pretty low in reality.
It is not low, but the latest tools are fabulously expensive, even as
unpiloted aircraft take the load off bombers and fighters, whose pilots
will move to observation planes to check the accuracy of our unmanned
devices.
And it is a *pragmatic* program, whereas so much is spent on purely
*moral* initiatives to employ people who think they can therefore tell
you how to run your life better than you can yourself.
So spending millions on ineffective equipment was worth it, to find
out that unarmored personnel carriers, for example, waste the lives of
trained and equipped soldiers when they meet home-built explosive
devices.
Did you know that in WWII the chief means of German logistics was
horses?
"Over the course of the war, Germany (2.75 million) and the Soviet
Union (3.5 million) together employed more than six million horses."
Russia has gone back to donkeys. You can eat donkeys and horses, which
you can't do with a personnel carrier.
The USA were always dumb fucks who never listened.
We TOLD them in '43 that B-17s would get shot out of the sky in
daylight raids, but oh no, they thought they had enough defensive
armament.
After Ireland and the IRA we knew pretty much how to protect soldiers
from IEDs.
Did they listen? Nah.
Pragmatically we need to return to a world in which civilian
contractors are extraneous to the missions. Civilian contractors who
not only build less effective weapons and vehicles but enrich
themselves by selling the next "great thing" to the generals. Civilian
contractors who are empowered to ignore the military rules of
engagement and are uncivil to the indigenous populations and persons.
The problem is that no one knows what the next war will be like until
they are in it.
As it happens, probably the most useful thing right now, after the
homebuilt drones in Ukraine, has been the Bradley fighting vehicle.
Built by BAE Systems' US division.
The Bushmaster cannon is very effective against Russian armour, which is admittedly crap.
But for every Bradley, there is a useless Abrams tank, and some
useless F-35s.
The obsolete ATACMS too has proved extremely useful.
In order to get stuff that works, you have to build a lot of stuff,
and then when war happens you stop building the dogs and throw money
at stuff that works.
That's how life works, and especially politics. You just fuck around
until something actually works, then do more of it and pretend that
was the idea all along.
The USA has always wasted money on everything until they found stuff
that worked. In Europe we haven't got the cash to waste, so we tend to
plan out how to do more for less.
The ARM architecture was developed by a bunch of guys in Cambridge,
some of whom I know, who basically wanted something as cheap and
simple as a 6502, because they couldn't afford to fabricate a huge
chip, but that would nevertheless run blindingly fast compared with a
Z80.
And they spent ages deciding on what instruction set would fit into a
tiny architecture and still be flexible and useful.
INTEL just threw money at making chips with massive instruction sets
that cost a bloody fortune.
It's just the way you make use of what you have. The USA is 10 times
the UK in population, but has 100 times more land area and 1000 times
more natural resources. It's hard to avoid being rich in the USA and
you don't even need to be clever.
The UK has no natural resources left, so it has to be smart instead.
The UK has no natural resources left, so it has to be smart instead.
Brexit was smart?
I never thought so.