• Re: Meta: Re: How I deal with the enormous amount of spam

    From The Starmaker@21:1/5 to The Starmaker on Tue Feb 6 10:30:28 2024
    XPost: sci.physics.relativity

    The Starmaker wrote:

    Ross Finlayson wrote:

    On 02/04/2024 12:53 PM, The Starmaker wrote:
    The Starmaker wrote:

    Ross Finlayson wrote:

    On 02/04/2024 09:55 AM, Ross Finlayson wrote:
    On 02/03/2024 02:46 PM, The Starmaker wrote:
    The Starmaker wrote:

    Ross Finlayson wrote:

    On 01/30/2024 12:54 PM, Ross Finlayson wrote:
    On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
    Tom Roberts wrote:

    I use Thunderbird to read Usenet. Recently sci.physics.relativity has
    been getting hundreds of spam posts each day, completely overwhelming
    legitimate content. These spam posts share the property that they are
    written in a non-Latin script.

    Thunderbird implements message filters that can mark a message Read.
    So I created a filter to run on sci.physics.relativity that marks
    messages Read. Then when reading the newsgroups, I simply display only
    unread messages. The key to making this work is to craft the filter so
    it marks messages in which the Subject matches any of a dozen
    characters picked from some spam messages.

    This doesn't completely eliminate the spam, but it is now only a few
    messages per day.

    Tom Roberts
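The filter Tom describes is, in effect, a character-set test on the Subject. A minimal sketch of the same idea in Python; the character sample below is hypothetical, not taken from any actual spam:

```python
# Mimic the Thunderbird filter: treat a message as spam when its Subject
# contains any character sampled from known spam Subjects.
SPAM_CHARS = set("บริการเงินด่วน")  # hypothetical sample of non-Latin spam characters


def looks_like_spam(subject: str) -> bool:
    """True if the Subject shares any character with the spam sample."""
    return any(ch in SPAM_CHARS for ch in subject)
```

As in Thunderbird, this doesn't catch everything; it only suppresses Subjects that reuse characters already seen in spam.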
    I would like to do the same thing, so I installed Thunderbird... but
    setting it up to read newsgroups is beyond my paltry computer skills
    and is not at all intuitive. If anyone can point to an idiot-proof
    tutorial for doing this it would be much appreciated.

    \Paul Alsing

    Yeah, it's pretty bad, or worse than anybody's ever seen it.

    I as well sort of mow the lawn a bit or mark the spam.

    It seems alright if it'll be a sort of clean break: on Feb 22,
    according to Google, Google will break its compeerage to Usenet, and
    furthermore make read-only the archives, what it has, what until then,
    will be as it was.

    Over on sci.math I've had the idea for a while of making some brief
    and special purpose Usenet compeers, for only some few groups, or, you
    know, the _belles lettres_ of the text hierarchy.

    "Meta: a usenet server just for sci.math"
    -- https://groups.google.com/g/sci.math/c/zggff_pVEks

    So, there you can read the outlook of this kind of thing, then while
    sort of simple as the protocol is simple and its implementations
    widespread, how to deal with the "signal and noise" of "exposed
    messaging destinations on the Internet", well on that thread I'm
    theorizing a sort of "NOOBNB protocol", figuring to make an otherwise
    just standard Usenet compeer, and also for email or messaging
    destinations, sort of designed with the expectation that there will be
    spam, and spam and ham are hand in hand, to exclude it in simple
    terms.

    NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed

    (That and a firmer sort of "Load Shed" or "Load Hold" at the
    transport layer.)

    Also it would be real great if at least there was surfaced to the
    Internet a read-only view of any message by its message ID, a "URL",
    or as for a "URI", a "URN", a reliable perma-link in the IETF "news"
    protocol namespace.

    https://groups.google.com/g/sci.math/c/zggff_pVEks

    I wonder that there's a reliable sort of long-term project that
    surfaces "news" protocol message-IDs, .... It's a stable,
    standards-based protocol.
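For reference, the stable perma-link being asked for already exists on paper: RFC 5538 defines the news: URI scheme, where the link is just the Message-ID without its angle brackets. A small sketch (the percent-encoding shown is a simplification of the RFC's rules):

```python
from urllib.parse import quote


def news_url(message_id: str) -> str:
    """Build a news: URI (RFC 5538) from a Message-ID like '<id@host>'."""
    mid = message_id.strip().strip("<>")
    # Encode conservatively; '@' must stay literal inside a message-ID.
    return "news:" + quote(mid, safe="@.-_~!$&'()*+,;=")
```

Any server that resolves news: URIs could then serve as the read-only view of a message by its ID.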


    Thunderbird, "SLRN", .... Thanks for caring. We care.


    https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw

    One fellow reached me via e-mail and he said, hey, the Googler spam
    is outrageous, can we do anything about it? Would you write a script
    to funnel all their message-ID's into the abuse reporting? And I was
    like, you know, about 2008 I did just that, there was a big spam
    flood, and I wrote a little script to find them and extract their
    posting-account, and the message-ID, and a little script to post to
    the posting-host, each one of the wicked spams.

    At the time that seemed to help, they sort of dried up, here there's
    that basically they're not following the charter, but, it's the
    posting-account in the message headers that indicates the origin of
    the post, not the email address. So, I wonder, given that I can
    extract the posting-accounts of all the spams, how to match the
    posting-account to then determine whether it's a sockpuppet-farm or
    what, and basically about sending them up.
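The grouping step described here can be sketched as follows. The Injection-Info header carrying posting-account= is how Google Groups posts typically identified the account; the exact header varies by server, so take the name as an assumption:

```python
import re
from collections import defaultdict

ACCOUNT_RE = re.compile(r"posting-account=([^;\s]+)")


def group_by_posting_account(messages):
    """messages: iterable of (message_id, headers) pairs, headers a dict.
    Returns {posting_account: [message_ids]}, so that a handful of
    accounts behind many spams (a sockpuppet farm) stands out."""
    buckets = defaultdict(list)
    for mid, headers in messages:
        match = ACCOUNT_RE.search(headers.get("Injection-Info", ""))
        if match:
            buckets[match.group(1)].append(mid)
    return dict(buckets)
```

A skewed histogram of the buckets is the tell: many message-IDs under few accounts.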

    Let me see your little script. Post it here.

    Here is a list I currently have:

    salz.txt
    usenet.death.penalty.gz
    purify.txt
    NewsAgent110-MS.exe
    HipCrime's NewsAgent (v1_11).htm
    NewsAgent111-BE.zip
    SuperCede.exe
    NewsAgent023.exe
    NewsAgent025.exe
    ActiveAgent.java
    HipCrime's NewsAgent (v1_02)_files
    NewsCancel.java (source code)

    (plus updated python versions)



    (Maybe your script is in there somewhere?)



    Show me what you got. Walk the walk.



    I try to avoid sketchy things like hiring a criminal botnet,
    there's the impression that that's looking at 1000's of counts
    of computer intrusion.

    With those being something about $50K and 10-25 apiece,
    there's a pretty significant deterrence to such activities.

    I've never much cared for "OAuth", giving away the
    keys-to-the-kingdom and all, here it looks like either
    a) a bunch of duped browsers clicked away their identities,
    or b) it's really that Google and Facebook are more than
    half full of fake identities for the sole purpose of being fake.

    (How's your new deal going?
    Great, we got a million users.
    Why are my conversions around zero?
    Your ad must not speak to them.
    Would it help if I spiced it up?
    Don't backtalk me, I'll put you on a list!)

    So, it seems mostly a sort of "spam-walling the Internet",
    where it was like "we're going to reinvent the Internet",
    "no, you aren't", "all right then we'll ruin this one".

    As far as search goes, there's something to be said
    for a new sort of approach to search, given that
    Google, Bing, Duck, ..., _all make the same results_. It's
    just so highly unlikely that they'd _all make the same
    results_, you figure they're just one.

    So, the idea, for somebody like me who's mostly interested
    in writing on the Internet, is that lots of that is of the sort
    of "works" vis-a-vis, the "feuilleton" or what you might
    call it, ephemeral junk, that I just learned about in
    Herman Hesse's "The Glass Bead Game".

    Then, there's an idea, that basically to surface high-quality
    works to a search, is that there's what's called metadata,
    for content like HTML, with regards to Dublin Core and
    RDF and so on, about a sort of making for fungible collections
    of works, what results searchable fragments of various
    larger bodies of works, according to their robots.txt and
    their summaries and with regards to crawling the content
    and so on, then to make federated common search corpora,
    these kinds of things.




    It's like "why are they building that new data center",
    and it's like "well it's like Artificial Intelligence, inside
    that data center is a million virts and each one has a
    browser emulator and a phone app sandbox and a
    little notecard that prompts its name, basically it's
    a million-headed hydra called a sims-bot-farm,
    that for pennies on the dollar is an instant audience."

    "Wow, great, do they get a cut?" "Don't be talking about my cut."

    Usenet traffic had been up recently, ....

    I think they used to call it "astro-turfing".
    "Artificial Intelligence?" "No, 'Fake eyeballs'."

    I have NewsAgent111-MS.exe

    I seem to be missing version 2.0

    Do you have the 2.0 version?

    I'll trade you.

    I'll give you my Python version with (GUI)!!!! (Tkinter)

    let's trade!

    don't bogart

    I seem to be missing this version:

    https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/

    Do you have it? you must have!






    Nope, I just wrote a little script to connect to NNTP
    with a Yes/No button on the subject, tapped through
    those, and a little script to send an HTTP request to
    the publicly-facing return-to-sender in-box, for each.

    Here's all the sources you need: IETF RFC editor.
    Look for "NNTP". How to advise Google of this is
    that each domain on the Internet is supposed to
    have an "abuse@domain" email inbox, though there's
    probably also a web request interface, as with regards
    to publicly facing services, and expected to be
    good actors on the network.
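The abuse@domain convention mentioned here comes from RFC 2142. A tiny helper for deriving that mailbox from a posting host; note the last-two-labels heuristic is naive and wrong for multi-part TLDs like .co.uk:

```python
def abuse_address(posting_host: str) -> str:
    """Conventional abuse mailbox (RFC 2142) for a posting host's domain.

    Naively keeps the last two DNS labels; a real tool should consult
    the public suffix list instead.
    """
    labels = posting_host.strip(".").split(".")
    domain = ".".join(labels[-2:]) if len(labels) >= 2 else posting_host
    return f"abuse@{domain}"
```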

    Anyways if you read through "Meta: a usenet server
    just for sci.math", what I have in mind is a sort
    of author's and writer's oriented installation,
    basically making for vanity printouts and generating
    hypertext collections of contents and authors and
    subjects and these kinds of things, basically for
    on the order of "find all the postings of Archimedes
    Plutonium, and, the threads they are in, and,
    make a hypertext page of all that, a linear timeline,
    and also thread it out as a linear sequence".

    I.e. people who actually post to Usenet are sometimes
    having written interesting things, and, thus having
    it so that it would be simplified to generate message-ID
    listings and their corresponding standard URL's in the
    standard IETF "news" URL protocol, and to point that
    at a given news server or like XLink, is for treating Usenet and its
    archives like a living museum of all these different authors' posts
    and their interactions together.

    I.e., here it's "belles lettres" and "fair use",
    not just "belles" and "use".
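That "linear timeline" per author could be generated along these lines; the record fields are assumptions for illustration, and the message-IDs are taken with angle brackets already stripped:

```python
import html


def author_timeline(posts, author):
    """posts: iterable of dicts with 'author', 'date', 'subject',
    'message_id'. Returns an HTML page listing one author's posts in
    date order, each linked by its news: URI perma-link."""
    mine = sorted((p for p in posts if p["author"] == author),
                  key=lambda p: p["date"])
    items = "\n".join(
        '<li>{} <a href="news:{}">{}</a></li>'.format(
            html.escape(p["date"]), p["message_id"],
            html.escape(p["subject"]))
        for p in mine)
    return "<h1>{}</h1>\n<ol>\n{}\n</ol>".format(html.escape(author), items)
```

Threading the same records by References headers would give the "thread it out" view alongside the timeline.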

    It seemed nice of Google Groups to front this for a long time,
    now they're quitting.

    I imagine Internet Relay Chat's still insane, though.

    Anyways I stay away from any warez and am proud that
    since about Y2K at least I've never bootlegged anything,
    and never uploaded a bootleg. Don't want to give old Shylock
    excuses, and besides, I wrote software for a living.

    Anyways, I don't know who was talking about "any warez" or
    "bootlegs", since I was referring to programs and scripts that read:

    "FREE, which means you can copy it and redistribute"
    "Similarly, the source is provided as reference and can be redistributed freely as well. "

    HipCrime's NewsAgent (v2.0) is FREE, which means you can copy it and redistribute it at will, as long as you give credit to the original
    author. Similarly, the source is provided as reference and can be redistributed freely as well.

    https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/

    You seem to be too much 'in your head', on a high horse...

    "FREE, which means you can copy it and redistribute"
    "Similarly, the source is provided as reference and can be redistributed freely as well. "

    So, show me that wicked script you wrote: "funnel all their
    message-ID's" by people you call spammers who 'funnel' their products
    and services through Usenet newsgroups.

    You are sooooo wicked.

    and a nanofossils

    Anyways, there is only one person that knows what 'nanofossils'
    means, and that is Ross Finlayson.


    I just realized that Ross Finlayson doesn't know of NEWSAGENT.


    Anyways, ...

    "Anyways"???? Who talks like that?


    Anyways..


    the problem of the 'flooding' is not the spammers, it's the 'scientific community'. They caused the problem.
    They removed the feature that NewsAgent used to get rid of ALL flooding
    and spammers. But, but, the
    members of the scientific community could not trust their own members to
    use it against them.

    If one member of the 'scientific community' disagreed with another
    member of the 'scientific community'...they were removed!


    Too much power.


    I called it...God Mode.











    --
    The Starmaker -- To question the unquestionable, ask the unaskable,
    to think the unthinkable, mention the unmentionable, say the unsayable,
    and challenge the unchallengeable.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Starmaker@21:1/5 to The Starmaker on Tue Feb 6 11:02:02 2024
    XPost: sci.physics.relativity

    The Starmaker wrote:

    [quoted text snipped; it repeats the previous message verbatim,
    ending with "I called it...God Mode."]

    Using NewsAgent in God Mode was great! Except... if you didn't know
    how to use it properly you could make a mistake and remove
    *EVERYONE'S* posts by accident.

    Everyone just completely disappeared!


    Oops. I made a booboo.


    Like that Twilight Zone episode where everyone disappears by a click
    of a watch.



    Where is everybody? MAJOR KILLFILE!

    So, which is worse?



    --
    The Starmaker -- To question the unquestionable, ask the unaskable,
    to think the unthinkable, mention the unmentionable, say the unsayable,
    and challenge the unchallengeable.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Starmaker@21:1/5 to The Starmaker on Tue Feb 6 11:23:03 2024
    XPost: sci.physics.relativity

    The Starmaker wrote:

    The Starmaker wrote:

    The Starmaker wrote:

    Ross Finlayson wrote:

    On 02/04/2024 12:53 PM, The Starmaker wrote:
    The Starmaker wrote:

    Ross Finlayson wrote:

    On 02/04/2024 09:55 AM, Ross Finlayson wrote:
    On 02/03/2024 02:46 PM, The Starmaker wrote:
    The Starmaker wrote:

    Ross Finlayson wrote:

    On 01/30/2024 12:54 PM, Ross Finlayson wrote:
    On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
    Tom Roberts wrote:

    I use Thunderbird to read Usenet. Recently sci.physics.relativity
    has
    been getting hundreds of spam posts each day, completely >>>>>>>>>> overwhelming
    legitimate content. These spam posts share the property that they
    are
    written in a non-latin script.

    Thunderbird implements message filters that can mark a message
    Read. So
    I created a filter to run on sci.physics.relativity that marks
    messages
    Read. Then when reading the newsgroups, I simply display only unread
    messages. The key to making this work is to craft the filter so
    it marks
    messages in which the Subject matches any of a dozen characters
    picked
    from some spam messages.

    This doesn't completely eliminate the spam, but it is now only a few
    messages per day.

    Tom Roberts
    I would like to do the same thing, so I installed Thunderbird...
    but setting it up to read newsgroups is beyond my paltry computer
    skills and is not at all intuitive. If anyone can point to an >>>>>>>>> idiot-proof tutorial for doing this It would be much appreciated.

    \Paul Alsing

    Yeah, it's pretty bad, or, worse anybody's ever seen it. >>>>>>>>
    I as well sort of mow the lawn a bit or mark the spam. >>>>>>>>
    It seems alright if it'll be a sort of clean break: on Feb 22 >>>>>>>> according to Google,
    Google will break its compeerage to Usenet, and furthermore make
    read-only
    the archives, what it has, what until then, will be as it was. >>>>>>>>
    Over on sci.math I've had the idea for a while of making some brief
    and
    special purpose Usenet compeers, for only some few groups, or, you
    know, the _belles lettres_ of the text hierarchy.

    "Meta: a usenet server just for sci.math"
    -- https://groups.google.com/g/sci.math/c/zggff_pVEks >>>>>>>>
    So, there you can read the outlook of this kind of thing, then >>>>>>>> while sort
    of simple as the protocol is simple and its implementations >>>>>>>> widespread,
    how to deal with the "signal and noise" of "exposed messaging >>>>>>>> destinations
    on the Internet", well on that thread I'm theorizing a sort of,
    "NOOBNB protocol",
    figuring to make an otherwise just standard Usenet compeer, and
    also for
    email or messaging destinations, sort of designed with the >>>>>>>> expectation that
    there will be spam, and spam and ham are hand in hand, to exclude
    it in simple terms.

    NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed

    (That and a firmer sort of "Load Shed" or "Load Hold" at the >>>>>>>> transport layer.)

    Also it would be real great if at least there was surfaced to the
    Internet a
    read-only view of any message by its message ID, a "URL", or as for
    a "URI",
    a "URN", a reliable perma-link in the IETF "news" protocol, namespace.

    https://groups.google.com/g/sci.math/c/zggff_pVEks

    I wonder that there's a reliable sort of long-term project that
    surfaces
    "news" protocol message-IDs, .... It's a stable, standards-based
    protocol.


    Thunderbird, "SLRN", .... Thanks for caring. We care. >>>>>>>>

    https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw


    One fellow reached me via e-mail and he said, hey, the Googler spam is
    outrageous, can we do anything about it? Would you write a script to
    funnel all their message-ID's into the abuse reporting? And I was
    like,
    you know, about 2008 I did just that, there was a big spam flood,
    and I wrote a little script to find them and extract their >>>>>>> posting-account,
    and the message-ID, and a little script to post to the posting-host,
    each one of the wicked spams.

    At the time that seemed to help, they sort of dried up, here there's
    that basically they're not following the charter, but, it's the >>>>>>> posting-account
    in the message headers that indicate the origin of the post, not the
    email address. So, I wonder, given that I can extract the >>>>>>> posting-accounts
    of all the spams, how to match the posting-account to then determine
    whether it's a sockpuppet-farm or what, and basically about sending
    them up.

    Let me see your little script. Post it here.

    Here is a list I currently have:

    salz.txt
    usenet.death.penalty.gz
    purify.txt
    NewsAgent110-MS.exe
    HipCrime's NewsAgent (v1_11).htm
    NewsAgent111-BE.zip
    SuperCede.exe
    NewsAgent023.exe
    NewsAgent025.exe
    ActiveAgent.java
    HipCrime's NewsAgent (v1_02)_files
    NewsCancel.java (source code)

    (plus updated python versions)



    (Maybe your script is inthere somewhere?)



    Show me what you got. walk the walk.



    I try to avoid sketchy things like hiring a criminal botnet,
    there's the impression that that's looking at 1000's of counts >>>> of computer intrusion.

    With those being something about $50K and 10-25 apiece,
    there's a pretty significant deterrence to such activities.

    I've never much cared for "OAuth", giving away the
    keys-to-the-kingdom and all, here it looks like either
    a) a bunch of duped browsers clicked away their identities,
    or b) it's really that Google and Facebook are more than
    half full of fake identities for the sole purpose of being fake. >>>>
    (How's your new deal going?
    Great, we got a million users.
    Why are my conversions around zero?
    Your ad must not speak to them.
    Would it help if I spiced it up?
    Don't backtalk me, I'll put you on a list!)

    So, it seems mostly a sort of "spam-walling the Internet",
    where it was like "we're going to reinvent the Internet",
    "no, you aren't", "all right then we'll ruin this one".

    As far as search goes, there's something to be said
    for a new sort of approach to search, given that
    Google, Bing, Duck, ..., _all make the same results_. It's
    just so highly unlikely that they'd _all make the same
    results_, you figure they're just one.

    So, the idea, for somebody like me who's mostly interested
    in writing on the Internet, is that lots of that is of the sort
    of "works" vis-a-vis the "feuilleton" or what you might
    call it, ephemeral junk, that I just learned about in
    Herman Hesse's "The Glass Bead Game".

    Then, there's an idea, that basically to surface high-quality
    works to a search, is that there's what's called metadata,
    for content like HTML, with regards to Dublin Core and
    RDF and so on, about a sort of making for fungible collections
    of works, what results in searchable fragments of various
    larger bodies of works, according to their robots.txt and
    their summaries and with regards to crawling the content
    and so on, then to make federated common search corpora,
    these kinds of things.




    It's like "why are they building that new data center",
    and it's like "well it's like Artificial Intelligence, inside
    that data center is a million virts and each one has a
    browser emulator and a phone app sandbox and a
    little notecard that prompts its name, basically it's
    a million-headed hydra called a sims-bot-farm,
    that for pennies on the dollar is an instant audience."

    "Wow, great, do they get a cut?" "Don't be talking about my cut."
    Usenet traffic had been up recently, ....

    I think they used to call it "astro-turfing".
    "Artificial Intelligence?" "No, 'Fake eyeballs'."

    I have NewsAgent111-MS.exe

    I seem to be missing version 2.0

    Do you have the 2.0 version?

    I'll trade you.

    I'll give you my python version with (GUI)!!!! (Tkinter)

    let's trade!

    don't bogart

    I seem to be missing this version:

    https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/

    Do you have it? you must have!






    Nope, I just wrote a little script to connect to NNTP
    with a Yes/No button on the subject, tapped through
    those, and a little script to send an HTTP request to
    the publicly-facing return-to-sender in-box, for each.

    Here's all the sources you need: IETF RFC editor.
    Look for "NNTP". How to advise Google of this is
    that each domain on the Internet is supposed to
    have an "abuse@domain" email inbox, though there's
    probably also a web request interface, as with regards
    to publicly facing services, and expected to be
    good actors on the network.
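
    That "abuse@domain" convention can be sketched like so; the header names
    here (`X-Complaints-To`, `Injection-Info`) are the usual Netnews ones, and
    the fallback of mailing abuse@ at the injecting host's domain is the
    convention being described above, not a guarantee, since real providers may
    route complaints through a web form instead:

    ```python
    import re

    def injection_host(raw_headers: str) -> str | None:
        """First token of Injection-Info names the injecting server."""
        m = re.search(r"Injection-Info:\s*([^;\s]+)", raw_headers)
        return m.group(1) if m else None

    def abuse_address(raw_headers: str) -> str | None:
        """Prefer an explicit X-Complaints-To header; otherwise fall back
        to the conventional abuse@ mailbox of the injection host's domain."""
        m = re.search(r"X-Complaints-To:\s*(\S+)", raw_headers)
        if m:
            return m.group(1)
        host = injection_host(raw_headers)
        if host:
            # strip one leading label (e.g. news.example.com -> example.com)
            domain = host.split(".", 1)[1] if host.count(".") > 1 else host
            return f"abuse@{domain}"
        return None
    ```

    Paired with the message-ID of each spam, that address is where the
    per-message report would go.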

    Anyways if you read through "Meta: a usenet server
    just for sci.math", what I have in mind is a sort
    of author's and writer's oriented installation,
    basically making for vanity printouts and generating
    hypertext collections of contents and authors and
    subjects and these kinds of things, basically for
    on the order of "find all the postings of Archimedes
    Plutonium, and, the threads they are in, and,
    make a hypertext page of all that, a linear timeline,
    and also thread it out as a linear sequence".

    I.e. people who actually post to Usenet have sometimes
    written interesting things, and, thus having
    it so that it would be simplified to generate message-ID
    listings and their corresponding standard URL's in the
    standard IETF "news" URL protocol, and to point that
    at a given news server or like XLink, is for treating
    Usenet and its archives like a living museum of all these
    different authors' posts and their interactions together.

    I.e., here it's "belles lettres" and "fair use",
    not just "belles" and "use".

    It seemed nice of Google Groups to front this for a long time,
    now they're quitting.

    I imagine Internet Relay Chat's still insane, though.

    Anyways I stay away from any warez and am proud that
    since about Y2K at least I've never bootlegged anything,
    and never uploaded a bootleg. Don't want to give old Shylock
    excuses, and besides, I wrote software for a living.

    Anyways, I don't know who was talking about "any warez" or "bootlegs", since I was referring to programs and scripts that read:

    "FREE, which means you can copy it and redistribute"
    "Similarly, the source is provided as reference and can be redistributed freely as well. "

    HipCrime's NewsAgent (v2.0) is FREE, which means you can copy it and redistribute it at will, as long as you give credit to the original author. Similarly, the source is provided as reference and can be redistributed freely as well.

    https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/

    You seem to be too much 'in your head', on a high horse...

    "FREE, which means you can copy it and redistribute"
    "Similarly, the source is provided as reference and can be redistributed freely as well. "

    So, show me that wicked script you wrote: "funnel all their message-ID's"
    by people you call spammers who 'funnel' their products and services through Usenet newsgroups.

    You are sooooo wicked.

    and a nanofossils

    Anyways, there is only one person that knows what 'nanofossils' means,
    and that is Ross Finlayson.

    I just realized that Ross Finlayson doesn't know of NEWSAGENT.

    Anyways, ...

    "Anyways"???? Who talks like that?

    Anyways..

    the problem of the 'flooding' is not the spammers, it's the 'scientific community'. They caused the problem.
    They removed the feature that NewsAgent used to get rid of ALL flooding
    and spammers. But, but, the
    members of the scientific community could not trust their own members to use it against them.

    If one member of the 'scientific community' disagreed with another
    member of the 'scientific community'...they were removed!

    Too much power.

    I called it...God Mode.

    Using NewsAgent in God Mode was great! Except...if you didn't know how to use it
    properly you can make a mistake and remove *EVERYONE'S* posts by accident.

    Everyone just completely disappeared!

    Oops. I made a booboo.

    Like that Twilight Zone episode where everyone disappears by a click of a watch.

    Where is everybody? MAJOR KILLFILE!

    So, which is worse?

    You know what GOD MODE Killfile is? That means you not only killfiled everyone, but you also sort of
    turned everyone else's killfile on. Nobody sees nobody.


    It's like the Atomic Bomb of Usenet!


    (only yous guys make bombs like that)
    (then yous get angry when everyone has the atomic bomb)

    typical.





    --
    The Starmaker -- To question the unquestionable, ask the unaskable,
    to think the unthinkable, mention the unmentionable, say the unsayable,
    and challenge the unchallengeable.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Starmaker@21:1/5 to The Starmaker on Sun Feb 11 12:37:18 2024
    XPost: sci.physics.relativity

    As I mentioned above,
    the problem of the 'flooding' is not the spammers,
    it's the 'scientific community'. They caused the problem.
    They removed the feature that was used to get rid of ALL flooding...

    Why? Because of the War of the Gods.

    If one member of the 'scientific community' disagreed with another
    member, or especially if he had a higher IQ...they were removed!

    War of the Gods.

    I will have no other gods before me...they think of themselves.


    Albert Einstein was a Jewish Supremacist.

    If you disagreed with him, you were called
    "not man of science" and "not Jewish".

    Of course his people would blacklist yous.


    So, put back the feature that gets rid of the spamming flooders and
    watch what will happen to the rest of yous.


    It will be again...The War of the Gods.


    And I will not have any Gods before me either.


    off wit your heads!

    Only those below average intelligence should dominate the Usenet!

    anything above dat...banished!








    --
    The Starmaker -- To question the unquestionable, ask the unaskable,
    to think the unthinkable, mention the unmentionable, say the unsayable,
    and challenge the unchallengeable.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Starmaker@21:1/5 to All on Mon Feb 19 11:06:04 2024
    XPost: sci.physics.relativity

    So, where do you people get the idea AFTER Feb 22, the flooding will
    stop???


    Are yous saying it's all coming from Google Groups source website???

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Volney@21:1/5 to The Starmaker on Mon Feb 19 20:44:28 2024
    XPost: sci.physics.relativity

    On 2/19/2024 2:06 PM, The Starmaker wrote:
    So, where do you people get the idea AFTER Feb 22, the flooding will
    stop???


    Are yous saying it's all coming from Google Groups source website???

    The Thai casino spam is from Google and will (should) stop.

    Any spammers not spamming through Google can continue to spam, but if
    Google is no longer indexing Usenet, it will no longer be profitable for
    them.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)