• Re: Apple accused of underreporting suspected CSAM on its platforms

    From Jolly Roger@21:1/5 to badgolferman on Tue Jul 23 01:11:29 2024
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation, instead pointing
    The Guardian to a statement it made when it shelved the CSAM scanning
    plan. Apple said it opted for a different strategy that “prioritizes
    the security and privacy of [its] users.” The company told Wired in
    August 2022 that "children can be protected without companies combing
    through personal data."

    This is one reason many people choose Apple over alternatives.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Taliban is right about women@21:1/5 to badgolferman on Mon Jul 22 23:06:19 2024
    badgolferman wrote:
    Apple has been accused of underreporting the prevalence of child sexual
    abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in
    the UK, says that Apple reported just 267 worldwide cases of suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) last year.

    That pales in comparison to the 1.47 million potential cases that Google reported and 30.6 million reports from Meta. Other platforms that reported more potential CSAM cases than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony
    Interactive Entertainment (3,974). Every US-based tech company is required
    to pass along any possible CSAM cases detected on their platforms to NCMEC, which directs cases to relevant law enforcement agencies worldwide.

    As The Guardian, which first reported on the NSPCC's claim, points out,
    Apple services such as iMessage, FaceTime and iCloud all have end-to-end encryption, which stops the company from viewing the contents of what users share on them. However, WhatsApp has E2EE as well, and that service
    reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.

    “There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities,” Richard Collard, the NSPCC's head of child safety online policy, said. “Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the
    roll out of the Online Safety Act in the UK.”

    Apple declined to comment on the NSPCC's accusation, instead pointing The Guardian to a statement it made when it shelved the CSAM scanning plan.
    Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected without companies combing through personal
    data."



    https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html

    The vast majority of "child sexual abuse material" is made up of sluts aged
    11 to 17 taking nude photos of themselves to send to their "boyfriends".

    Who, then, is the victim in those cases?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Your Name on Tue Jul 23 04:04:25 2024
    Your Name wrote on Tue, 23 Jul 2024 15:35:17 +1200 :

    Of course, anyone with more than one braincell couldn't care less about
    this "privacy" nonsense when it means catching more of the illegal
    scumbags who are doing wrong.

    Why stop there, Your Name?

    Why not break down the front doors of every home in the world to look for suspected images that people don't like and search each & every home?

    Why not grab people out of their cars while they're waiting at stop lights
    to sift through their belongings just in case they have images in the car?

    Hell, strip search every person at every crosswalk in the world to check
    their pockets, purses, and wallets for any images that people don't like.

    Where is it, Your Name, that your unreasonable search & seizure ends?

    Hint: When it comes to privacy, I'm in favor of Apple's strategic decision
    not to strip search every single person who owns an Apple device.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Your Name@21:1/5 to badgolferman on Tue Jul 23 15:35:17 2024
    badgolferman wrote:
    Apple has been accused of underreporting the prevalence of child sexual
    abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in
    the UK, says that Apple reported just 267 worldwide cases of suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) last year.

    That pales in comparison to the 1.47 million potential cases that Google reported and 30.6 million reports from Meta. Other platforms that reported more potential CSAM cases than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony
    Interactive Entertainment (3,974). Every US-based tech company is required
    to pass along any possible CSAM cases detected on their platforms to NCMEC, which directs cases to relevant law enforcement agencies worldwide.

    As The Guardian, which first reported on the NSPCC's claim, points out,
    Apple services such as iMessage, FaceTime and iCloud all have end-to-end encryption, which stops the company from viewing the contents of what users share on them. However, WhatsApp has E2EE as well, and that service
    reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.

    “There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities,” Richard Collard, the NSPCC's head of child safety online policy, said. “Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the
    roll out of the Online Safety Act in the UK.”

    Apple declined to comment on the NSPCC's accusation, instead pointing The Guardian to a statement it made when it shelved the CSAM scanning plan.
    Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected without companies combing through personal
    data."


    https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html


    Apple had a system ready to go, but a pile of brainless morons
    complained about their "privacy" being invaded (which it wasn't), so
    Apple was forced to abandon it. All this report achieves is to
    acknowledge that all those other companies listed above are less
    stringent about their users' privacy.

    Of course, anyone with more than one braincell couldn't care less about
    this "privacy" nonsense when it means catching more of the illegal
    scumbags who are doing wrong.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jörg Lorenz@21:1/5 to All on Tue Jul 23 07:45:52 2024
    On 23.07.24 at 05:35, Your Name wrote:
    Apple had a system ready to go, but a pile of brainless morons
    complained about their "privacy" being invaded (which it wasn't), so
    Apple was forced to abandon it. All this report achieves is to
    acknowledge that all those other companies listed above are less
    stringent about their users' privacy.

    Of course, anyone with more than one braincell couldn't care less about
    this "privacy" nonsense when it means catching more of the illegal
    scumbags who are doing wrong.

    Brain dead idiot: Learn to think! When the pictures are on the devices
    the damage to the children is irreversibly done.

    Scientific research proves that spying on users does not help children
    who have already been abused or those at risk of being abused. This is
    the reality.

    Spying on users is undermining democracy and human rights.

    --
    "Gutta cavat lapidem." (Ovid)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jörg Lorenz@21:1/5 to All on Tue Jul 23 07:39:07 2024
    On 23.07.24 at 03:11, Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation, instead pointing
    The Guardian to a statement it made when it shelved the CSAM scanning
    plan. Apple said it opted for a different strategy that “prioritizes
    the security and privacy of [its] users.” The company told Wired in
    August 2022 that "children can be protected without companies combing
    through personal data."

    This is one reason many people choose Apple over alternatives.

    *Full ACK*!
    Keyword being *prevention*.

    --
    "Gutta cavat lapidem." (Ovid)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jörg Lorenz@21:1/5 to All on Tue Jul 23 15:32:29 2024
    On 23.07.24 at 13:31, Chris wrote:
    After being a bit skeptical of Apple's solution, I realised it was a pretty good and pragmatic balance between respecting people's privacy and
    protecting vulnerable people. I was disappointed that the angry "muh
    freedom" brigade scuppered it.

    It was neither a good nor an acceptable solution. That is why Apple
    decided against it in the end.

    --
    "Gutta cavat lapidem." (Ovid)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jörg Lorenz@21:1/5 to All on Tue Jul 23 15:30:13 2024
    On 23.07.24 at 13:21, Chris wrote:
    Jörg Lorenz <hugybear@gmx.net> wrote:
    On 23.07.24 at 05:35, Your Name wrote:
    Apple had a system ready to go, but a pile of brainless morons
    complained about their "privacy" being invaded (which it wasn't), so
    Apple was forced to abandon it. All this report achieves is to
    acknowledge that all those other companies listed above are less
    stringent about their users' privacy.

    Of course, anyone with more than one braincell couldn't care less about
    this "privacy" nonsense when it means catching more of the illegal
    scumbags who are doing wrong.

    Brain dead idiot: Learn to think! When the pictures are on the devices
    the damage to the children is irreversibly done.

    Obviously you have a source for that, right?

    Spying on users is undermining democracy and human rights.

    This isn't spying, so no, it doesn't.

    Troll! Guess why Apple dropped its plans to spy on its users.
    With your bigoted attitude you are the gravedigger of a free society.

    You missed everything in the last two years, dear.

    --
    "Gutta cavat lapidem." (Ovid)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to badgolferman on Tue Jul 23 16:04:17 2024
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it shelved the
    CSAM scanning plan. Apple said it opted for a different strategy
    that “prioritizes the security and privacy of [its] users.” The
    company told Wired in August 2022 that "children can be protected
    without companies combing through personal data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to badgolferman on Tue Jul 23 09:22:52 2024
    On 2024-07-23 09:21, badgolferman wrote:
    Jolly Roger wrote:

    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it shelved
    the CSAM scanning plan. Apple said it opted for a different
    strategy that “prioritizes the security and privacy of [its]
    users.” The company told Wired in August 2022 that "children
    can be protected without companies combing through personal
    data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    He said it poorly, but "privacy for everyone" IS important in that when
    you create a system that purports only to take privacy away from the bad actors, you are really taking it away from everyone.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jörg Lorenz@21:1/5 to All on Tue Jul 23 18:40:27 2024
    On 23.07.24 at 18:27, Chris wrote:
    Jörg Lorenz <hugybear@gmx.net> wrote:
    On 23.07.24 at 13:31, Chris wrote:
    After being a bit skeptical of Apple's solution, I realised it was a pretty good and pragmatic balance between respecting people's privacy and
    protecting vulnerable people. I was disappointed that the angry "muh
    freedom" brigade scuppered it.

    It was neither a good nor an acceptable solution. That is why Apple
    decided against it in the end.

    It was sunk by reactionary know-it-alls. If anyone had bothered to look at the technology - which Apple published openly - they would have seen it was pretty elegant and privacy-preserving.

    It could have been a really good tool to protect children, but no, people's non-rights were more important.

    Your reaction is Nazi-style: at no point in the discussion did you want
    to protect innocent children. You probably have a hidden agenda.

    Bigoted religious groups are the biggest risk to free Western societies.

    --
    "Gutta cavat lapidem." (Ovid)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to badgolferman on Tue Jul 23 09:48:37 2024
    On 2024-07-23 09:29, badgolferman wrote:
    Alan wrote:

    On 2024-07-23 09:21, badgolferman wrote:
    Jolly Roger wrote:

    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it
    shelved the CSAM scanning plan. Apple said it opted for
    a different strategy that “prioritizes the security and
    privacy of [its] users.” The company told Wired in August
    2022 that "children can be protected without companies
    combing through personal data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    He said it poorly, but "privacy for everyone" IS important in that
    when you create a system that purports only to take privacy away from
    the bad actors, you are really taking it away from everyone.


    If you're on the internet there is no pretense of privacy.

    That's a cop out.

    Sorry, but it is.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Tue Jul 23 17:52:47 2024
    On 2024-07-23, Chris <ithinkiam@gmail.com> wrote:
    badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it shelved the
    CSAM scanning plan. Apple said it opted for a different
    strategy that “prioritizes the security and privacy of [its]
    users.” The company told Wired in August 2022 that "children
    can be protected without companies combing through personal
    data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Agreed.

    No-one is forced to use icloud. If they didn't like the policy, they
    could go elsewhere.

    True. Unlike others, Apple's proposal was to scan, on device, only the
    images being uploaded to Apple's servers, and to match their hashes
    against a database of hashes of known CSAM images. Only after multiple
    matches reached a threshold would Apple investigate further. Yet even
    with those precautions, there was still a realistic chance of false
    positives and invasion of privacy, which is why they scrapped the
    proposal.
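
    A minimal sketch of that hash-match-plus-threshold flow, for illustration
    only - this is not Apple's actual code or API. Apple's published design
    used a perceptual hash (NeuralHash), blinded matching, and threshold
    secret sharing; the names and the threshold value below (perceptual_hash,
    known_hashes, MATCH_THRESHOLD) are assumptions made up for this sketch.

        # Hypothetical sketch: an account is escalated for human review only
        # after enough uploads match hashes of *known* images.
        import hashlib

        MATCH_THRESHOLD = 30  # assumed, operator-tunable threshold

        def perceptual_hash(image_bytes: bytes) -> str:
            # stand-in for a perceptual hash such as NeuralHash
            return hashlib.sha256(image_bytes).hexdigest()

        def scan_upload(image_bytes: bytes, known_hashes: set[str],
                        match_count: int) -> int:
            # runs on device, and only for images queued for cloud upload
            if perceptual_hash(image_bytes) in known_hashes:
                match_count += 1
            return match_count

        def should_escalate(match_count: int) -> bool:
            # review happens only once the threshold is crossed
            return match_count >= MATCH_THRESHOLD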

    Like to Google and Meta, who are more than happy to share millions of
    people's private photos with law enforcement, which apparently is just
    fine.

    And they do so with zero privacy protections in place.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to badgolferman on Tue Jul 23 17:49:14 2024
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it shelved the
    CSAM scanning plan. Apple said it opted for a different strategy
    that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected
    without companies combing through personal data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Your problem is you want to invade everyone's privacy regardless of
    whether they are hurting anyone. That's the only way CSAM scanning can
    work, and why Apple wisely withdrew their proposal even though it worked
    harder to preserve privacy than any other solution.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Tue Jul 23 17:55:55 2024
    On 2024-07-23, Chris <ithinkiam@gmail.com> wrote:
    Jörg Lorenz <hugybear@gmx.net> wrote:
    On 23.07.24 at 13:31, Chris wrote:

    After being a bit skeptical of Apple's solution, I realised it was a
    pretty good and pragmatic balance between respecting people's
    privacy and protecting vulnerable people. I was disappointed that
    the angry "muh freedom" brigade scuppered it.

    It was neither a good nor an acceptable solution. This being the
    reason why Apple decided against it in the end.

    It was sunk by reactionary know-it-alls. If anyone had bothered to look
    at the technology - which Apple published openly - they would have seen
    it was pretty elegant and privacy-preserving.

    It still allowed for privacy invasion with false positives - and that's
    the point.

    It could have been a really good tool to protect children, but no,
    people's non-rights were more important.

    Apple's proposal was to match personal photos with *known* CSAM images.
    It would do nothing to detect *new* CSAM images. And it could not
    prevent false positive matches.

    Everyone on this planet should have a right to basic privacy.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to badgolferman on Tue Jul 23 11:09:24 2024
    On 2024-07-23 10:53, badgolferman wrote:
    Jolly Roger wrote:

    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman
    <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation,
    instead pointing The Guardian to a statement it made when
    it shelved the CSAM scanning plan. Apple said it opted
    for a different strategy that “prioritizes the security
    and privacy of [its] users.” The company told Wired in
    August 2022 that "children can be protected without
    companies combing through personal data."

    This is one reason many people choose Apple over
    alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Your problem is you want to invade everyone's privacy regardless of
    whether they are hurting anyone. That's the only way CSAM scanning can
    work, and why Apple wisely withdrew their proposal even though it
    worked harder to preserve privacy than any other solution.

    No, I take exception to your statement that privacy is for *everyone*.
    There are plenty of people in this world who should have their privacy
    and freedom taken away forever. Many of them their lives too.

    The trouble is differentiating between them BEFORE you know who they are.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Tue Jul 23 20:03:46 2024
    Jolly Roger wrote on 23 Jul 2024 17:55:55 GMT :

    Everyone on this planet should have a right to basic privacy.

    I agree with anyone who makes a logically sensible assessment of fact.

    I fully agree with Jolly Roger (and I disagree with badgolferman).

    We should never have to defend our right to privacy.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to badgolferman on Tue Jul 23 19:59:41 2024
    badgolferman wrote on Tue, 23 Jul 2024 17:53:10 -0000 (UTC) :

    Your problem is you want to invade everyone's privacy regardless of
    whether they are hurting anyone. That's the only way CSAM scanning can work, and why Apple wisely withdrew their proposal even though it
    worked harder to preserve privacy than any other solution.

    No, I take exception to your statement that privacy is for *everyone*.
    There are plenty of people in this world who should have their privacy
    and freedom taken away forever. Many of them their lives too.

    Tattoos are for everyone also, but if this unwarranted search & seizure continues unabated we will soon be strip searching everyone at every
    entrance to public transportation for tasteless tattoos & arresting them.

    The point is that what Google/Meta are doing appears to be (as far as we
    know) completely and wholly unwarranted illegal searches & seizures.

    I know this because I know one item that every adult should ask...

    Q: What is the conviction rate of Meta/Google versus that of Apple?
    A: If we don't know that - then everything is complete bullshit.

    This is a logically sensible point of view.

    The news, as reported, is complete bullshit as without that conviction
    rate, it's meaningless that Google & Meta report more tasteless tattoos
    than does Apple.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Tue Jul 23 19:56:57 2024
    Chris wrote on Tue, 23 Jul 2024 11:26:41 -0000 (UTC) :

    Hint: When it comes to privacy, I'm in favor of Apple's strategic decision not to strip search every single person who owns an Apple device.

    But you're happy that Google, Meta et al are reporting millions of images
    to law enforcement? I don't see you criticising them.

    You religious zealots always jump to the conclusion that everyone who is
    not an Apple nutcase must be an Android nutcase - which just isn't true.

    First off, I can't even spell CSAM and I don't even know what it stands
    for, although I'm sure I could look it up if I cared, just to find it's
    perhaps some well-marketed acronym for something related to child porn.

    Having said that I don't even think about CSAM in my daily life, I saw the
    news just like everyone else did - and I applauded Apple because I'm a
    sensibly logical adult who understands more about privacy than most people.

    First off, the numbers are of *reported* child porn, as I read the article (which I only skimmed as it's not any big concern to me) so I call bullshit
    on Google and Meta and whomever it is *reporting* these suspected images.

    I could report my neighbors for sodomy because I hear sounds that I don't recognize and it would be just as valid as these "suspected" images are.

    I could report my neighbors for hiding a body in the basement because I
    heard them shoveling or of burning that body because I smelled a BBQ fire.

    Hell, I could report all the ignorant iKooks for suspected CSAM too.

    What matters in terms of comparing Apple to the likes of Google & Meta is
    the number of comparative convictions - of which we have no useful data.

    Bullshit in. Bullshit out.

    We have no idea of the comparative conviction rate, so for all we know,
    Apple's reports could have resulted in ten times as many convictions as
    Meta's and Google's reports combined. Bullshit in. Bullshit out.

    At this point, I'm completely against unreasonable search and seizure,
    which is why I applaud Apple and denigrate Google & Meta on CSAM'ing.

    Why don't we look for banned books next?
    And evil thoughts too!

    Hell, let's strip search every person to check them for tasteless tattoos.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to badgolferman on Tue Jul 23 22:10:44 2024
    badgolferman wrote on Tue, 23 Jul 2024 20:13:38 -0000 (UTC) :

    Everyone on this planet should have a right to basic privacy.

    I agree with anyone who makes a logically sensible assessment of fact.

    I fully agree with Jolly Roger (and I disagree with badgolferman).

    We should never have to defend our right to privacy.


    The object of my disagreement is not CSAM, it's the idea that everyone has
    the right to privacy. Prisoners, murderers, rapists, child molesters, etc.
    do not deserve even basic privacy. Many of them are let off scot-free for
    stupid technicalities while the families are left with no justice. Many of
    these people have a long history of harming others and should have been
    under surveillance or even locked up long before they hurt another person.

    It's an interesting adult question of who deserves the right to privacy.

    I'm sure in some fundamentalist countries, gay peop... ooops "LGBTQIA+"
    people (according to this lookup https://gaycenter.org/community/lgbtq/)
    have no right to live, let alone their right to privacy.

    Hence, if I think, on a whim, that my neighbor is practicing sodomy, simply because he closed his curtains at night, then certainly, he deserves no
    right to privacy. The police... nay, the entire neighborhood is justified
    in barging onto his property, breaking down his door, busting in his
    bedroom door - on the mere whim of a suspicion he is practicing sodomy.

    Why should our laws allow my neighbor any rights when he's probably having
    anal sex with his lover? He has no right to privacy. None whatsoever.

    Why not?
    Because I don't like what he's doing, that's why.

    Only I get to make the rules.
    Nobody else.

    Of course, I'm making a point about who decides who has rights.
    You've decided entire groups of people have no rights.
    So have fundamentalist societies decided gay people have no rights.

    It's how the world works.
    But we don't have to agree with the decision that gay people have no rights
    and we don't have to agree that lawbreakers have no rights.

    Hell, they're all recidivists, in the way you're stating things.
    So they should never be afforded any of the Constitutional rights.

    This is, of course, absurd.
    The point of who has "rights" is up to the society that they live in.

    While I need not say I'm no lawyer, it's my understanding that, here, in
    the USA, we are *all* presumed innocent until proven guilty - right?

    And, we have in our Constitution the fundamental right to not be subject to unreasonable search & seizure, right? Nor should our property be detained.

    Bearing in mind that for all we know, exactly ZERO people may have been convicted after all those Google, Meta (and yes, Apple) reports, the
    article is clearly bullshit meant to be an unwarranted attack on Apple.

    For now, I'm going to assume, for lack of data, that exactly zero people
    were convicted - which means Google, Meta, and yes, Apple, broke the law.

    Apple just does it far less than Google & Meta did.
    Without the conviction rate - we have no business lambasting Apple.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to badgolferman on Tue Jul 23 23:54:57 2024
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 23 Jul 2024 17:55:55 GMT :

    Everyone on this planet should have a right to basic privacy.

    I agree with anyone who makes a logically sensible assessment of
    fact.

    I fully agree with Jolly Roger (and I disagree with badgolferman).

    We should never have to defend our right to privacy.

    The object of my disagreement is not CSAM, it’s the idea that everyone
    has the right to privacy.

    This entire thread is in the context of CSAM, so that's a cop out.

    Prisoners, murderers, rapists, child molesters, etc.

    Again, in the context of this thread, we're talking about people who
    have never been convicted or suspected of such crimes.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to badgolferman on Tue Jul 23 23:52:26 2024
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman
    <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation,
    instead pointing The Guardian to a statement it made when it
    shelved the CSAM scanning plan. Apple said it opted for a
    different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that
    "children can be protected without companies combing through
    personal data."

    This is one reason many people choose Apple over
    alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Your problem is you want to invade everyone's privacy regardless of
    whether they are hurting anyone. That's the only way CSAM scanning can
    work, and why Apple wisely withdrew their proposal even though it
    worked harder to preserve privacy than any other solution.

    No, I take exception to your statement that privacy is for *everyone*.
    There are plenty of people in this world who should have their privacy
    and freedom taken away forever. Many of them their lives too.

    Not people who haven't been convicted of crimes, no - and that is
    exactly whose privacy CSAM scanning invades.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Tue Jul 23 23:57:40 2024
    On 2024-07-23, Andrew <andrew@spam.net> wrote:
    Chris wrote on Tue, 23 Jul 2024 11:26:41 -0000 (UTC) :

    Hint: When it comes to privacy, I'm in favor of Apple's strategic
    decision not to strip search every single person who owns an Apple
    device.

    But you're happy that Google, Meta et al are reporting millions of
    images to law enforcement? I don't see you criticising them.

    You religious zealots always jump to the conclusion that everyone who
    is not an Apple nutcase must be an Android nutcase - which just isn't
    true.

    Yet what he said is true: We don't see you criticizing other companies
    for it even though they do CSAM scanning and are much more invasive
    about it than Apple proposed.

    First off, I can't even spell CSAM

    Pretty sure you just did.

    and I don't even know what it stands for

    Everyone knows it stands for Child Sexual Abuse Material.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Wed Jul 24 02:10:40 2024
    Jolly Roger wrote on 23 Jul 2024 23:57:40 GMT :

    You religious zealots always jump to the conclusion that everyone who
    is not an Apple nutcase must be an Android nutcase - which just isn't
    true.

    Yet what he said is true: We don't see you criticizing other companies
    for it even though they do CSAM scanning and are much more invasive
    about it than Apple proposed.

    First off, I agree with anyone proposing a sensibly logical opinion.
    One which is based on the facts and which is supported by those facts.

    Hence, I agreed with your statements that what Apple is doing is far better than what Google & Facebook are doing - which is an unreasonable breach of privacy.

    If you think I defend Google, you must truly be nuts.
    And even more so, if you think I defend Facebook, you must be nuts.

    You Apple zealots think everyone is a religious zealot, Jolly Roger.
    For you, it's all about the company and not about the technology.

    For me, hell, I don't like Facebook any more than I don't like Apple.
    Nor Microsoft. Nor Google. I don't defend any of the OEMs to death, JR.

    I don't defend Google to the death any more than I would defend Microsoft
    to the death. I say the truth about all operating systems, Jolly Roger.

    You don't see that because you are here to defend Apple to the death.
    In this case, I agree with you on Apple's record on not reporting images.

    However... what none of us know is the conviction rate of these image
    reports, where for all we know exactly zero people may have been convicted.

    If that's the case, then this whole CSAM crap is a crime against privacy.
    The articles are just clickbait without that critical missing information.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Wed Jul 24 11:04:11 2024
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech
    companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in child safety?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Wed Jul 24 11:11:38 2024
    Chris wrote on Wed, 24 Jul 2024 07:05:03 -0000 (UTC) :

    Everyone on this planet should have a right to basic privacy.

    And they do. That right is not absolute, however. Just like everyone has a right to freedom until they are convicted of a serious crime and are sent
    to prison. Even *suspects* of serious crimes are held in prison before conviction.

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety gained
    is zero.

    No child was protected.
    Everyone lost their privacy.

    Think about that.
    It's basic math.

    In fact, it's irresponsible yellow journalism.
    Because without the conviction rate, the reported images is meaningless.

    Nobody is protected.
    Everyone loses privacy.

    That's the math.

    Otherwise, they would have stated the conviction rate.
    And they clearly did not.

    Do you think they simply forgot about it?
    Hint: They're not that stupid, Chris.

    They saved nobody.
    They harmed everyone.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Wed Jul 24 11:07:46 2024
    Chris wrote on Wed, 24 Jul 2024 06:53:02 -0000 (UTC) :

    First off, I can't even spell CSAM and I don't even know what it stands
    for, although I'm sure I could look it up if I cared, just to find it's
    perhaps some well-marketed acronym for something related to child porn.

    You're depraved if you genuinely believe CSAM is just marketing.

    It's people like you - who don't care about the real harms being done -
    that ruin perfectly good ideas because they have some tiny (probably theoretical) impact on your lives.

    Bullshit.

    Chris. Please put your well-educated adult hat on when you read this.
    Warning: There is math involved in sensibly logical thought processes.

    1. There is NOTHING in the news articles about conviction rates.
    2. Let's say that another way: The articles are complete bullshit!

    The articles are yellow journalism clickbait, Chris.

    Without specifying conviction rates, the number of reported images is completely meaningless, Chris.

    In fact, it could be Apple's conviction rate is far greater than that of
    Google & Facebook for all we know, Chris.

    Since the conviction rate isn't stated, I will assume it is zero.
    Think about that.

    For absolutely zero gain in child safety, all our privacy was compromised.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Wed Jul 24 15:47:08 2024
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:

    True. Unlike others, Apple's proposal was to only scan images on
    device that were being uploaded to Apple's servers, and only match
    hashes of them to a database of hashes matching known CSAM images.
    And only after multiple matches reached a threshold would Apple
    investigate further.

    All correct.

    Yet even with those precautions, there was still a realistic chance
    of false positives

    The rate was deterministic and tunable.

    If the rate was anything other than ZERO, then people's privacy was at
    risk.
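
    To make "deterministic and tunable" concrete, here is a back-of-envelope
    sketch. The per-image false-match probability, the photo count, and the
    threshold below are illustrative assumptions, not Apple's published
    figures; the point is only that raising the threshold drives the
    account-level false-positive probability down without ever making it zero.

        # Probability an innocent account crosses the match threshold,
        # modelling false matches as independent Binomial(n, p) events.
        from math import comb

        p = 1e-6            # assumed per-image false-match probability
        n_photos = 10_000   # assumed photos uploaded by one account
        THRESHOLD = 30      # assumed number of matches before human review

        def prob_at_least(k: int, n: int, p: float) -> float:
            # P(X >= k) for X ~ Binomial(n, p)
            return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i))
                       for i in range(k, n + 1))

        print(prob_at_least(THRESHOLD, n_photos, p))  # tiny, but never zero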

    and invasion of privacy, which is why they scrapped the proposal.

    No.

    Yes.

    They scrapped it because it wasn't worth pursuing. As a business it
    was of no benefit to them and the noisy reaction was enough to put
    them off. There wasn't any "invasion of privacy". At least no more
    than there currently is in the US.

    Incorrect. Apple's statement makes it clear that their decision to scrap
    CSAM scanning was based on the feedback they received from security and
    privacy professionals:

    ---
    “After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our
    investment in the Communication Safety feature that we first made
    available in December 2021,” the company told WIRED in a statement. “We have further decided to not move forward with our previously proposed
    CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working
    with governments, child advocates, and other companies to help protect
    young people, preserve their right to privacy, and make the internet a
    safer place for children and for us all.”
    ---

    For those unaware, the Communication Safety feature is not the same
    thing at all: rather than scanning photos being uploaded to iCloud to
    match against known CSAM photo hashes, Communication Safety for Messages
    is opt-in and analyzes image attachments users send and receive on their devices to determine whether a photo contains nudity.
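
    A rough sketch of that distinction, with made-up names (looks_like_nudity
    and the return strings are stand-ins invented here, not Apple API calls):
    the check is opt-in, runs entirely on the device, warns the user, and
    reports nothing to Apple or to anyone else.

        # Hypothetical sketch of the Communication Safety idea.
        def looks_like_nudity(image_bytes: bytes) -> bool:
            # stand-in for an on-device ML classifier; always False here
            return False

        def receive_attachment(image_bytes: bytes, feature_enabled: bool) -> str:
            # no hash database, no reporting; the recipient just gets a warning
            if feature_enabled and looks_like_nudity(image_bytes):
                return "blurred_with_warning"
            return "shown_normally"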
  • From Jolly Roger@21:1/5 to Andrew on Wed Jul 24 15:48:22 2024
    On 2024-07-24, Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech
    companies are rather than complain at Apple for not sending millions
    of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    It's nowhere near zero.

    Is it worth everyone's loss of privacy for maybe zero gain in child
    safety?

    Your right to privacy shouldn't be violated because someone else might do something wrong.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Wed Jul 24 15:34:16 2024
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:

    Apple's proposal was to match personal photos with *known* CSAM
    images.

    Correct.

    It would do nothing to detect *new* CSAM images.

    Also correct.

    And it could not prevent false positive matches.

    Incorrect.

    Nope. I am correct. It absolutely could not prevent false matches.

    It is designed to avoid false positives, although nothing is 100%
    perfect.

    If it has even 0.1% false matches, then someone's privacy has been
    violated.
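
    For scale, a quick illustrative calculation (the rate and the upload
    volume are assumptions picked for this sketch, not measured figures):
    even a small per-image false-match rate, multiplied across a large photo
    service, produces a large absolute number of wrongly matched images.

        # Illustrative arithmetic only; both numbers are assumptions.
        false_match_rate = 0.001         # 0.1% per image, as in the claim above
        daily_uploads = 1_000_000_000    # assumed photos uploaded per day
        print(false_match_rate * daily_uploads)  # about a million false matches per day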

    Everyone on this planet should have a right to basic privacy.

    And they do.

    Tell that to the people whose private photos are scanned and who are falsely accused of a crime they didn't commit because an imperfect algorithm got
    it wrong.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Wed Jul 24 15:39:57 2024
    On 2024-07-24, Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:05:03 -0000 (UTC) :

    Everyone on this planet should have a right to basic privacy.

    And they do. That right is not absolute, however. Just like everyone
    has a right to freedom until they are convicted of a serious crime
    and are sent to prison. Even *suspects* of serious crimes are held in
    prison before conviction.

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Nah, there are news articles all of the time about shit like this:

    Man Stored Child Pornography on Google Account, Sentenced to 14 Years in Federal Prison <https://www.justice.gov/usao-wdtx/pr/man-stored-child-pornography-google-account-sentenced-14-years-federal-prison>

    But there are also plenty of horrendous stories of false matches:

    Google AI flagged parents’ accounts for potential abuse over nude photos
    of their sick kids <https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation>

    Apple's proposal went further than any other to protect the privacy of
    its customers as well as reduce the possibility of false matches, but it
    still fell short and risked people's privacy, which is why they ended up scrapping it.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Jolly Roger on Wed Jul 24 09:44:02 2024
    On 2024-07-24 08:47, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:

    True. Unlike others, Apple's proposal was to only scan images on
    device that were being uploaded to Apple's servers, and only match
    hashes of them to a database of hashes matching known CSAM images.
    And only after multiple matches reached a threshold would Apple
    investigate further.

    All correct.

    Yet even with those precautions, there was still a realistic chance
    of false positives

    The rate was deterministic and tunable.

    If the rate was anything other than ZERO, then people's privacy was at
    risk.

    By that argument, we must also scrap the traditional system of issuing
    warrants to search people's homes, because there is a non-zero rate of
    warrants issued in error.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Wed Jul 24 17:51:40 2024
    Jolly Roger wrote on 24 Jul 2024 15:39:57 GMT :

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Nah, there are news articles all of the time about shit like this:

    Intelligent people have an understanding of math that you lack, JR.

    Even though I agree with your premise that Apple is doing the right thing,
    it never surprises me how little you zealots understand of basic math.

    What matters is the percentage of reported images which result in a
    conviction - which is glaringly missing from the yellow journalism.

    That glaring omission alone says that the articles are complete bullshit.
    It's not like they didn't know that's the most important metric of all.

    The fact that the most important metric is missing, means the articles are complete bullshit - devoid of enough information to make any assessments.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Wed Jul 24 17:54:41 2024
    Chris wrote on Wed, 24 Jul 2024 17:23:20 -0000 (UTC) :

    Tell that to the people whose private photos are scanned and are falsely
    accused of a crime they didn't commit because an imperfect algorithm got
    it wrong.

    It's already happening. Is it better that 30m images were used to violate
    god knows how many people currently, or a better method where a far tinier
    number is reported, one highly enriched for true positives?

    Chris,

    You can't make that assessment without fabricating the percentage of convictions, which, let's be clear, is the most important metric of all.

    The people who reported all this CSAM bullshit *know* that the percentage
    of convictions is the most important metric without which everything is BS.

    Given they didn't bother to report that metric, we must assume it's 0.

    Hence, as far as we know, out of 30 million images reported, exactly zero convictions resulted - which means every report was a false positive.

    Think about that.

    Everyone was harmed.
    Nobody was protected.

    It's a classic case of pure bullshit.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Wed Jul 24 18:08:41 2024
    Chris wrote on Wed, 24 Jul 2024 17:23:22 -0000 (UTC) :

    Chris. Please put your well-educated adult hat on when you read this.
    Warning: There is math involved in sensibly logical thought processes.

    You present no maths. Just baseless supposition.

    Chris,
    When you earned that PhD, did you never take any logic classes?

    Reporting images isn't the end goal.
    The end goal is convictions.

    Everyone knows that.

    The fact the stories don't report the only metric that matters is a very
    clear indication the CSAM process is a complete sham. It's pure bullshit.

    1. There is NOTHING in the news articles about conviction rates.
    2. Let's say that another way: The articles are complete bullshit!

    The articles are yellow journalism clickbait, Chris.

    Without specifying conviction rates, the number of reported images is
    completely meaningless, Chris.

    In fact, it could be Apple's conviction rate is far greater than that of
    Google & Facebook for all we know, Chris.

    Since the conviction rate isn't stated, I will assume it is zero.
    Think about that.

    You're only making that assumption because it suits your preference and
    bias.

    The only metric that matters is the conviction rate.
    Not the number of images reported.

    Did you ever take logic in college, Chris?

    There might have been thousands of arrests and convictions. Or will be in
    the future as the investigations can take a while.

    Have you ever seen Ukraine videos where the Kadyrovites are firing thousands
    of bullets in those TikTok videos? Nobody was hit. It's all pure bullshit.

    Same here.

    The fact they didn't report convictions means the whole CSAM story is
    bullshit, Chris. It's pure bullshit that has no reason to exist.

    Everyone is hurt.
    Nobody is protected.


    For absolutely zero gain in child safety, all our privacy was compromised.

    Simply a convenience for you to think like that.

    For you to claim that the only thing that matters is the number of images reported and not the number of convictions is not surprising, Chris.

    Maybe you should take a class in logic.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Wed Jul 24 18:03:05 2024
    Jolly Roger wrote on 24 Jul 2024 15:48:22 GMT :

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    It's nowhere near zero.

    It's probably zero given it's the most important metric.
    And it's missing from the story.

    Everyone knows that's the only metric that matters.
    Without convictions, the reporting of CSAM images is meaningless.

    The reason they didn't report it is likely because it's actually zero.
    Which means this whole CSAM reporting issue is complete bullshit.


    Is it worth everyone's loss of privacy for maybe zero gain in child
    safety?

    Your right to privacy shouldn't be violated because someone else might do something wrong.

    I agree with you.

    They get zero convictions.
    Which means nobody is protected.
    And yet, everyone is harmed.

    If they had convictions, they'd have mentioned it.
    The fact they don't mention it, means it's zero.

    Because it isn't likely that they simply forgot the most important fact.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Andrew on Wed Jul 24 11:27:06 2024
    On 2024-07-24 11:08, Andrew wrote:
    Chris wrote on Wed, 24 Jul 2024 17:23:22 -0000 (UTC) :

    Chris. Please put your well-educated adult hat on when you read this.
    Warning: There is math involved in sensibly logical thought processes.

    You present no maths. Just baseless supposition.

    Chris,
    When you earned that PhD, did you never take any logic classes?

    Reporting images isn't the end goal.
    The end goal is convictions.

    Everyone knows that.

    The fact the stories don't report the only metric that matters is a very clear indication the CSAM process is a complete sham. It's pure bullshit.

    1. There is NOTHING in the news articles about conviction rates.
    2. Let's say that another way: The articles are complete bullshit!

    The articles are yellow journalism clickbait, Chris.

    Without specifying conviction rates, the number of reported images is
    completely meaningless, Chris.

    In fact, it could be Apple's conviction rate is far greater than that of Google & Facebook for all we know, Chris.

    Since the conviction rate isn't stated, I will assume it is zero.
    Think about that.

    You're only making that assumption because it suits your preference and
    bias.

    The only metric that matters is the conviction rate.
    Not the number of images reported.

    Did you ever take logic in college, Chris?

    The only metric that matters with battery capacity in smartphones is run
    time.

    Did you never take logic in college, Arlen?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chris on Wed Jul 24 11:26:07 2024
    On 2024-07-24 10:12, Chris wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech
    companies are rather than complain at Apple for not sending millions of
    photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions resulted.
    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in child safety?

    Apple's solution wouldn't have resulted in any additional loss of privacy plus it only affected customers of icloud. Don't like it? Don't use icloud. Simple.


    It's amazing how many people don't understand that private companies
    aren't constrained to avoid violating one's rights in the way that
    governments are.

    If Apple had thought it was a good idea to implement its on-device CSAM checking system, that would have been their prerogative, and the remedy
    that anyone who didn't like it would have had would have been...

    ...not accepting Apple's services!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Wed Jul 24 18:17:09 2024
    Chris wrote on Wed, 24 Jul 2024 17:12:25 -0000 (UTC) :

    It's interesting that you decided to choose (male) homosexuality as your exemplar.

    I also used tasteless tattoos as an example but it's interesting you only
    found the male homosexuality interesting...

    Something that has been oppressed and vilified by the church for centuries primarily because of the distaste (fetishisation even) for the sexual act - sodomy like you mention - for no good reason.

    My point is the whole CSAM thing is bullshit.
    It protects nobody.
    It hurts everyone.

    Curiously lesbianism was rarely so targeted and was often just accepted, if not mentioned.

    Like you say it is purely an opinion and choice which the state has no
    right to have a say when it involves consenting adults in a private place with no harm being done.

    My point is that it's not up to you and me what we want to convict others
    of. It's up to the laws of the country. And in some countries, you can get executed for things people do left and right here in California, Chris.

    Which is my point about CSAM.

    The whole CSAM thing is pure bullshit.
    a. Nobody is protected.
    b. Everyone is harmed.


    You're, maybe unconsciously, making an equivalence here between
    homosexuality and child sexual abuse because like many the focus is on the sex rather than the abuse. Homosexuality is now legal so maybe CSAM is just an opinion, right?

    Again you focus on sexuality because of your own proclivities, Chris.
    I mentioned tasteless tattoos also - but that doesn't interest you.

    Wrong. Children can never consent to the abuse and are very vulnerable so need additional protections by the adults in the room. There is no way we should allow this to happen when we have ways to stop it in a healthy society. Impacting on someone's theoretical privacy - which is unproven -
    is a reasonable balance.

    Given for all any of us know for sure, exactly zero convictions resulted.

    That means everyone was hurt.
    And nobody was protected.

    You need to take a class in logic, Chris.

    The point of who has "rights" is up to the society that they live in.

    Correct. No person's rights are more important than another's. Your right
    to freedom doesn't overrule someone's right to life.

    That was my point. In some societies, women are killed for not covering themselves up, right? It's not up to you or to me. It's up to the society.

    In our society, the whole CSAM thing is nothing but pure bullshit.
    a. Nobody is protected.
    b. Everyone is hurt.


    While I need not say I'm no lawyer, it's my understanding that, here, in
    the USA, we are *all* presumed innocent until proven guilty - right?

    In theory in the US. Not always in practice.

    The whole CSAM bullshit violates the Constitution in my humble opinion; but
    I will caveat that statement by making it very clear I am not a lawyer.

    My point is I applaud that Apple didn't fall for the CSAM bullshit.
    I lament that Google & Facebook did.

    And, we have in our Constitution the fundamental right to not be subject to unreasonable search & seizure, right? Nor should our property be detained.

    Key word here is "unreasonable". All your examples have clearly been unreasonable so very simplistic to defend.

    It's unreasonable that a person takes a photo of a baby and they get
    reported by the likes of Google & Facebook for something that innocent.

    We all know that the law allows for people's rights to be suspended even constitutional ones. Companies have an obligation to uphold the law.

    In the USA, we have basic rights that most countries don't afford their citizens. One of those rights is to not be subject to false imprisonment.

    The whole CSAM thing violates the US Constitution, in my humble opinion. However, I will state very clearly that I am not a lawyer.

    Bearing in mind that for all we know, exactly ZERO people may have been
    convicted after all those Google, Meta (and yes, Apple) reports, the
    article is clearly bullshit meant to be an unwarranted attack on Apple.

    For now, I'm going to assume, for lack of data, that exactly zero people
    were convicted - which means Google, Meta, and yes, Apple, broke the law.

    Which law, exactly?

    Without any convictions being reported as a result of those reported
    images, we have to assume, a priori, that the conviction rate is zero.

    No other assessment is possible given it's the most important metric.
    And given that the most important metric was omitted from the reports.

    That means it's far more likely to be zero convictions, Chris.
    It's basic logic.

    Apple just does it far less than Google & Meta did.
    Without the conviction rate - we have no business lambasting Apple.

    Right. So where are your multitude of posts attacking google on the android forum?

    Have you ever heard me say anything good about Google, Chris?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Wed Jul 24 21:29:07 2024
    On 2024-07-24, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 24 Jul 2024 15:39:57 GMT :

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Nah, there are news articles all of the time about shit like this:

    Intelligent people have an understanding of math that you lack, JR.

    Intelligent people know what "absolutely zero" means, little Arlen.

    What matters is the percentage

    No, words have meanings, and zero means zero. And there is a
    higher-than-zero number of pedophiles who have been caught due to CSAM scanning. Unfortunately, there is also a higher-than-zero number of
    innocent people whose privacy was violated in the process.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Wed Jul 24 21:30:33 2024
    On 2024-07-24, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-24 08:47, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:

    True. Unlike others, Apple's proposal was to only scan images on
    device that were being uploaded to Apple's servers, and only match
    hashes of them to a database of hashes matching known CSAM images.
    And only after multiple matches reached a threshold would Apple
    investigate further.

    All correct.

    Yet even with those precautions, there was still a realistic chance
    of false positives

    The rate was deterministic and tunable.

    If the rate was anything other than ZERO, then people's privacy was
    at risk.
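
    For what it's worth, the mechanism being argued about is easy to sketch.
    Below is a minimal, hypothetical Python illustration of thresholded hash
    matching - NOT Apple's actual NeuralHash / private-set-intersection
    design - and every number in it (the hash values, the photo count, the
    per-image false-match rate, the 30-match threshold) is an assumption made
    up purely for illustration:

        # Hypothetical sketch only -- not Apple's published design or parameters.
        from math import exp, factorial

        KNOWN_HASHES = {"deadbeef", "cafef00d"}  # stand-in for a database of known-image hashes
        MATCH_THRESHOLD = 30                     # matches required before any human review

        def count_matches(upload_hashes):
            """How many of an account's uploaded-image hashes appear in the database."""
            return sum(1 for h in upload_hashes if h in KNOWN_HASHES)

        def flag_for_review(upload_hashes):
            """Flag an account only once its match count reaches the threshold."""
            return count_matches(upload_hashes) >= MATCH_THRESHOLD

        def false_flag_probability(n_images, per_image_fp, threshold=MATCH_THRESHOLD):
            """Approximate chance an innocent account is flagged: the probability of
            at least `threshold` false matches among n_images uploads, using the
            dominant term of the Poisson tail (valid when the rate is tiny)."""
            lam = n_images * per_image_fp  # expected number of false matches
            return exp(-lam) * lam ** threshold / factorial(threshold)

        # e.g. 10,000 photos and an assumed 1-in-a-million per-image false-match
        # rate give roughly 4e-93 - vanishingly small, and adjustable by moving
        # the threshold, which is the sense in which the rate is "tunable".
        print(false_flag_probability(10_000, 1e-6))

    Tunable is still not zero, though - which is the point.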

    By that argument, we must also scrap the traditional system of issuing warrants to search people's homes, because there is a non-zero rate of warrants issued in error.

    Nah. Most warrants meet probable cause standards before a judge will
    sign them. CSAM scanning requires no such due process. They are nowhere
    near the same thing.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Wed Jul 24 21:35:02 2024
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech
    companies are rather than complain at Apple for not sending millions
    of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in child
    safety?

    Apple's solution wouldn't have resulted in any additional loss of
    privacy

    Actually, Apple could not guarantee that, and there was a non-zero
    chance that false positive matches would result in privacy violations.

    plus it only affected customers of icloud. Don't like it? Don't use
    icloud. Simple.

    That much is true. Only images uploaded to iCloud would have been
    examined by the algorithm.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Wed Jul 24 21:33:12 2024
    On 2024-07-24, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 24 Jul 2024 15:48:22 GMT :

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    It's nowhere near zero.

    It's probably zero given it's the most important metric.

    It's not zero. Not even close.

    Without convictions, the reporting of CSAM images is meaningless.

    There have been plenty of convictions.

    The reason they didn't report it is likely because it's actually zero.

    No.

    Is it worth everyone's loss of privacy for maybe zero gain in child
    safety?

    Your right to privacy shouldn't be violated because someone else
    might do something wrong.

    I agree with you.

    They get zero convictions.

    That's a lie.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 13:07:04 2024
    Jolly Roger wrote on 24 Jul 2024 21:33:12 GMT :

    It's probably zero given it's the most important metric.

    It's not zero. Not even close.

    You're simply guessing. I'm using logic.
    They're different logical algorithms.

    Without convictions, the reporting of CSAM images is meaningless.

    There have been plenty of convictions.

    Logically, if they had appreciable convictions, they'd have mentioned it.
    The fact they don't bother to mention it means it's probably almost zero.

    Because it isn't likely that they simply forgot the only fact that matters.

    The reason they didn't report it is likely because it's actually zero.

    No.

    It's the only fact that matters.
    And they forgot it?

    No. They're not that stupid. They're bullshitting us on this CSAM garbage.

    Is it worth everyone's loss of privacy for maybe zero gain in child
    safety?

    Your right to privacy shouldn't be violated because someone else
    might do something wrong.

    I agree with you.

    They get zero convictions.

    That's a lie.

    I realize you guess at everything in life, Jolly Roger.
    I use logic.

    Logically thinking, the only metric that matters is convictions.
    a. It doesn't matter how many people or images are reported.
    b. It matters how many (or what percentage) are convicted.
    c. If they "forgot" to report that metric, most likely it's almost zero.
    d. Which means nobody is protected by CSAM while everyone is harmed.

    Anyone can prove me wrong by showing the CSAM reporting conviction rates.
    But they can't.

    That's likely because the conviction rates are almost zero.
    If you hate logic, then prove that logic wrong with facts, not guesses.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 13:54:47 2024
    Jolly Roger wrote on 24 Jul 2024 21:30:33 GMT :

    By that argument, we must also scrap the traditional system of issuing
    warrants to search people's homes, because there is a non-zero rate of
    warrants issued in error.

    Nah. Most warrants meet probable cause standards before a judge will
    sign them. CSAM scanning requires no such due process. They are nowhere
    near the same thing.

    When I assessed that Alan Baker's IQ was the lowest of all the low-IQ Apple zealots, I didn't expect Jolly Roger to prove me right on that assessment.

    For Alan Baker to compare a warrant which is issued by a judge only after probable cause has been established on the public record (which the
    defendant has every right to object to in court, which would overturn his conviction) to CSAM scanning of every possible image uploaded, is absurd.

    That absurd thought process shows Alan Baker's IQ can't be above around 40.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 14:04:03 2024
    XPost: alt.privacy

    Jolly Roger wrote on 24 Jul 2024 21:29:07 GMT :

    What matters is the percentage

    No, words have meanings, and zero means zero. And there is a
    higher-than-zero number of pedophiles who have been caught due to CSAM scanning. Unfortunately, there is also a higher-than-zero number of
    innocent people whose privacy was violated in the process.

    While I applaud the fact that Apple didn't report nearly the sheer numbers of images
    that Google & Facebook did, the reports were extremely clear about their click-bait, blame-Apple, yellow-journalism objective, were they not?

    The reports were attempting to convince us that Apple isn't doing its job
    in "protecting children" that Google & Facebook were doing, were they not?

    And yet, it could be Apple's conviction rate was 99% while Google &
    Facebook had 1% conviction rates - which would mean that Apple is
    "protecting children" far more than Google/FB did, isn't that right?

    You see, I know logic. I also know bullshit.
    Without the conviction rate, simply comparing images reported is bullshit.
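
    To make that arithmetic concrete, here's a toy calculation. The report
    counts are the ones from the article; the conviction rates are invented
    purely for illustration, since the real ones were never published:

        # Toy arithmetic only - report counts from the article, conviction rates invented.
        reports = {"Apple": 267, "Google": 1_470_000}
        assumed_conviction_rate = {"Apple": 0.50, "Google": 0.00005}

        for company, n in reports.items():
            convictions = n * assumed_conviction_rate[company]
            print(f"{company}: {n:>9,} reports -> ~{convictions:.0f} convictions")

        # Apple:       267 reports -> ~134 convictions
        # Google: 1,470,000 reports -> ~74 convictions

    Under those made-up rates, the platform with roughly 5,500 times as many
    reports produces fewer convictions - which is exactly why the raw report
    counts prove nothing by themselves.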

    They're not stupid.
    They know this.

    The fact they didn't report the only metric that matters means something.

    Most likely, it means they know what wasn't reported - which is that the conviction rate on all these invasions of privacy is probably nearly 0.

    The result?
    a. Nobody is protected.
    b. Everyone is hurt.

    Prove me wrong with conviction rates for images reported.
    Nothing else has any meaning.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 13:50:36 2024
    XPost: alt.privacy

    Jolly Roger wrote on 24 Jul 2024 21:35:02 GMT :

    Apple's solution wouldn't have resulted in any additional loss of
    privacy

    Actually, Apple could not guarantee that, and there was a non-zero
    chance that false positive matches would result in privacy violations.

    plus it only affected customers of icloud. Don't like it? Don't use
    icloud. Simple.

    That much is true. Only images uploaded to iCloud would have been
    examined by the algorithm.

    While I fully agree with what Apple is doing compared to Google/FB...
    I'm going to see if you guys can work out the basic logic involved, OK?

    1. The articles clearly were lambasting Apple, right?
    2. They were saying Apple underreports CSAM, right?
    3. To do that, they reported CSAM numbers between Apple & others, right?

    Guess what.

    The number of reports is NOT a meaningful metric without the percentage of those reports that result in convictions. That's just basic logic, right?

    The fact they "forgot" to show the most meaningful metric, while they were clearly desperate to show that Apple underreports the CSAM numbers, is a
    clue by four that they are bullshitting us.

    They're not that stupid.

    They KNOW if they reported the conviction rate, their argument would fall
    flat - so that's likely why they conveniently forgot about the only metric
    that matters.

    In fact, it could be Apple's conviction rate is 99% (for all we know),
    while Google's conviction rate could be 50% & Facebook's 99%.

    Without knowing the conviction rate, the reported numbers are meaningless. Since they know that (they're not stupid), they're likely bullshitting us.

    If you don't like logic, simply prove me wrong with the conviction rates.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Thu Jul 25 15:38:54 2024
    On 2024-07-25, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 24 Jul 2024 21:30:33 GMT :

    By that argument, we must also scrap the traditional system of
    issuing warrants to search people's homes, because there is a
    non-zero rate of warrants issued in error.

    Nah. Most warrants meet probable cause standards before a judge will
    sign them. CSAM scanning requires no such due process. They are
    nowhere near the same thing.

    When I assessed that Alan Baker's IQ was the lowest of all the low-IQ
    Apple zealots, I didn't expect Jolly Roger to prove me right on that assessment.

    For Alan Baker to compare a warrant which is issued by a judge only
    after probable cause has been established on the public record (which
    the defendant has every right to object to in court, which would
    overturn his conviction) to CSAM scanning of every possible image
    uploaded, is absurd.

    That absurd thought process shows Alan Baker's IQ can't be above
    around 40.

    Your obsession with meaningless IQ numbers says way more about you than
    anyone else, little Arlen. And your continual drive to insult anyone who disagrees with you says more about you as well. You're a weak-minded
    juvenile troll in a man's body.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Thu Jul 25 15:36:48 2024
    On 2024-07-25, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 24 Jul 2024 21:33:12 GMT :

    It's probably zero given it's the most important metric.

    It's not zero. Not even close.

    You're simply guessing. I'm using logic.

    Again, little Arlen, you're projecting. You claimed absolutely zero
    people have been convicted, which is not the case. This isn't a guess:

    Man Stored Child Pornography on Google Account, Sentenced to 14 Years in Federal Prison <https://www.justice.gov/usao-wdtx/pr/man-stored-child-pornography-google-account-sentenced-14-years-federal-prison>

    They're different logical algorithms.

    No, you're just delusional. It's not zero.

    Without convictions, the reporting of CSAM images is meaningless.

    There have been plenty of convictions.

    Logically, if they had appreciable convictions, they'd have mentioned
    it. The fact they don't bother to mention it, means it's probably
    almost zero.

    Now you're desperately trying to move the goal post from your original
    claim of "absolutely zero". Not happening, and the fact that some
    nebulous "they" didn't happen to mention the number of CSAM convictions
    in one particular instance means nothing. You're goal post shifting and deflecting, as usual.

    Because it isn't likely that they simply forgot the only fact that
    matters.

    Your claim that the only fact that matters is the number of convictions
    is dubious at best. I contend that what matters most is that innocent
    people's privacy may be violated. Things like this shouldn't happen, as
    it's a clear violation of the children's and their parents' privacy:

    Google AI flagged parents’ accounts for potential abuse over nude photos
    of their sick kids <https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation>

    The reason they didn't report it is likely because it's actually
    zero.

    No.

    It's the only fact that matters. And they forgot it?

    The only fact that matters to *you*. You don't get to make that claim
    for the rest of us, and certainly not the person who wrote the article.

    No. They're not that stupid. They're bullshitting us on this CSAM
    garbage.

    The only bullshitter I see here is you.

    Is it worth everyone's loss of privacy for maybe zero gain in
    child safety?

    Your right to privacy shouldn't be violated because someone else
    might do something wrong.

    I agree with you.

    They get zero convictions.

    That's a lie.

    I realize you guess at everything in life, Jolly Roger.

    You're projecting. You claimed the number was absolutely zero. And I've provided evidence that the number of convictions is greater than zero,
    little Arlen.

    I use logic.

    No, you just lie.

    Logically thinking, the only metric that matters is convictions.

    You don't get to make that claim.

    And actually, the only metric that matters is the number of innocent
    people whose privacy may be violated. If that number is greater than
    zero, then you can count me (and a whole lot of others who reserve their
    right to privacy) out.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Thu Jul 25 15:41:59 2024
    XPost: alt.privacy

    On 2024-07-25, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 24 Jul 2024 21:29:07 GMT :

    What matters is the percentage

    No, words have meanings, and zero means zero. And there is a
    higher-than-zero number of pedophiles who have been caught due to
    CSAM scanning. Unfortunately, there is also a higher-than-zero number
    of innocent people whose privacy was violated in the process.

    While I support blah blah blah

    Nothing you can say will change the fact that a greater-than-zero number
    of people have been convicted from CSAM scanning - just like nothing you
    can say will convince me that CSAM scanning can be done without
    violating the privacy of innocent people. Things like this should not
    happen:

    Google AI flagged parents’ accounts for potential abuse over nude photos
    of their sick kids <https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation>

    And while Apple did their best to prevent such things from happening
    with their proposal, they could not guarantee it would not happen, which
    is why they ended up scrapping the proposal.

    Nothing else has any meaning.

    Nah.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Running Man@21:1/5 to YourName@YourISP.com on Thu Jul 25 15:52:29 2024
    On 23/07/2024 05:35 Your Name <YourName@YourISP.com> wrote:
    badgolferman wrote:
    [...article text snipped...]


    Apple had a system ready to go, but a pile of brainless morons
    complained about their "privacy" being invaded (which it wasn't), so
    Apple was forced to abandon it. All this report achieves is to
    acknowledge that all those other companies listed above are less
    stringent about their users' privacy.

    Many people threatened to throw away or even burn their iPhones if
    Apple went ahead with the scheme. People don't want their actions
    policed on THEIR phones.

    This really scared Apple and they immediately did a 180 degree turn.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Running Man@21:1/5 to ithinkiam@gmail.com on Thu Jul 25 16:01:57 2024
    On 23/07/2024 18:27 Chris <ithinkiam@gmail.com> wrote:
    Jörg Lorenz <hugybear@gmx.net> wrote:
    Am 23.07.24 um 13:31 schrieb Chris:
    After being a bit skeptical of Apple's solution, I realised it was a pretty >>> good and pragmatic balance between respecting people's privacy and
    protecting vulnerable people. I was disappointed that the angry "muh
    freedom" brigade scuppered it.

    It was neither a good nor an acceptable solution. This being the reason
    why Apple decided against it in the end.

    It was sunk by reactionary know-it-alls. If anyone had bothered to look at the technology - which Apple published openly - they would have seen it was pretty elegant and privacy-preserving.

    It could have been a really good tool to protect children, but no, people's non-rights were more important.


    The reason why more and more tech companies are switching to E2EE is
    exactly to stave off pressure from SIGs (special interest groups) who believe protecting children is more important than everyone's right to privacy.

    Haven't you noticed that no one's complaining about the CSAM on WhatsApp
    or Mega (an E2EE file storage service like Dropbox)? Because no one can see
    its contents!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Running Man@21:1/5 to ithinkiam@gmail.com on Thu Jul 25 16:04:34 2024
    On 23/07/2024 18:32 Chris <ithinkiam@gmail.com> wrote:
    badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:

    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it shelved
    the CSAM scanning plan. Apple said it opted for a different
    strategy that “prioritizes the security and privacy of [its]
    users.” The company told Wired in August 2022 that "children
    can be protected without companies combing through personal
    data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Agreed.

    No-one is forced to use icloud. If they didn't like the policy, they could
    go elsewhere. Like to google and meta, who are more than happy to share millions of people's private photos with law enforcement, which apparently
    is just fine.


    I would advise people to switch to Mega, an E2EE file storage service, and be done
    with it. No more nagging about CSAM reporting statistics.

    https://mega.io

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Running Man@21:1/5 to jollyroger@pobox.com on Thu Jul 25 16:07:12 2024
    On 23/07/2024 19:49 Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it shelved the
    CSAM scanning plan. Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected without companies combing through personal data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Your problem is you want to invade everyone's privacy regardless of
    whether they are hurting anyone. That's the only way CSAM scanning can
    work, and why Apple wisely withdrew their proposal even though it worked harder to preserve privacy than any other solution.


    You're wrong. Apple doesn't give a rat's ass about anyone's privacy. They do, however,
    care about their profits, and when people started threatening to leave their platform if this
    client-scanning scheme went through, they started to listen.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Running Man@21:1/5 to REMOVETHISbadgolferman@gmail.com on Thu Jul 25 16:13:59 2024
    On 23/07/2024 19:53 "badgolferman" <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:

    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
    wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman
    <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation,
    instead pointing The Guardian to a statement it made when
    it shelved the CSAM scanning plan. Apple said it opted
    for a different strategy that “prioritizes the security
    and privacy of [its] users.” The company told Wired in
    August 2022 that "children can be protected without
    companies combing through personal data."

    This is one reason many people choose Apple over
    alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Your problem is you want to invade everyone's privacy regardless of
    whether they are hurting anyone. That's the only way CSAM scanning can work, and why Apple wisely withdrew their proposal even though it
    worked harder to preserve privacy than any other solution.

    No, I take exception to your statement that privacy is for *everyone*.
    There are plenty of people in this world who should have their privacy
    and freedom taken away forever. Many of them their lives too.

    You can scream all you want, but the majority of consumers is not going
    to allow this mass invasion of their privacy, no matter how many children
    get abused.

    We've already given up too much of our privacy. But client-side scanning is where people draw the line. Users expect to be able to do
    whatever they want on their OWN computers without the operating system
    ratting them out!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Running Man@21:1/5 to ithinkiam@gmail.com on Thu Jul 25 15:55:37 2024
    On 23/07/2024 13:31 Chris <ithinkiam@gmail.com> wrote:
    Are the tech companies simply swamping
    authorities with data? How does anyone cope with 30m reports from meta?


    They are and authorities mostly ignore these reports. Sometimes
    they are added to cases as additional evidence. Nothing more.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From =?UTF-8?Q?J=C3=B6rg_Lorenz?=@21:1/5 to All on Thu Jul 25 18:08:59 2024
    Am 25.07.24 um 18:04 schrieb The Running Man:

    Are you really not aware that E-S does not allow attachments?

    --
    "Gutta cavat lapidem." (Ovid)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From =?UTF-8?Q?J=C3=B6rg_Lorenz?=@21:1/5 to All on Thu Jul 25 18:00:00 2024
    Am 25.07.24 um 17:55 schrieb The Running Man:

    No text but an attachment?
    You look like a bloody beginner ... *SCNR*

    --
    "Gutta cavat lapidem." (Ovid)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Running Man@21:1/5 to jollyroger@pobox.com on Thu Jul 25 16:18:15 2024
    On 24/07/2024 01:54 Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 23 Jul 2024 17:55:55 GMT :

    Everyone on this planet should have a right to basic privacy.

    I agree with anyone who makes a logically sensible assessment of
    fact.

    I fully agree with Jolly Roger (and I disagree with badgolferman).

    We should never have to defend our right to privacy.

    The object of my disagreement is not CSAM, it’s the idea that everyone
    has the right to privacy.

    This entire thread is in the context of CSAM, so that's a cop out.

    Prisoners, murderers, rapists, child molesters, etc.

    Again, in the context of this thread, we're talking about people who
    have never been convicted or suspected of such crimes.


    Don't you see the numbskull refuses to make the distinction? Don't
    feed the trolls.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to The Running Man on Thu Jul 25 18:08:57 2024
    On 2024-07-25, The Running Man <running_man@writeable.com> wrote:
    On 23/07/2024 19:49 Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Jolly Roger wrote:
    On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:

    Apple declined to comment on the NSPCC's accusation, instead
    pointing The Guardian to a statement it made when it shelved the CSAM scanning plan. Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected without companies combing through personal data."

    This is one reason many people choose Apple over alternatives.

    iPhone. The preferred mobile device of child molesters.

    This could be a new marketing ploy someday!

    Privacy for everyone is important.

    Sorry, I can't agree with that. Some people give up their right to
    privacy when they harm others or society. The laws are there for
    everyone, not just those who choose to follow them.

    Your problem is you want to invade everyone's privacy regardless of
    whether they are hurting anyone. That's the only way CSAM scanning can
    work, and why Apple wisely withdrew their proposal even though it worked
    harder to preserve privacy than any other solution.

    You're wrong. Apple doesn't give a rat's ass about anyone's privacy.

    Wrong. Their customers care, so they care. It's better for their bottom
    line to care, so they do.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Thu Jul 25 19:22:24 2024
    On 2024-07-25, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 24 Jul 2024 21:33:12 GMT :

    It's probably zero given it's the most important metric.

    It's not zero. Not even close.

    You're simply guessing. I'm using logic. They're different logical
    algorithms.

    Nope. You're guessing just as much as JR.

    The facts - and we know you like them, but never look for them - are
    that there are many convictions on a depressingly regular basis. Just
    look at the press releases from the DoJ Project Safe Childhood: https://www.justice.gov/psc/press-room

    My statements aren't guesses. There have been plenty of convictions.

    Unfortunately, there have also been privacy violations of innocent
    people. And that's my primary concern when it comes to CSAM scanning.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 20:42:14 2024
    XPost: alt.privacy

    Jolly Roger wrote on 25 Jul 2024 15:41:59 GMT :

    Nothing you can say will change the fact that a greater-than-zero number
    of people have been convicted from CSAM scanning - just like nothing you
    can say will convince me that CSAM scanning can be done without
    violating the privacy of innocent people. Things like this should not
    happen:

    I'm not disagreeing with you, Jolly Roger.
    The CSAM scanning is a violation of privacy.

    The question isn't that it's a violation of privacy.
    The question is whether it's worth that violation of privacy.

    For that assessment, we need to know what the conviction rate is.
    Why do YOU think the reports all left out the most important fact?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 20:47:40 2024
    Jolly Roger wrote on 25 Jul 2024 15:36:48 GMT :

    And actually, the only metric that matters is the number of innocent
    people whose privacy may be violated. If that number is greater than
    zero, then you can count me (and a whole lot of others who reserve their right to privacy) out.

    I'm agreeing with you that millions, and maybe even billions of innocent
    people are being violated by CSAM scanning by Apple, Google & Facebook.

    Apple is harming us less than Google & Facebook, but harm is still harm.

    In addition, I'm saying that the gain can only be measured in the
    percentage or number of convictions - which is detail we don't know.

    Nobody is so stupid as to not know that's the most important fact.
    And the fact they left it out of the reports - tells us something.

    Why do YOU think they left out the most important fact, Jolly Roger?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Your Name@21:1/5 to The Running Man on Fri Jul 26 08:27:21 2024
    On 2024-07-25 15:52:29 +0000, The Running Man said:
    On 23/07/2024 05:35 Your Name <YourName@YourISP.com> wrote:

    badgolferman wrote:

    [...article text snipped...]

    Apple had a system ready to go, but a pile of brainless morons
    complained about their "privacy" being invaded (which it wasn't), so
    Apple was forced to abandon it. All this report achieves is to
    acknowledge that all those other companies listed above are less
    stringent about their users' privacy.

    Many people threatened to throw away or even burn their iPhones if
    Apple went ahead with the scheme. People don't want their actions
    policed on THEIR phones.

    This really scared Apple and they immediately did a 180 degree turn.


    The reality is that most users didn't even know about the plans and
    wouldn't have cared even if they did know. As usual, it was a minority
    of loud-mouthed whiners who complained because they were scared someone
    at Apple would see that they've been taking photos of cute wabbits ...
    which wasn't even possible if the whiners actually understood anything.
    :-\

    Nobody anywhere gives a flying crap what you do with your phone ...
    unless it's illegal, in which case you deserve to get caught. PLAIN
    AND SIMPLE!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 20:52:48 2024
    XPost: alt.privacy

    Jolly Roger wrote on 25 Jul 2024 19:22:24 GMT :

    You're simply guessing. I'm using logic. They're different logical
    algorithms.

    Nope. You're guessing just as much as JR.

    The facts - and we know you like them, but never look for them - are
    that there are many convictions on a depressingly regular basis. Just
    look at the press releases from the DoJ Project Safe Childhood:
    https://www.justice.gov/psc/press-room

    My statements aren't guesses. There have been plenty of convictions.

    Unfortunately, there have also been privacy violations of innocent
    people. And that's my primary concern when it comes to CSAM scanning.

    Chris just lied about convictions.
    Why did Chris lie?

    I don't know why Chris lied.
    All I know is that Chris lied.

    I suspect Chris felt the need to lie because he had no valid point.
    The cite Chris listed said NOTHING whatsoever about the conviction rate.

    It's like me saying people in Alaska are all Eskimos and when Chris asks me
    to prove it I point to a long listing of Italians in Florida instead.

    It's revealing that Chris has to resort to lies to defend his assessment.
    It means that Chris' assessment that CSAM scanning is effective - is false.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Thu Jul 25 20:54:18 2024
    Jolly Roger wrote on 25 Jul 2024 15:38:54 GMT :

    That absurd thought process shows Alan Baker's IQ can't be above
    around 40.

    Your obsession with meaningless IQ numbers says way more about you than anyone else, little Arlen. And your continual drive to insult anyone who disagrees with you says more about you as well. You're a weak-minded
    juvenile troll in a man's body.

    The only reason the IQ of the strange religious zealots matters is that a low IQ is partly why you zealots believe things that have no basis in fact.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Thu Jul 25 21:07:20 2024
    XPost: alt.privacy

    On 2024-07-25, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 25 Jul 2024 15:41:59 GMT :

    Nothing you can say will change the fact that a greater-than-zero number
    of people have been convicted from CSAM scanning - just like nothing you
    can say will convince me that CSAM scanning can be done without
    violating the privacy of innocent people. Things like this should not
    happen:

    I'm not disagreeing with you, Jolly Roger.
    The CSAM scanning is a violating of privacy.

    The question isn't that it's a violation of privacy.
    The question is whether it's worth that violation of privacy.

    Ask the innocent people who have their privacy violated whether it was
    worth it, and see what they tell you.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Thu Jul 25 21:12:36 2024
    XPost: alt.privacy

    On 2024-07-25, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 25 Jul 2024 19:22:24 GMT :

    You're simply guessing. I'm using logic. They're different logical
    algorithms.

    Nope. You're guessing just as much as JR.

    The facts - and we know you like them, but never look for them - are
    that there are many convictions on a depressingly regular basis.
    Just look at the press releases from the DoJ Project Safe Childhood:
    https://www.justice.gov/psc/press-room

    My statements aren't guesses. There have been plenty of convictions.

    Unfortunately, there have also been privacy violations of innocent
    people. And that's my primary concern when it comes to CSAM scanning.

    The cite Chris listed said NOTHING whatsoever about the conviction
    rate.

    It certainly shows the conviction rate is higher than your claimed
    "absolute zero". So it seems it was you who lied first. I realize you
    want us to ignore this, but your focus on who lied demands we recognize
    it.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Thu Jul 25 21:13:27 2024
    On 2024-07-25, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 25 Jul 2024 15:38:54 GMT :

    That absurd thought process shows Alan Baker's IQ can't be above
    around 40.

    Your obsession with meaningless IQ numbers says way more about you
    than anyone else, little Arlen. And your continual drive to insult
    anyone who disagrees with you says more about you as well. You're a
    weak-minded juvenile troll in a man's body.

    The only reason the IQ of the strange religious zealots matters is
    that the low-IQ is partly why you zealots believe things that have no
    basis in fact.

    *YAWN* You're a hopeless broken record, little Arlen.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Andrew on Thu Jul 25 14:37:55 2024
    On 2024-07-25 13:47, Andrew wrote:
    Jolly Roger wrote on 25 Jul 2024 15:36:48 GMT :

    And actually, the only metric that matters is the number of innocent
    people whose privacy may be violated. If that number is greater than
    zero, then you can count me (and a whole lot of others who reserve their
    right to privacy) out.

    I'm agreeing with you that millions, and maybe even billions of innocent people are being violated by CSAM scanning by Apple, Google & Facebook.

    Apple is harming us less than Google & Facebook, but harm is still harm.

    Apple isn't doing CSAM scanning, doofus.

    After lots of (unwarranted) criticism, they cancelled their plan to scan
    images on-device against known CSAM images, and iCloud photo storage can
    be encrypted so that even Apple can't decrypt them.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Fri Jul 26 02:25:16 2024
    XPost: alt.privacy

    Jolly Roger wrote on 25 Jul 2024 21:12:36 GMT :

    The cite Chris listed said NOTHING whatsoever about the conviction
    rate.

    It certainly shows the conviction rate is higher than your claimed
    "absolute zero". So it seems it was you who lied first. I realize you
    want us to ignore this, but your focus on who lied demands we recognize
    it.

    Oh Jeez. This is exactly how I know you zealots lack a normal IQ.
    Think about what was actually said and what you & Chris lied about.

    Think about it. Please. Look at your own cites, Jolly Roger. L@@K!
    That you both resort to lies means you lack the ability to reason.

    You both fabricated imaginary convictions based on the CSAM image reporting by Facebook, Google, and Apple - and yet - your cites say nothing of the sort.

    You lied.
    You fabricated evidence that does not exist in your cite.

    Why?

    I don't know if your low IQ is the problem or if you are simply lying, JR.
    Your cite says NOTHING about how the evidence was obtained.

    Yet, you lied by fabricating that the cite says Apple, Google & Facebook provided the evidence from which those convictions were obtained.

    Do you think I'm stupid, Jolly Roger?
    I see right through your lies and those of Chris.

    The fact you resort to lies shows that you can't back up your point.

    Show me evidence of the conviction rate for people who were reported to

    Just showing convictions is completely meaningless to this point.
    The fact you can't comprehend that is an indication of low intelligence.

    Stop fabricating evidence that doesn't exist in your cites please.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Fri Jul 26 02:27:48 2024
    Jolly Roger wrote on 25 Jul 2024 21:13:27 GMT :

    That absurd thought process shows Alan Baker's IQ can't be above
    around 40.

    Your obsession with meaningless IQ numbers says way more about you
    than anyone else, little Arlen. And your continual drive to insult
    anyone who disagrees with you says more about you as well. You're a
    weak-minded juvenile troll in a man's body.

    The only reason the IQ of the strange religious zealots matters is
    that the low-IQ is partly why you zealots believe things that have no
    basis in fact.

    *YAWN* You're a hopeless broken record, little Arlen.

    Did you ever wonder why *all* you Apple zealots lack formal education?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Fri Jul 26 02:33:01 2024
    XPost: alt.privacy

    Jolly Roger wrote on 25 Jul 2024 21:07:20 GMT :

    The question isn't that it's a violation of privacy.
    The question is whether it's worth that violation of privacy.

    Ask the innocent people who have their privacy violated whether it was
    worth it, and see what they tell you.

    Well, everyone who has photos that Google, Apple and Facebook can 'see'
    has been violated, and as far as we know, there have been zero convictions based on those reports by Apple, Google and Facebook.

    Since you lack a normal IQ, which is why you don't have a formal education,
    you need to understand that I didn't say people aren't convicted, JR.

    I said there's no evidence presented that the reports by Apple, Google and Facebook resulted in any convictions (so we have to assume it's zero).

    The problem here, as any intelligent adult can instantly comprehend, is this: if
    the conviction rate based on the reports is 1%, is it worth the privacy violation of millions, nay, billions of people by Apple/Google/FB?

    At what percent conviction rate is the loss in privacy worth it?
    1% ?
    2% ?
    3% ?
    5% ?
    8% ?

    Note the fact the articles blamed Apple and yet they omitted the most
    important metric is a hint that they know the conviction rate is dismal.

    Because nobody is that stupid to not ask what the conviction rate is.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Fri Jul 26 16:08:29 2024
    XPost: alt.privacy

    On 2024-07-26, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 25 Jul 2024 21:07:20 GMT :

    The question isn't that it's a violation of privacy. The question
    is whether it's worth that violation of privacy.

    Ask the innocent people who have their privacy violated whether it
    was worth it, and see what they tell you.

    Well, everyone who has photos that Google, Apple and Facebook can
    'see', have been violated

    Apple doesn't scan their user's photos for CSAM.

    Since you lack a normal IQ

    Straight to the insults

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chris on Fri Jul 26 08:30:50 2024
    On 2024-07-26 07:53, Chris wrote:
    Which is my point about CSAM.

    The whole CSAM thing is pure bullshit.
    a. Nobody is protected.
    b. Everyone is harmed.

    Repeating lies doesn't make them true.

    But that is quite literally the only tool he has!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Fri Jul 26 16:12:58 2024
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 07:53, Chris wrote:
    Which is my point about CSAM.

    The whole CSAM thing is pure bullshit.
    a. Nobody is protected.
    b. Everyone is harmed.

    Repeating lies doesn't make them true.

    But that is quite literally the only tool he has!

    It really is. When his lies are laid bare and shown to be false, he
    inevitably just doubles down and tells them over and over again. It's
    extremely childish.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Fri Jul 26 16:11:26 2024
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech
    companies are rather than complain at Apple for not sending
    millions of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in child
    safety?

    Apple's solution wouldn't have resulted in any additional loss of
    privacy

    Actually, Apple could not guarantee that, and there was a non-zero
    chance that false positive matches would result in privacy
    violations.

    True. The balance of risk was proportionate, however. Much moreso than
    the current system.

    Absolutely. I'm just of the opinion if one innocent person is harmed,
    that's one too many. Would you want to be that unlucky innocent person
    who has to deal with charges, a potential criminal sexual violation on
    your record, and all that comes with it? I certainly wouldn't.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Jolly Roger on Fri Jul 26 09:35:47 2024
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech companies are rather than complain at Apple for not sending
    millions of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in child
    safety?

    Apple's solution wouldn't have resulted in any additional loss of
    privacy

    Actually, Apple could not guarantee that, and there was a non-zero
    chance that false positive matches would result in privacy
    violations.

    True. The balance of risk was proportionate, however. Much moreso than
    the current system.

    Absolutely. I'm just of the opinion if one innocent person is harmed,
    that's one too many. Would you want to be that unlucky innocent person
    who has to deal with charges, a potential criminal sexual violation on
    your record, and all that comes with it? I certainly wouldn't.


    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    ...AND since they were comparing images against KNOWN CSAM, false
    positives would naturally be very few to begin with.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Jolly Roger on Fri Jul 26 09:37:52 2024
    XPost: alt.privacy

    On 2024-07-26 09:08, Jolly Roger wrote:
    On 2024-07-26, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 25 Jul 2024 21:07:20 GMT :

    The question isn't that it's a violation of privacy. The question
    is whether it's worth that violation of privacy.

    Ask the innocent people who have their privacy violated whether it
    was worth it, and see what they tell you.

    Well, everyone who has photos that Google, Apple and Facebook can
    'see' has been violated

    Apple doesn't scan their user's photos for CSAM.

    Since you lack a normal IQ

    Straight to the insults

    When I said he had only one tool (lying), I was clearly in error.

    He has two tools: lying... ...and insults.

    🤣

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Fri Jul 26 22:14:13 2024
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the
    tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in
    child safety?

    Apple's solution wouldn't have resulted in any additional loss of
    privacy

    Actually, Apple could not guarantee that, and there was a non-zero
    chance that false positive matches would result in privacy
    violations.

    True. The balance of risk was proportionate, however. Much moreso
    than the current system.

    Absolutely. I'm just of the opinion if one innocent person is harmed,
    that's one too many. Would you want to be that unlucky innocent
    person who has to deal with charges, a potential criminal sexual
    violation on your record, and all that comes with it? I certainly
    wouldn't.

    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    And at that point, someone's privacy may be violated. Do you want a
    stranger looking at photos of your sick child? What if that stranger
    came to the conclusion that those photos are somehow classifiable as
    sexual or abusive in some way? Would you want to have to argue your case
    in court because of it?

    ...AND since they were comparing images against KNOWN CSAM, false
    positives would naturally be very few to begin with.

    Yes, but one is one too many in my book.

    Apple was wise to shelve this proposal. And I am happy to see that they
    embraced more private features such as the Communication Safety feature,
    which is done without violating customers' privacy.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Fri Jul 26 22:24:28 2024
    XPost: alt.privacy

    On 2024-07-26, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 25 Jul 2024 21:12:36 GMT :

    The cite Chris listed said NOTHING whatsoever about the conviction
    rate.

    It certainly shows the conviction rate is higher than your claimed
    "absolute zero". So it seems it was you who lied first. I realize you
    want us to ignore this, but your focus on who lied demands we
    recognize it.

    zealots lack a normal IQ

    Straight to the insults.

    You both fabricated imaginary convictions

    Actually, both Chris and I provided actual documentation of actual
    convictions.

    You lied.

    No, you're projecting.

    Just showing convictions is completely meaningless to this point.

    No, it's entirely meaningful and disputes your bogus claim that there
    have been "absolutely zero" convictions.

    We're done here.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Jolly Roger on Fri Jul 26 16:07:57 2024
    On 2024-07-26 15:14, Jolly Roger wrote:
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the
    tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in
    child safety?

    Apple's solution wouldn't have resulted in any additional loss of
    privacy

    Actually, Apple could not guarantee that, and there was a non-zero
    chance that false positive matches would result in privacy
    violations.

    True. The balance of risk was proportionate, however. Much moreso
    than the current system.

    Absolutely. I'm just of the opinion if one innocent person is harmed,
    that's one too many. Would you want to be that unlucky innocent
    person who has to deal with charges, a potential criminal sexual
    violation on your record, and all that comes with it? I certainly
    wouldn't.

    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    And at that point, someone's privacy may be violated. Do you want a
    stranger looking at photos of your sick child? What if that stranger
    came to the conclusion that those photos are somehow classifiable as
    sexual or abusive in some way? Would you want to have to argue your case
    in court because of it?

    Yes. At that point...

    ...if and only if the person is INNOCENT...

    ...someone's privacy is unnecessarily violated.

    And it's a stretch to imagine that:

    1. Innocent pictures would be matched with KNOWN CSAM images, AND;

    (the logical AND)

    2. A person reviewing those images after they've been flagged wouldn't
    notice they don't actually match; AND

    3. The owner of those images at that point would be charged when they
    could then show that they were in fact innocent images.

    All three of those things have to happen before this would ever wind up
    in court.
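
    As a rough sketch of that chain (probabilities invented, independence
    assumed, neither of which is established anywhere in this thread):

        # Invented probabilities, assumed independent - only the shape of the
        # argument, not Apple's published error rates.
        p_false_match     = 1e-6  # innocent photo collides with a known-CSAM hash
        p_review_error    = 1e-2  # human reviewer wrongly confirms the match
        p_wrongly_charged = 1e-2  # charges filed despite the obvious context

        p_all_three = p_false_match * p_review_error * p_wrongly_charged
        print(f"Chance an innocent photo ends up in court: {p_all_three:.0e}")  # 1e-10

        # The counter-argument is about scale: a tiny per-photo probability
        # multiplied across billions of scanned photos is still not zero.
        photos_scanned = 5_000_000_000  # hypothetical
        print(f"Expected innocent photos flagged: {p_false_match * photos_scanned:.0f}")  # 5000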


    ...AND since they were comparing images against KNOWN CSAM, false
    positives would naturally be very few to begin with.

    Yes, but one is one too many in my book.

    And yet you are fine with innocent people's privacy being violated when
    a search warrant is issued erroneously.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Sat Jul 27 16:55:11 2024
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 15:14, Jolly Roger wrote:
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech companies
    are rather than complain at Apple for not sending millions of photos to
    already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in
    child safety?

    Apple's solution wouldn't have resulted in any additional loss of privacy

    Actually, Apple could not guarantee that, and there was a non-zero chance that false positive matches would result in privacy
    violations.

    True. The balance of risk was proportionate, however. Much moreso
    than the current system.

    Absolutely. I'm just of the opinion if one innocent person is harmed,
    that's one too many. Would you want to be that unlucky innocent
    person who has to deal with charges, a potential criminal sexual
    violation on your record, and all that comes with it? I certainly
    wouldn't.

    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    And at that point, someone's privacy may be violated. Do you want a
    stranger looking at photos of your sick child? What if that stranger
    came to the conclusion that those photos are somehow classifiable as
    sexual or abusive in some way? Would you want to have to argue your case
    in court because of it?

    Yes. At that point...

    ...if and only if the person is INNOCENT...

    ...someone's privacy is unnecessarily violated.

    And it's a stretch to imagine that:

    1. Innocent pictures would be matched with KNOWN CSAM images, AND;

    No, it's not. There was a margin of error in the proposed matching
    algorithms.

    (the logical AND)

    2. A person reviewing those images after they've been flagged wouldn't
    notice they don't actually match; AND

    That decision is a human one, and humans make mistakes and have biased
    beliefs that can lead them to make faulty decisions.

    3. The owner of those images at that point would be charged when they
    could then show that they were in fact innocent images.

    Innocent people shouldn't have to prove anything to anyone.

    Yes, but one is one too many in my book.

    And yet you are fine with innocent people's privacy being violated
    when a search warrant is issued erroneously.

    Search warrants require probable cause and are signed by a judge.
    Totally different scenario.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Sat Jul 27 18:38:40 2024
    XPost: alt.privacy

    Jolly Roger wrote on 26 Jul 2024 22:24:28 GMT :

    Actually, both Chris and I provided actual documentation of actual convictions.

    Oh Jesus. Everyone, yes, even you zealots, knows there are convictions.
    Nobody said otherwise.

    It's no longer shocking that you just claimed every conviction was a result
    of Apple/Google/Facebook CSAM scanning - when you have no idea if they are.

    Both of you lack the IQ to fathom there is a huge difference.

    It's like trying to prove all tickets are written by short cops when all you
    list as your "proof" is a listing of all the tickets that were issued.

    Do you not realize your cites only show convictions?
    Not convictions explicitly due to Apple/Google/Facebook CSAM scanning?
    And certainly not the conviction rate based on the number of CSAM reports?

    Seriously.
    You'd fail every college test in the world making such absurd claims.

    Your total lack of logic is how I know, Jolly Roger, your IQ is very low.
    I knew Alan Baker was that stupid - but I didn't know you & Chris were too.

    All your strongly held beliefs are based on exactly zero facts, JR.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Sat Jul 27 18:45:57 2024
    XPost: alt.privacy

    Chris wrote on Fri, 26 Jul 2024 15:23:16 +0100 :

    Chris just lied about convictions.
    Why did Chris lie?

    Given I was responding to your claim of "ZERO convictions"...

    Do you realize the number of convictions was never in dispute, Chris?

    What is in dispute is your claims that Apple/Google/Facebook CSAM scanning
    has a 100% conviction rate (which is essentially your claim, Chris).

    I realize your mind forms strong belief systems based on exactly zero
    facts, Chris - but the way most normal people work is they use facts.

    Without knowing what the conviction rate is per report by Google/Apple/Facebook, we have to assume that it's zero percent.

    No other logical assessment is possible (by an actual adult).

    And since that's the most important metric, the fact that it's purposefully left out of the reports is an indication that it's probably zero percent.

    Because the people writing those reports are not stupid.
    They *know* the metric that matters isn't convictions - but conviction rates.

    You haven't shown a single fact telling us the conviction rate out of the millions (perhaps billions) of CSAM reports by Apple/Google/Facebook.

    Your entire strongly held belief system is based on exactly zero facts.
    You simply guessed that all those convictions are due to Apple/Google/FB.

    And, as always, you guess wrong(ly).

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Andrew on Sat Jul 27 11:48:36 2024
    XPost: alt.privacy

    On 2024-07-27 11:38, Andrew wrote:
    Jolly Roger wrote on 26 Jul 2024 22:24:28 GMT :

    Actually, both Chris and I provided actual documentation of actual
    convictions.

    Oh Jesus. Everyone, yes, even you zealots, knows there are convictions. Nobody said otherwise.

    Actually:

    'It's probably zero given it's the most important metric.
    And it's missing from the story.'

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Andrew on Sat Jul 27 11:49:08 2024
    XPost: alt.privacy

    On 2024-07-27 11:45, Andrew wrote:
    Chris wrote on Fri, 26 Jul 2024 15:23:16 +0100 :

    Chris just lied about convictions.
    Why did Chris lie?

    Given I was responding to your claim of "ZERO convictions"...

    Do you realize the number of convictions was never in dispute, Chris?

    'It's probably zero given it's the most important metric.
    And it's missing from the story.'


    What is in dispute is your claims that Apple/Google/Facebook CSAM scanning has a 100% conviction rate (which is essentially your claim, Chris).

    That was never once said or implied.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Jolly Roger on Sat Jul 27 11:52:41 2024
    On 2024-07-27 09:55, Jolly Roger wrote:
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 15:14, Jolly Roger wrote:
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in child safety?

    Apple's solution wouldn't have resulted in any additional loss of privacy

    Actually, Apple could not guarantee that, and there was a non-zero chance that false positive matches would result in privacy
    violations.

    True. The balance of risk was proportionate, however. Much moreso
    than the current system.

    Absolutely. I'm just of the opinion if one innocent person is harmed, that's one too many. Would you want to be that unlucky innocent
    person who has to deal with charges, a potential criminal sexual
    violation on your record, and all that comes with it? I certainly
    wouldn't.

    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    And at that point, someone's privacy may be violated. Do you want a
    stranger looking at photos of your sick child? What if that stranger
    came to the conclusion that those photos are somehow classifiable as
    sexual or abusive in some way? Would you want to have to argue your case in court because of it?

    Yes. At that point...

    ...if and only if the person is INNOCENT...

    ...someone's privacy is unnecessarily violated.

    And it's a stretch to imagine that:

    1. Innocent pictures would be matched with KNOWN CSAM images, AND;

    No, it's not. There was a margin of error in the proposed matching algorithms.

    I'm not saying it's impossible. Just very unlikely.


    (the logical AND)

    2. A person reviewing those images after they've been flagged wouldn't
    notice they don't actually match; AND

    That decision is a human one, and humans make mistakes and have biased beliefs that can lead them to make faulty decisions.

    I'm not saying it's impossible. Just very unlikely.


    3. The owner of those images at that point would be charged when they
    could then show that they were in fact innocent images.

    Innocent people shouldn't have to prove anything to anyone.

    We've already had two very statistically unlikely events that had to
    happen to get to this point.

    At this point, it's pretty much equivalent to the search warrant.
    Information that suggests a crime might have been committed has been communicated to the authorities. And while someone's pictures have been examined by a human, that person doesn't know they've been examined, so
    where is the damage?


    Yes, but one is one too many in my book.

    And yet you are fine with innocent people's privacy being violated
    when a search warrant is issued erroneously.

    Search warrants require probable cause and are signed by a judge.
    Totally different scenario.

    But innocent people do get searched...

    ...and you literally just said:

    'Innocent people shouldn't have to prove anything to anyone.'

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Sat Jul 27 18:58:37 2024
    XPost: alt.privacy

    Jolly Roger wrote on 26 Jul 2024 16:12:58 GMT :

    But that is quite literally the only tool he has!

    It really is. When his lies are laid bare and shown to be false, he inevitably just doubles down and tells them over and over again. It's extremely childish.

    I find it rather interesting that the zealots claim an essentially 100% conviction rate based on the Apple/Google/Facebook CSAM reports...

    And yet...
    Nobody here actually knows what the conviction rate is.

    All we know for sure is that everyone is harmed.
    But for all we know, the conviction rate is so low that nobody is saved.

    None of you zealots ever took basic logic classes.

    Note: The people who wrote the reports certainly know the conviction rate;
    and the fact they left out that most important metric, implies it's dismal.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Arlen can't deal@21:1/5 to Andrew on Sat Jul 27 12:01:04 2024
    XPost: alt.privacy

    On 2024-07-27 11:58, Andrew wrote:
    Jolly Roger wrote on 26 Jul 2024 16:12:58 GMT :

    But that is quite literally the only tool he has!

    It really is. When his lies are laid bare and shown to be false, he
    inevitably just doubles down and tells them over and over again. It's
    extremely childish.

    I find it rather interesting that the zealots claim an essentially 100% conviction rate based on the Apple/Google/Facebook CSAM reports...

    Arlen...

    ...why must you lie?

    Literally no one has made that claim.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Sat Jul 27 19:02:58 2024
    XPost: alt.privacy

    Jolly Roger wrote on 26 Jul 2024 16:08:29 GMT :

    Since you lack a normal IQ

    Straight to the insults

    Hi Jolly Roger,

    To put it bluntly, every assessment that you make is based on your low IQ
    not being able to discern between convictions and conviction rates due to
    the reporting by Apple, Google & Facebook.

    They're not the same thing.

    That you can't discern the difference is because of your low IQ.
    Don't blame me for that as it's not my fault you don't know math.

    Suffice to say that, without the conviction rate, all we know is many
    people are harmed and, for all we know, almost nobody is saved.

    We can assume the people lambasting Apple are not stupid; they know what
    the conviction rate is: the fact they left it out means it's likely dismal.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Sun Jul 28 01:57:26 2024
    XPost: alt.privacy

    On 2024-07-27, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-27 11:38, Andrew wrote:
    Jolly Roger wrote on 26 Jul 2024 22:24:28 GMT :

    Actually, both Chris and I provided actual documentation of actual
    convictions.

    Oh Jesus. Everyone, yes, even you zealots, knows there are
    convictions. Nobody said otherwise.

    Actually:

    'It's probably zero given it's the most important metric. And it's
    missing from the story.'

    And before that he literally said "absolutely zero" convictions have
    resulted from CSAM scanning - right here in this thread. Arlen is a
    clown.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 01:56:24 2024
    XPost: alt.privacy

    On 2024-07-27, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 26 Jul 2024 22:24:28 GMT :

    Actually, both Chris and I provided actual documentation of actual
    convictions.

    Oh Jesus. Everyone, yes, even you zealots, knows there are
    convictions. Nobody said otherwise.

    Liar. *YOU* said there were "absolutely zero" convictions. You're trying
    to pretend you didn't say that, but it's on record right here in this
    thread.

    Pretend all you want, but it's not going to change reality. And the fact
    that you never back up your own words shows you have no credibility.
    Like Trump and so many right-wing politicians, you just lie, and when
    your lies are shown to be false, you double down or pretend you never
    said them. It's a sign of a weak mind, and it's really pitiful behavior
    for a supposed mature adult.

    All your strongly held beliefs are based on exactly zero facts, JR.

    Projection from a mental weakling.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Sun Jul 28 03:13:44 2024
    XPost: alt.privacy

    Jolly Roger wrote on 28 Jul 2024 01:57:26 GMT :

    'It's probably zero given it's the most important metric. And it's
    missing from the story.'

    And before that he literally said "absolutely zero" convictions have
    resulted from CSAM scanning - right here in this thread. Arlen is a
    clown.

    Jesus Chris, Jolly Roger.

    You can't tell the difference between the number of convictions from all sources and the percentage of convictions from Apple, Google or FB reports?

    Who is that stupid?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Arlen can't deal on Sun Jul 28 02:36:58 2024
    XPost: alt.privacy

    On 2024-07-27, Arlen can't deal <with@truth.com> wrote:
    On 2024-07-27 11:58, Andrew wrote:
    Jolly Roger wrote on 26 Jul 2024 16:12:58 GMT :

    But that is quite literally the only tool he has!

    It really is. When his lies are laid bare and shown to be false, he
    inevitably just doubles down and tells them over and over again. It's
    extremely childish.

    I find it rather interesting that the zealots claim an essentially 100%
    conviction rate based on the Apple/Google/Facebook CSAM reports...

    Arlen...

    ...why must you lie?

    Literally no one has made that claim.

    He's so predictable and childish.

    First he says there have been "absolutely zero" convictions as a result
    of CSAM scanning.

    Then he says there are "probably zero" convictions.

    Now he fabricates others supposedly saying "essentially 100% conviction
    rate" when literally nobody said that.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Sun Jul 28 03:12:26 2024
    XPost: alt.privacy

    Jolly Roger wrote on 28 Jul 2024 01:56:24 GMT :

    Oh Jesus. Everyone, yes, even you zealots, knows there are
    convictions. Nobody said otherwise.

    Liar. *YOU* said there were "absolutely zero" convictions. You're trying
    to pretend you didn't say that, but it's on record right here in this
    thread.

    Oh my God, Jolly Roger...

    You still don't get it that nobody said people aren't convicted of the
    crime. What I said was there's NOTHING in the reports that indicates the percentage of convictions based on the number of reports by Apple, Google & Facebook.

    For all you know, Apple's percentage can be twice that of Google & Facebook combined. Or half. Or ten times as many. Or none at all. Jesus Christ, JR.

    The fact you can't figure out the difference is how I know your IQ is
    dismal.

    You couldn't pass a single college test question given your lack of understanding of simple statements such as those which I said many times.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 02:39:40 2024
    XPost: alt.privacy

    On 2024-07-27, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 26 Jul 2024 16:08:29 GMT :

    Since you lack a normal IQ

    Straight to the insults

    Hi Jolly Roger,

    To put it bluntly, every assessment that you make is based on your low
    IQ not being able to discern between convictions and conviction rates
    due to the reporting by Apple, Google & Facebook.

    They're not the same thing.

    You're the one who started out with "absolutely zero" convictions, you
    complete asshole. Then you walked it back to "probably zero", and now
    you are fabricating things nobody here has said - all while desperately
    trying to move the goal post to conviction rates. And the fact that you actually think you are fooling anyone here reflects more on your own IQ
    than anyone else's. You're a pathetic weak-minded fool, little Arlen.
    Take your childish insults and shove them where the sun don't shine.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Sun Jul 28 03:15:25 2024
    XPost: alt.privacy

    Jolly Roger wrote on 28 Jul 2024 02:36:58 GMT :

    First he says there have been "absolutely zero" convictions as a result
    of CSAM scanning.

    It's no longer shocking that your IQ is so dismally low that you can't tell
    the difference between the number of convictions from all sources, and the percentage (or number) of convictions from just Apple.

    Or just Google.
    Or just Facebook.

    You couldn't pass a single test in college with your lack of comprehension.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Jolly Roger on Sun Jul 28 03:16:47 2024
    XPost: alt.privacy

    Jolly Roger wrote on 28 Jul 2024 02:39:40 GMT :

    You're the one who started out with "absolutely zero" convictions, you complete asshole. Then you walked it back to "probably zero", and now
    you are fabricating things nobody here has said - all while desperately trying to move the goal post to conviction rates. And the fact that you actually think you are fooling anyone here reflects more on your own IQ
    than anyone else's. You're a pathetic weak-minded fool, little Arlen.
    Take your childish insults and shove them where the sun don't shine.

    For Christ's sake. It's no longer surprising you can't tell the difference between the number of convictions versus the number from Apple's reporting.

    Or from Google's. Or Facebook.

    That you can't tell the difference is how I know your IQ is below normal.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 16:16:59 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 01:56:24 GMT :

    Oh Jesus. Everyone, yes, even you zealots, knows there are
    convictions. Nobody said otherwise.

    Liar. *YOU* said there were "absolutely zero" convictions. You're
    trying to pretend you didn't say that, but it's on record right here
    in this thread.

    Oh my God, Jolly Roger...

    You still don't get it that nobody said people aren't convicted of the
    crime.

    Liar - here are your own words, little Arlen, where you say "absolutely
    zero" were caught and convicted:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    And you're dead wrong:

    Man Stored Child Pornography on Google Account, Sentenced to 14 Years in Federal Prison
    <https://www.justice.gov/usao-wdtx/pr/man-stored-child-pornography-google-account-sentenced-14-years-federal-prison>
    ---
    SAN ANTONIO – A Maryland man was sentenced in a federal court in San
    Antonio today to 168 months in prison followed by 30 years of supervised release for the receipt of child pornography.

    According to court documents, David Edward King, 59, of Ellicott City, Maryland, stored 504 videos and 2,050 images depicting child sexual
    abuse material (CSAM) in Google applications while living in San
    Antonio. A review of King’s cell phone revealed that he had received
    the illicit material, some of which involved prepubescent children, via
    the instant messaging applications ICQ and Telegram. Google became
    aware of the CSAM being stored in their Google Photos and Google Drive infrastructure and filed a report to the National Center for Missing and Exploited Children.
    ---

    Your insults show what we all know: you're an immature man-child.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- Sou
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 16:21:06 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 01:57:26 GMT :

    'It's probably zero given it's the most important metric. And it's
    missing from the story.'

    And before that he literally said "absolutely zero" convictions have
    resulted from CSAM scanning - right here in this thread. Arlen is a
    clown.

    Jesus Chris, Jolly Roger.

    You can't tell the difference between the number of convictions from
    all sources and the percentage of convictions from Apple, Google or FB reports?

    Who is that stupid?

    Jesus Christ, little Arlen.

    You don't remember your own words, where you claimed there have been "absolutely zero" pedophiles caught and convicted as a result of CSAM
    scanning? Here, let's refresh your rotten memory:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    And, of course, you're dead wrong:

    Man Stored Child Pornography on Google Account, Sentenced to 14 Years in Federal Prison <https://www.justice.gov/usao-wdtx/pr/man-stored-child-pornography-google-account-sentenced-14-years-federal-prison>
    ---
    SAN ANTONIO – A Maryland man was sentenced in a federal court in San
    Antonio today to 168 months in prison followed by 30 years of supervised release for the receipt of child pornography.

    According to court documents, David Edward King, 59, of Ellicott City, Maryland, stored 504 videos and 2,050 images depicting child sexual
    abuse material (CSAM) in Google applications while living in San
    Antonio. A review of King’s cell phone revealed that he had received
    the illicit material, some of which involved prepubescent children, via
    the instant messaging applications ICQ and Telegram. Google became
    aware of the CSAM being stored in their Google Photos and Google Drive infrastructure and filed a report to the National Center for Missing and Exploited Children.
    ---

    Keep slinging juvenile insults and squirming around if it helps you feel better, but your own words are on record, little boy.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 16:22:30 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 02:36:58 GMT :

    First he says there have been "absolutely zero" convictions as a result
    of CSAM scanning.

    It's no longer shocking that your IQ is so dismally low that you can't tell the difference between the number of convictions from all sources, and the percentage (or number) of convictions from just Apple.

    Your own words, little Arlen:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    As always, you're full of shit.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 16:23:59 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 02:39:40 GMT :

    You're the one who started out with "absolutely zero" convictions, you
    complete asshole. Then you walked it back to "probably zero", and now
    you are fabricating things nobody here has said - all while desperately
    trying to move the goal post to conviction rates. And the fact that you
    actually think you are fooling anyone here reflects more on your own IQ
    than anyone else's. You're a pathetic weak-minded fool, little Arlen.
    Take your childish insults and shove them where the sun don't shine.

    For Christ's sake. It's no longer surprising you can't tell the difference between the number of convictions versus the number from Apple's reporting.

    You're trying to move the goal post. Here are your own words, little
    Arlen:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Choke on them.

    That you can't tell the difference is how I know your IQ is below normal.

    Juvenile insults are all you have, little Arlen.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 18:25:18 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 16:16:59 GMT :

    Liar - here are your own words, little Arlen, where you say "absolutely
    zero" were caught and convicted:

    You not comprehending

    You trying to move the goal post is how I know you're a low-level troll.
    You can't escape your own words, little Arlen. Trim them all you want,
    they are a matter of public record forever:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Wrong.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 18:26:02 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 16:21:06 GMT :

    You don't remember your own words, where you claimed there have been
    "absolutely zero" pedophiles caught and convicted as a result of CSAM
    scanning? Here, let's refresh your rotten memory:

    You not comprehending the difference between zero percent of Apple
    reports versus zero total convictions is how I know you zealots own
    subnormal IQs.

    You trying to move the goal post is how I know you're a low-level troll,
    little Arlen. You can't escape your own words:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Wrong.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Sun Jul 28 18:49:20 2024
    On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech
    companies are rather than complain at Apple for not sending millions of
    photos to already overwhelmed authorities.
    For all that is in the news stories, it could be ZERO
    convictions resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in
    child safety?

    Apple's solution wouldn't have resulted in any additional loss
    of privacy

    Actually, Apple could not guarantee that, and there was a
    non-zero chance that false positive matches would result in
    privacy violations.

    True. The balance of risk was proportionate, however. Much moreso
    than the current system.

    Absolutely. I'm just of the opinion if one innocent person is
    harmed, that's one too many. Would you want to be that unlucky
    innocent person who has to deal with charges, a potential criminal
    sexual violation on your record, and all that comes with it? I
    certainly wouldn't.

    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    And at that point, someone's privacy may be violated.

    You're entering into Confucius territory. If nothing is triggered, is anyone's privacy infringed?

    You're claiming innocent photos would never match, but there is a
    possibility of false matches inherent in the algorithm, no matter how
    small.

    Do you want a stranger looking at photos of your sick child?

    That wouldn't happen with Apple's method.

    It would. If a sufficient number of images matched Apple's algorithms
    (which are not perfect and allow for false matches), a human being would
    be looking at those photos of your naked sick child. How else do you
    think Apple would determine whether the images in question are or are
    not CSAM? And what happens when that stranger decides "You know what? I
    think these photos are inappropriate even if they don't match known CSAM"?

    What if that stranger came to the conclusion that those photos are
    somehow classifiable as sexual or abusive in some way? Would you want
    to have to argue your case in court because of it?

    That's a lot of ifs and steps.

    Yes, but it's possible.

    No-one is going to be charged for a dubious
    photo of their own child. There are much bigger fish to fry and get into jail.

    You're wrong. It has already happened:

    A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
    Him as a Criminal
    <https://archive.is/78Pla#selection-563.0-1075.217>

    Read the whole article to get a glimpse of what innocent people go
    through who fall victim to this invasive scanning.

    Do you think these parents and their child consider their privacy to be violated? How would you feel if your intimate photos were added to the
    PhotoDNA CSAM database because they were incorrectly flagged?

    ---
    In 2021, the CyberTipline reported that it had alerted authorities
    to “over 4,260 potential new child victims.” The sons of Mark and Cassio were counted among them.
    ---

    A lot of really bad things can happen to good people:

    ---
    “This would be problematic if it were just a case of content
    moderation and censorship,” Ms. Klonick said. “But this is doubly
    dangerous in that it also results in someone being reported to law enforcement.” It could have been worse, she said, with a parent
    potentially losing custody of a child. “You could imagine how this might escalate,” Ms. Klonick said.
    ---

    ...AND since they were comparing images against KNOWN CSAM, false
    positives would naturally be very few to begin with.

    Yes, but one is one too many in my book.

    How many children are you prepared to let be abused to protect YOUR
    privacy?

    Now you're being absurd. My right to privacy doesn't cause any children
    to be abused.

    Apple was wise to shelve this proposal
  • From Andrew@21:1/5 to Chris on Mon Jul 29 11:27:12 2024
    Chris wrote on Mon, 29 Jul 2024 07:36:46 -0000 (UTC) :

    I think you need to have a lie down. You literally are making no sense anymore.

    It's not surprising that you uneducated zealots can't follow simple logic.

    What matters isn't the number of reports but the percentage of convictions.

    That you fail to comprehend a concept so obvious and simple is why I have ascertained you zealots are of low IQ, as this is not a difficult concept.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Mon Jul 29 11:23:48 2024
    XPost: alt.privacy

    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never
    changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
    rate is high (based on the number of reports), then they are NOT
    underreporting images.

    b. If the FB/Google reporting rate is high, and yet if their conviction
    rate is low (based on the number of reports), then they are
    overreporting images.

    c. None of us know if either is true unless and until we know the
    conviction rates per Apple, Facebook, & Google - which are not
    in the reports (which were aimed to lambaste Apple).

    d. That conviction rate information is so important, that nobody
    is so stupid to not ASK for it BEFORE making any assessments.

    e. Given the people who wrote those reports are not likely to be
    stupid, the fact they left out the most important factor,
    directly implies the obvious, based on the omission itself.

    Now what do you think that omitted fact directly implies, Chris?
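
    With invented numbers (the per-provider conviction counts are exactly
    what the reports omit), the distinction between a conviction count and a
    conviction rate is just a ratio:

        # Invented figures for illustration; not taken from any report.
        providers = {
            "few_reports":  {"reports": 300,       "convictions_from_reports": 30},
            "many_reports": {"reports": 1_000_000, "convictions_from_reports": 500},
        }

        for name, p in providers.items():
            rate = p["convictions_from_reports"] / p["reports"]
            print(f"{name}: {p['reports']:,} reports, "
                  f"{p['convictions_from_reports']} convictions, rate = {rate:.2%}")
        # few_reports: 300 reports, 30 convictions, rate = 10.00%
        # many_reports: 1,000,000 reports, 500 convictions, rate = 0.05%

    On those made-up figures the low-volume reporter comes out ahead; with
    different made-up figures it would not. Without both numbers, neither
    conclusion follows.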

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Mon Jul 29 15:25:36 2024
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:

    No-one is going to be charged for a dubious photo of their own
    child. There are much bigger fish to fry and get into jail.

    You're wrong. It has already happened:

    A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
    Him as a Criminal <https://archive.is/78Pla#selection-563.0-1075.217>

    I explicitly said "charged". No-one got charged. The law is working
    just fine. It's the tech, as I've been arguing all along, that's the
    problem.

    So it's okay that these parents and their child had their privacy
    violated, their child's naked photos added to the CSAM database, and
    their accounts (along with years of emails, photos, and so on)
    revoked/deleted because they were never officially charged?

    Read the whole article to get a glimpse of what innocent people go
    through who fall victim to this invasive scanning.

    Do you think these parents and their child consider their privacy to
    be violated? How would you feel if your intimate photos were added to
    the PhotoDNA CSAM database because they were incorrectly flagged?

    This wasn't PhotoDNA, which is what Apple was similar to. It was
    Google's AI method that is designed to "recognize never-before-seen exploitative images of children", which is where the real danger sits.

    It is designed to identify new abuse images based on only the pixel
    data so all hits will be massively enriched for things that look like
    abuse. A human won't have the ability to accurately identify the
    (likely innocent) motivation for taking photo and "to be safe" will
    pass it onto someone else make the decision i.e. law enforcement. The
    LE will have access to much more information and see it's an obvious
    mistake as seen in your article.

    Actually, a human being does review it with Google's system:

    ---
    A human content moderator for Google would have reviewed the photos
    after they were flagged by the artificial intelligence to confirm they
    met the federal definition of child sexual abuse material. When Google
    makes such a discovery, it locks the user’s account, searches for other exploitative material and, as required by federal law, makes a report to
    the CyberTipline at the National Center for Missing and Exploited
    Children.
    ---

    Apple's system was more like hashing the image data and comparing
    hashes where false positives are due to algorithmic randomness. The
    pixel data when viewed by a human won't be anything like CSAM and an
    easy decision made.

    What's crucial here is that Google are looking for new stuff - which
    is always problematic - whereas Apple's was not. The search space when looking for existing images is much tinier and the impact of false
    positives much, much smaller.

    Yes, but even in Apple's case, there's a small chance of a false
    positive match.
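
    A minimal sketch of the "match against known hashes, cross a threshold,
    then human review" flow being contrasted here. Everything below is
    invented except the general shape: PhotoDNA and Apple's proposed
    NeuralHash use perceptual hashes that survive resizing and re-encoding,
    while this toy uses an exact digest; the threshold value and function
    names are made up; and Google's classifier approach, which looks for
    never-before-seen images in pixel data, is not modelled at all.

        import hashlib

        # Toy stand-in for a perceptual hash; SHA-256 will NOT match re-encoded
        # copies the way PhotoDNA/NeuralHash are designed to.
        def image_hash(image_bytes: bytes) -> str:
            return hashlib.sha256(image_bytes).hexdigest()

        KNOWN_CSAM_HASHES: set[str] = set()  # would come from NCMEC-style hash lists
        REVIEW_THRESHOLD = 30                # hypothetical match count before any human looks

        def should_escalate(library: list[bytes]) -> bool:
            """True if enough images match known hashes to trigger human review."""
            matches = sum(1 for img in library if image_hash(img) in KNOWN_CSAM_HASHES)
            # Below the threshold nothing is surfaced to anyone; above it a human
            # reviews the matched images - the step the privacy dispute is about.
            return matches >= REVIEW_THRESHOLD

    The disagreement in this subthread is over how often a matcher like this
    fires on unrelated photos and what happens at the review step when it does.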
  • From Alan@21:1/5 to Andrew on Mon Jul 29 11:42:00 2024
    XPost: alt.privacy

    On 2024-07-29 04:23, Andrew wrote:
    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three
    different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
    rate is high (based on the number of reports), then they are NOT
    underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to badgolferman on Mon Jul 29 13:11:32 2024
    On 2024-07-29 13:04, badgolferman wrote:
    Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:

    Actually, a human being does review it with Google's system:

    I was unclear. I'm not saying a human doesn't review, I'm saying that given the dozens/hundreds of images of suspected abuse images they review a day
    they won't have the ability to make informed decisions.

    ---
    A human content moderator for Google would have reviewed the photos
    after they were flagged by the artificial intelligence to confirm they
    met the federal definition of child sexual abuse material.

    What kind of a person would want this job?

    There are lots of jobs that need doing that very few want to do.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chris on Mon Jul 29 13:40:14 2024
    On 2024-07-29 13:38, Chris wrote:
    On 29/07/2024 21:04, badgolferman wrote:
    Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:

    Actually, a human being does review it with Google's system:

    I was unclear. I'm not saying a human doesn't review, I'm saying that
    given the dozens/hundreds of images of suspected abuse images they
    review a day they won't have the ability to make informed decisions.

    ---
    A human content moderator for Google would have reviewed the photos
    after they were flagged by the artificial intelligence to confirm they met the federal definition of child sexual abuse material.

    What kind of a person would want this job?

    I read an article a couple of years ago on the Facebook content
    moderators. Many ended up traumatised and got no support. God it was a
    grim read.

    I think I recall that as well.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Mon Jul 29 16:11:52 2024
    XPost: alt.privacy

    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three
    different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never
    changed, Chris, and the fact you "think" it has changed is simply that you
    don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
        rate is high (based on the number of reports), then they are NOT
        underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    In August 2021, Apple announced a plan to scan photos that users stored
    in iCloud for child sexual abuse material (CSAM). The tool was meant to
    be privacy-preserving and allow the company to flag potentially
    problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were
    concerned that the surveillance capability itself could be abused to
    undermine the privacy and security of iCloud users around the world. At
    the beginning of September 2021, Apple said it would pause the rollout
    of the feature to “collect input and make improvements before releasing
    these critically important child safety features.” In other words, a
    launch was still coming.

    Parents and caregivers can opt into the protections through family
    iCloud accounts. The features work in Siri, Apple’s Spotlight search,
    and Safari Search to warn if someone is looking at or searching for
    child sexual abuse materials and provide resources on the spot to report
    the content and seek help.

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the
    Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I
    use macOS software on Apple hardware.

    Today, I was browsing some local images in a subfolder of my Documents
    folder, some HEIC files taken with an iPhone and copied to the Mac using
    the Image Capture program (used for dumping photos from an iOS device
    attached with an USB cable).

    I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. I have all network access denied for a
    lot of Apple OS-level apps because I’m not interested in transmitting
    any of my data whatsoever to Apple over the network - mostly because
    Apple turns over customer data on over 30,000 customers per year to US
    federal police without any search warrant per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.

    Imagine my surprise when browsing these images in the Finder, Little
    Snitch told me that macOS is now connecting to Apple APIs via a program
    named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files).

    ...


    Integrate this data and remember it: macOS now contains network-based
    spyware even with all Apple services disabled. It cannot be disabled via controls within the OS: you must use third party network filtering
    software (or external devices) to prevent it.

    This was observed on the current version of macOS, macOS Ventura 13.1.
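
    An aside on the per-process blocking described above: on current macOS the
    supported way to build that kind of per-app firewall, roughly what tools
    like Little Snitch are built on these days, is the Network Extension
    content-filter API. What follows is only a minimal sketch under assumptions:
    the bundle identifiers are guesses, and a real filter also has to be packaged
    as a system extension and activated through NEFilterManager, none of which
    is shown here.

    import NetworkExtension

    // Sketch only: a content-filter data provider that refuses network flows
    // opened by selected processes. The identifiers below are hypothetical.
    class DenyListFilter: NEFilterDataProvider {

        // Signing identifiers of daemons we never want on the network.
        let blocked: Set<String> = [
            "com.apple.mediaanalysisd",   // hypothetical identifier
            "com.apple.photoanalysisd"    // hypothetical identifier
        ]

        override func handleNewFlow(_ flow: NEFilterFlow) -> NEFilterNewFlowVerdict {
            // sourceAppIdentifier names the process that opened the connection.
            if let app = flow.sourceAppIdentifier,
               blocked.contains(where: { app.hasSuffix($0) }) {
                return .drop()     // block the connection outright
            }
            return .allow()        // everything else passes through untouched
        }
    }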

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Mon Jul 29 16:21:12 2024
    XPost: alt.privacy

    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three
    different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never
    changed, Chris, and the fact you "think" it has changed is simply that you
    don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
        rate is high (based on the number of reports), then they are NOT
        underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    In August 2021, Apple announced a plan to scan photos that users stored
    in iCloud for child sexual abuse material (CSAM). The tool was meant to
    be privacy-preserving and allow the company to flag potentially
    problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were
    concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At
    the beginning of September 2021, Apple said it would pause the rollout
    of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a
    launch was still coming.

    Parents and caregivers can opt into the protections through family
    iCloud accounts. The features work in Siri, Apple’s Spotlight search,
    and Safari Search to warn if someone is looking at or searching for
    child sexual abuse materials and provide resources on the spot to report
    the content and seek help.

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I
    use macOS software on Apple hardware.

    Today, I was browsing some local images in a subfolder of my Documents folder, some HEIC files taken with an iPhone and copied to the Mac using
    the Image Capture program (used for dumping photos from an iOS device attached with an USB cable).

    I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. I have all network access denied for a
    lot of Apple OS-level apps because I’m not interested in transmitting
    any of my data whatsoever to Apple over the network - mostly because
    Apple turns over customer data on over 30,000 customers per year to US federal police without any search warrant per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.

    Imagine my surprise when browsing these images in the Finder, Little
    Snitch told me that macOS is now connecting to Apple APIs via a program
    named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files).

    ...


    Integrate this data and remember it: macOS now contains network-based
    spyware even with all Apple services disabled. It cannot be disabled via controls within the OS: you must use third party network filtering
    software (or external devices) to prevent it.

    This was observed on the current version of macOS, macOS Ventura 13.1.


    'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the case.'

    <https://pawisoon.medium.com/debunked-the-truth-about-mediaanalysisd-and-apples-access-to-your-local-photos-on-macos-a42215e713d1>

    'The mediaanalysisd process is a background task that starts every time
    an image file is previewed in Finder, and then calls an Apple service.
    The process is designed to run machine learning algorithms to detect
    objects in photos and make object-based search possible in the Photos
    app. It also helps Finder to detect text and QR codes in photos. Even if
    a user does not use the Photos app or have an iCloud account, the
    process will still run.'

    Apple is not scanning your photos for CSAM
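
    For what it's worth, the sort of purely local analysis that quote describes
    is the same kind of thing anyone can drive with Apple's public Vision
    framework. A minimal sketch follows; mediaanalysisd's internals aren't
    public, so this is only an illustration of on-device object, text and QR
    detection, and the file path is made up.

    import Foundation
    import Vision

    // Illustrative only: classify objects, read text and find QR codes in a
    // local image, entirely on-device. The path is hypothetical.
    let imageURL = URL(fileURLWithPath: "/Users/example/Documents/photo.heic")

    let classify = VNClassifyImageRequest()
    let readText = VNRecognizeTextRequest()
    let findCodes = VNDetectBarcodesRequest()

    do {
        let handler = VNImageRequestHandler(url: imageURL, options: [:])
        try handler.perform([classify, readText, findCodes])
    } catch {
        print("analysis failed:", error)
    }

    // Object labels - what makes "search your photos for 'dog'" work.
    for obs in (classify.results ?? []).prefix(5) where obs.confidence > 0.3 {
        print("object:", obs.identifier, obs.confidence)
    }

    // Any text recognised in the image.
    for obs in readText.results ?? [] {
        print("text:", obs.topCandidates(1).first?.string ?? "")
    }

    // QR codes and barcodes.
    for obs in findCodes.results ?? [] {
        print("code:", obs.payloadStringValue ?? "")
    }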

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Mon Jul 29 23:35:29 2024
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:

    Yes, but even in Apple's case, there's a small chance of a false
    positive match. And were that to happen, there is a danger of an
    innocent person's privacy being violated.

    In every case there's a chance of FPs. Apple would have had a lower FPR than *the current* system.

    Given the choice I'm in favour of the better, evidence-based method.

    You're wrong. The choices now are:

    - Use systems and services where CSAM scanning disregards your privacy.

    - Use systems and services that do no CSAM scanning of private content.

    The latter happens to be Apple's systems and services (with the singular exception of email).

    You're in favour of the worse system

    Nope. I have never said that. I'm in favor of no CSAM scanning of
    private content.

    Nope. I don't support any scanning of private content.

    Yet it's already happening so why not support the better method?

    Speak for yourself. It's certainly not happening to my private content.

    I agree - still not good enough for me though.

    "Perfect is the enemy of the good"

    By seeking perfection you and others are allowing and enabling child
    abuse.

    Nah. There is no child abuse occurring in my private content, and my
    decision not to use or support privacy-violating technology isn't
    harming anyone.

    Apple only shelved it for PR reasons, which is a real shame.

    You don't know all of Apple's motivations. What we know is Apple shelved
    it after gathering feedback from industry experts. And many of those
    experts were of the opinion that even with Apple's precautions, the risk
    of violating people's privacy was too great.

    That wasn't the consensus. The noisy tin-foil brigade drowned out any possible discussion.

    Not true. There was plenty of collaboration and discussion. Here's what
    Apple said about their decision:

    ---
    "Child sexual abuse material is abhorrent and we are committed to
    breaking the chain of coercion and influence that makes children
    susceptible to it," Erik Neuenschwander, Apple's director of user
    privacy and child safety, wrote in the company's response to Heat
    Initiative. He added, though, that after collaborating with an array of
    privacy and security researchers, digital rights groups, and child
    safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

    "Scanning every user's privately stored iCloud data would create new
    threat vectors for data thieves to find and exploit," Neuenschwander
    wrote. "It would also inject the potential for a slippery slope of
    unintended consequences. Scanning for one type of content, for instance,
    opens the door for bulk surveillance and could create a desire to search
    other encrypted messaging systems across content types."
    ---

    Apple should have sim
  • From Jolly Roger@21:1/5 to badgolferman on Mon Jul 29 23:36:41 2024
    On 2024-07-29, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:

    Actually, a human being does review it with Google's system:

    I was unclear. I'm not saying a human doesn't review, I'm saying that
    given the dozens/hundreds of images of suspected abuse images they
    review a day they won't have the ability to make informed decisions.

    --- A human content moderator for Google would have reviewed the
    photos after they were flagged by the artificial intelligence to
    confirm they met the federal definition of child sexual abuse
    material.

    What kind of a person would want this job?

    I'll give you three guesses.

    Guess what kind of people are most attracted to positions of power (such
    as police)?

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Mon Jul 29 23:39:58 2024
    XPost: alt.privacy

    On 2024-07-29, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-29 04:23, Andrew wrote:
    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three
    different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never
    changed, Chris, and the fact you "think" it has changed is simply that you
    don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
    rate is high (based on the number of reports), then they are NOT
    underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    It's zero for *photos*, but not for *email*:

    <https://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-csam/>

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Mon Jul 29 18:10:29 2024
    XPost: alt.privacy

    Alan wrote:
    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three
    different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never
    changed, Chris, and the fact you "think" it has changed is simply that you
    don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
        rate is high (based on the number of reports), then they are NOT
        underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/


    In August 2021, Apple announced a plan to scan photos that users
    stored in iCloud for child sexual abuse material (CSAM). The tool was
    meant to be privacy-preserving and allow the company to flag
    potentially problematic and abusive content without revealing anything
    else. But the initiative was controversial, and it soon drew
    widespread criticism from privacy and security researchers and digital
    rights groups who were concerned that the surveillance capability
    itself could be abused to undermine the privacy and security of iCloud
    users around the world. At the beginning of September 2021, Apple said
    it would pause the rollout of the feature to “collect input and make
    improvements before releasing these critically important child safety
    features.” In other words, a launch was still coming.

    Parents and caregivers can opt into the protections through family
    iCloud accounts. The features work in Siri, Apple’s Spotlight search,
    and Safari Search to warn if someone is looking at or searching for
    child sexual abuse materials and provide resources on the spot to
    report the content and seek help.

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the
    Mac App Store. I don’t store photos in the macOS “Photos” application,
    even locally. I never opted in to Apple network services of any kind -
    I use macOS software on Apple hardware.

    Today, I was browsing some local images in a subfolder of my Documents
    folder, some HEIC files taken with an iPhone and copied to the Mac
    using the Image Capture program (used for dumping photos from an iOS
    device attached with an USB cable).

    I use a program called Little Snitch which alerts me to network
    traffic attempted by the programs I use. I have all network access
    denied for a lot of Apple OS-level apps because I’m not interested in
    transmitting any of my data whatsoever to Apple over the network -
    mostly because Apple turns over customer data on over 30,000 customers
    per year to US federal police without any search warrant per Apple’s
    own self-published transparency report. I’m good without any of that
    nonsense, thank you.

    Imagine my surprise when browsing these images in the Finder, Little
    Snitch told me that macOS is now connecting to Apple APIs via a
    program named mediaanalysisd (Media Analysis Daemon - a background
    process for analyzing media files).

    ...


    Integrate this data and remember it: macOS now contains network-based
    spyware even with all Apple services disabled. It cannot be disabled
    via controls within the OS: you must use third party network
    filtering software (or external devices) to prevent it.

    This was observed on the current version of macOS, macOS Ventura 13.1.


    'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the case.'



    Bullshit.

    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing.
    You might want to consider the only current option to stop Apple from
    scanning your photos.

    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM). The
    problem with this, like many others, is that we often have hundreds of
    photos of our children and grandchildren, and who knows how good or bad
    the new software scanning technology is? Apple claims false positives
    are one trillion to one, and there is an appeals process in place. That
    said, one mistake from this AI, just one, could have an innocent person
    sent to jail and their lives destroyed.
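
    A rough aside on those odds (not part of the article quoted above): in a
    threshold-based design an account is only flagged after many independent
    false matches, and that threshold is what drives the account-level
    false-positive rate down. The numbers below are made up purely for
    illustration; they are not Apple's.

    import Foundation

    // P(X >= threshold) for X ~ Binomial(n, p): the chance that a library of
    // n innocent photos racks up `threshold` or more false matches when each
    // photo false-matches independently with probability p. Illustration only.
    func logChoose(_ n: Int, _ k: Int) -> Double {
        lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
    }

    func accountFalsePositiveRate(n: Int, p: Double, threshold: Int) -> Double {
        var tail = 0.0
        for k in threshold...n {
            tail += exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p))
        }
        return tail
    }

    let p = 1e-6        // hypothetical per-image false-match rate
    let photos = 10_000 // hypothetical library size
    print(accountFalsePositiveRate(n: photos, p: p, threshold: 1))   // roughly 1 in 100
    print(accountFalsePositiveRate(n: photos, p: p, threshold: 30))  // vanishingly small

    Even with a generous per-image error rate, requiring dozens of independent
    matches makes an account-level false positive astronomically unlikely; a
    single-match trigger would not.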

    Apple has many other features as part of these upgrades to protect
    children, and we like them all, but photo-scanning sounds like a problem waiting to happen.

    Here are all of the "features" that come with anti-CSAM, expected to
    roll out with iOS 15 in the fall of 2021.

    Messages: The Messages app will use on-device machine learning to warn
    children and parents about sensitive content.

    iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

    Siri and Search: Siri and Search will provide additional resources to
    help children and parents stay safe online and get help with unsafe
    situations.

    Now that you understand how anti-CSAM works, the only way to avoid
    having your photos scanned by this system is to disable iCloud Photos.
    Your photos are scanned when you automatically upload your photos to the
    cloud, so the only current way to avoid having them scanned is not to
    upload them.

    This adds an interesting problem. The majority of iPhone users use
    iCloud to back up their photos (and everything else). If you disable
    iCloud, you will need to back up your photos manually. If you have a PC
    or Mac, you can always copy them to your computer and back them up. You
    can also consider using another cloud service for backups.

    Let's talk about disabling iCloud and also removing any photos you
    already have uploaded. You will have 30 days to recover your photos if
    you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.

    You'll want to backup and disable iCloud, then verify that no photos
    were left on their servers.

    Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable
    iCloud Photos

    First, we can disable the uploading of iCloud photos while keeping all
    other backups, including your contacts, calendars, notes, and more.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Photos.

    Uncheck iCloud Photos.

    You will be prompted to decide what to do with your current photos.

    If you have the space on your phone, you can click on Download Photos &
    Videos, and your photos will all be on your iPhone, ready to back up
    somewhere else.

    Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server

    While all of your photos should be deleted from Apple's server, we
    should verify that.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Manage Storage.

    Click on Photos.

    Click on Disable & Delete

    https://discussions.apple.com/thread/254538081?sortBy=rank

    https://www.youtube.com/watch?v=K_i8rTiXTd8

    How to disable Apple scanning your photos in iCloud and on device. The
    new iOS 15 update will scan iPhone photos and alert authorities if any
    of them contain CSAM. Apple Messages also gets an update to scan and
    warn parents if it detects an explicit image being sent or received.
    This video discusses the new Apple update, privacy implications, how to
    disable iPhone photo scanning, and offers a commentary on tech companies
    and the issue of privacy and electronic surveillance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Mon Jul 29 17:14:48 2024
    XPost: alt.privacy

    On 2024-07-29 17:10, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three
    different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never
    changed, Chris, and the fact you "think" it has changed is simply that you
    don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
        rate is high (based on the number of reports), then they are NOT
        underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning
    of images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    In August 2021, Apple announced a plan to scan photos that users
    stored in iCloud for child sexual abuse material (CSAM). The tool was
    meant to be privacy-preserving and allow the company to flag
    potentially problematic and abusive content without revealing
    anything else. But the initiative was controversial, and it soon drew
    widespread criticism from privacy and security researchers and
    digital rights groups who were concerned that the surveillance
    capability itself could be abused to undermine the privacy and
    security of iCloud users around the world. At the beginning of
    September 2021, Apple said it would pause the rollout of the feature
    to “collect input and make improvements before releasing these
    critically important child safety features.” In other words, a launch
    was still coming.

    Parents and caregivers can opt into the protections through family
    iCloud accounts. The features work in Siri, Apple’s Spotlight search,
    and Safari Search to warn if someone is looking at or searching for
    child sexual abuse materials and provide resources on the spot to
    report the content and seek help.

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the
    Mac App Store. I don’t store photos in the macOS “Photos”
    application, even locally. I never opted in to Apple network services
    of any kind - I use macOS software on Apple hardware.

    Today, I was browsing some local images in a subfolder of my
    Documents folder, some HEIC files taken with an iPhone and copied to
    the Mac using the Image Capture program (used for dumping photos from
    an iOS device attached with an USB cable).

    I use a program called Little Snitch which alerts me to network
    traffic attempted by the programs I use. I have all network access
    denied for a lot of Apple OS-level apps because I’m not interested in
    transmitting any of my data whatsoever to Apple over the network -
    mostly because Apple turns over customer data on over 30,000
    customers per year to US federal police without any search warrant
    per Apple’s own self-published transparency report. I’m good without
    any of that nonsense, thank you.

    Imagine my surprise when browsing these images in the Finder, Little
    Snitch told me that macOS is now connecting to Apple APIs via a
    program named mediaanalysisd (Media Analysis Daemon - a background
    process for analyzing media files).

    ...


    Integrate this data and remember it: macOS now contains network-based
    spyware even with all Apple services disabled. It cannot be disabled
    via controls within the OS: you must used third party network
    filtering software (or external devices) to prevent it.

    This was observed on the current version of macOS, macOS Ventura 13.1.


    'A recent thread on Twitter raised concerns that the macOS process
    mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the
    case.'



    Bullshit.

    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.

    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM). The
    problem with this, like many others, is that we often have hundreds of
    photos of our children and grandchildren, and who knows how good or bad
    the new software scanning technology is? Apple claims false positives
    are one trillion to one, and there is an appeals process in place. That
    said, one mistake from this AI, just one, could have an innocent person
    sent to jail and their lives destroyed.

    Apple has many other features as part of these upgrades to protect
    children, and we like them all, but photo-scanning sounds like a problem waiting to happen.

    Here are all of the "features" that come with anti-CSAM, expected to
    roll out with iOS 15 in the fall of 2021.

    Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.

    iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

    Siri and Search: Siri and Search will provide additional resources to
    help children and parents stay safe online and get help with unsafe situations.

    Now that you understand how anti-CSAM works, the only way to avoid
    having your photos scanned by this system is to disable iCloud Photos.
    Your photos are scanned when you automatically upload your photos to the cloud, so the only current way to avoid having them scanned is not to
    upload them.

    This adds an interesting problem. The majority of iPhone users use
    iCloud to back up their photos (and everything else). If you disable
    iCloud, you will need to back up your photos manually. If you have a PC
    or Mac, you can always copy them to your computer and back them up. You
    can also consider using another cloud service for backups.

    Let's talk about disabling iCloud and also removing any photos you
    already have uploaded. You will have 30 days to recover your photos if
    you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.

    You'll want to backup and disable iCloud, then verify that no photos
    were left on their servers.

    Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable iCloud Photos

    First, we can disable the uploading of iCloud photos while keeping all
    other backups, including your contacts, calendars, notes, and more.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Photos.

    Uncheck iCloud Photos.

    You will be prompted to decide what to do with your current photos.

    If you have the space on your phone, you can click on Download Photos & Videos, and your photos will all be on your iPhone, ready to back up somewhere else.

    Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server

    While all of your photos should be deleted from Apple's server, we
    should verify that.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Manage Storage.

    Click on Photos.

    Click on Disable & Delete

    https://discussions.apple.com/thread/254538081?sortBy=rank

    https://www.youtube.com/watch?v=K_i8rTiXTd8

    How to disable Apple scanning your photos in iCloud and on device. The
    new iOS 15 update will scan iPhone photos and alert authorities if any
    of them contain CSAM. Apple Messages also gets an update to scan and
    warn parents if it detects an explicit image being sent or received.
    This video discusses the new Apple update, privacy implications, how to disable iPhone photo scanning, and offers a commentary on tech companies
    and the issue of privacy and electronic surveillance.


    That discusses a system that Apple disabled.

    And doesn't support your first source AT ALL.

    'Mysk:

    No, macOS doesn’t send info about your local photos to Apple. We analyzed mediaanalysisd after an extraordinary claim by Jeffrey Paul that it
    scans local photos and secretly sends the results to an Apple server.

    […]

    We analyzed the network traffic sent and received by mediaanalysisd.
    Well, the call is literally empty. We decrypted it. No headers, no IDs, nothing. Just a simple GET request to this endpoint that returns
    nothing. Honestly, it looks like it is a bug.

    Mysk:

    The issue was indeed a bug and it has been fixed in macOS 13.2. The
    process no longer makes calls to Apple servers.'

    <https://mjtsai.com/blog/2023/01/25/network-connections-from-mediaanalysisd/>

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Tue Jul 30 01:41:02 2024
    XPost: alt.privacy

    On 2024-07-30, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-29 17:10, Chips Loral wrote:

    Bullshit.

    That discusses a system that Apple disabled.

    A proposal that was never implemented and was shelved.

    And doesn't support your first source AT ALL.

    'Mysk:

    No, macOS doesn’t send info about your local photos to Apple. We
    analyzed mediaanalysisd after an extraordinary claim by Jeffrey Paul
    that it scans local photos and secretly sends the results to an Apple
    server.

    […]

    We analyzed the network traffic sent and received by mediaanalysisd.
    Well, the call is literally empty. We decrypted it. No headers, no
    IDs, nothing. Just a simple GET request to this endpoint that returns nothing. Honestly, it looks like it is a bug.

    Mysk:

    The issue was indeed a bug and it has been fixed in macOS 13.2. The
    process no longer makes calls to Apple servers.'

    <https://mjtsai.com/blog/2023/01/25/network-connections-from-mediaanalysisd/>

    Yup. I've had Little Snitch installed on this Mac Studio since I bought
    it, and the Little Snitch Network Monitor has no record of that process
    ever connecting to the internet.

    Chips Loral is either extremely gullible or a simple troll. Either way,
    it's clear he's not interested in factual discourse on this subject.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Tue Jul 30 01:38:11 2024
    XPost: alt.privacy

    On 2024-07-29, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:

    Apple's reporting rate is ZERO, because they're not doing scanning
    of images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the
    case.'

    <https://pawisoon.medium.com/debunked-the-truth-about-mediaanalysisd-and-apples-access-to-your-local-photos-on-macos-a42215e713d1>

    'The mediaanalysisd process is a background task that starts every
    time an image file is previewed in Finder, and then calls an Apple
    service. The process is designed to run machine learning algorithms
    to detect objects in photos and make object-based search possible in
    the Photos app. It also helps Finder to detect text and QR codes in
    photos. Even if a user does not use the Photos app or have an iCloud
    account, the process will still run.'

    Apple is not scanning your photos for CSAM

    Yup. Jeffrey Paul should be embarrassed and ashamed of himself. He's not
    much of a "hacker and security researcher" if he didn't even bother to
    learn what the process actually does before making outlandish claims
    about it. What a weirdo.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Tue Jul 30 10:14:01 2024
    XPost: alt.privacy

    On 2024-07-30 10:01, Chips Loral wrote:
    Alan wrote:
    The issue was indeed a bug


    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.

    You're just showing your ignorance now.

    1. There was a proposed Apple system for checking images that were to be
    uploaded to Apple's iCloud system for photos and videos. That checking
    was going to take place ON THE PHONE and it was only going to compare
    images to KNOWN CSAM images (a rough sketch of the idea follows after
    this list). That system is what your "article" is talking about.

    2. That system was never actually implemented.

    3. Long after that, someone noticed a network connection made by a piece
    of software called "mediaanalysisd" (media analysis daemon) to an Apple
    server, but that connection:

    a. Was a GET that never actually sent any information TO the server.

    b. Was clearly a bug, as it was removed during an OS update.
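
    A rough sketch of the general idea in point 1, heavily simplified: compare
    each photo against a set of known hashes right before it is uploaded, and
    only ever act on matches. Apple's shelved design used a perceptual
    "NeuralHash" with blinded matching and a threshold of roughly 30 matches
    before any human review; the sketch below just uses exact SHA-256 digests
    and an in-memory set, so it illustrates the flow, not the actual mechanism.
    Paths and digests are made up.

    import Foundation
    import CryptoKit

    // Known-bad digests (placeholder value, not a real hash).
    let knownDigests: Set<String> = [
        "0000000000000000000000000000000000000000000000000000000000000000"
    ]

    // Hex-encoded SHA-256 of a file's contents.
    func sha256(of fileURL: URL) throws -> String {
        let data = try Data(contentsOf: fileURL)
        return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    // The check that would run on-device, immediately before an upload.
    func flagBeforeUpload(_ fileURL: URL) -> Bool {
        guard let digest = try? sha256(of: fileURL) else { return false }
        return knownDigests.contains(digest)
    }

    let photo = URL(fileURLWithPath: "/Users/example/Pictures/IMG_0001.heic")
    print(flagBeforeUpload(photo))   // true only on an exact match with a known image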

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Tue Jul 30 11:01:40 2024
    XPost: alt.privacy

    Alan wrote:
    The issue was indeed a bug


    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing.
    You might want to consider the only current option to stop Apple from
    scanning your photos.

    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM). The
    problem with this, like many others, is that we often have hundreds of
    photos of our children and grandchildren, and who knows how good or bad
    the new software scanning technology is? Apple claims false positives
    are one trillion to one, and there is an appeals process in place. That
    said, one mistake from this AI, just one, could have an innocent person
    sent to jail and their lives destroyed.

    Apple has many other features as part of these upgrades to protect
    children, and we like them all, but photo-scanning sounds like a problem waiting to happen.

    Here are all of the "features" that come with anti-CSAM, expected to
    roll out with iOS 15 in the fall of 2021.

    Messages: The Messages app will use on-device machine learning to warn
    children and parents about sensitive content.

    iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

    Siri and Search: Siri and Search will provide additional resources to
    help children and parents stay safe online and get help with unsafe
    situations.

    Now that you understand how anti-CSAM works, the only way to avoid
    having your photos scanned by this system is to disable iCloud Photos.
    Your photos are scanned when you automatically upload your photos to the
    cloud, so the only current way to avoid having them scanned is not to
    upload them.

    This adds an interesting problem. The majority of iPhone users use
    iCloud to back up their photos (and everything else). If you disable
    iCloud, you will need to back up your photos manually. If you have a PC
    or Mac, you can always copy them to your computer and back them up. You
    can also consider using another cloud service for backups.

    Let's talk about disabling iCloud and also removing any photos you
    already have uploaded. You will have 30 days to recover your photos if
    you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.

    You'll want to backup and disable iCloud, then verify that no photos
    were left on their servers.

    Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable
    iCloud Photos

    First, we can disable the uploading of iCloud photos while keeping all
    other backups, including your contacts, calendars, notes, and more.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Photos.

    Uncheck iCloud Photos.

    You will be prompted to decide what to do with your current photos.

    If you have the space on your phone, you can click on Download Photos &
    Videos, and your photos will all be on your iPhone, ready to back up
    somewhere else.

    Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server

    While all of your photos should be deleted from Apple's server, we
    should verify that.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Manage Storage.

    Click on Photos.

    Click on Disable & Delete

    https://discussions.apple.com/thread/254538081?sortBy=rank

    https://www.youtube.com/watch?v=K_i8rTiXTd8

    How to disable Apple scanning your photos in iCloud and on device. The
    new iOS 15 update will scan iPhone photos and alert authorities if any
    of them contain CSAM. Apple Messages also gets an update to scan and
    warn parents if it detects an explicit image being sent or received.
    This video discusses the new Apple update, privacy implications, how to
    disable iPhone photo scanning, and offers a commentary on tech companies
    and the issue of privacy and electronic surveillance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)