Apple has been accused of underreporting the prevalence of child sexual
abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in
the UK, says that Apple reported just 267 worldwide cases of suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) last year.
That pales in comparison to the 1.47 million potential cases that Google reported and 30.6 million reports from Meta. Other platforms that reported more potential CSAM cases than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony
Interactive Entertainment (3,974). Every US-based tech company is required
to pass along any possible CSAM cases detected on their platforms to NCMEC, which directs cases to relevant law enforcement agencies worldwide.
As The Guardian, which first reported on the NSPCC's claim, points out,
Apple services such as iMessage, FaceTime and iCloud all have end-to-end encryption, which stops the company from viewing the contents of what users share on them. However, WhatsApp has E2EE as well, and that service
reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.
“There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities,” Richard Collard, the NSPCC's head of child safety online policy, said. “Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the
roll out of the Online Safety Act in the UK.”
Apple declined to comment on the NSPCC's accusation, instead pointing The Guardian to a statement it made when it shelved the CSAM scanning plan.
Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected without companies combing through personal
data."
https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html
Apple had a system ready to go, but a pile of brainless morons
complained about their "privacy" being invaded (which it wasn't), so
Apple was forced to abandon it. All this report achieves is to
acknowledge that all those other companies listed above are less
stringent about their users' privacy.
Of course, anyone with more than one braincell couldn't care less about
this "privacy" nonsense when it means catching more of the illegal
scumbags who are doing wrong.
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation, instead pointing
The Guardian to a statement it made when it shelved the CSAM scanning
plan. Apple said it opted for a different strategy that “prioritizes
the security and privacy of [its] users.” The company told Wired in
August 2022 that "children can be protected without companies combing
through personal data."
This is one reason many people choose Apple over alternatives.
After being a bit skeptical of Apple's solution, I realised it was a pretty good and pragmatic balance between respecting people's privacy and
protecting vulnerable people. I was disappointed that the angry "muh
freedom" brigade scuppered it.
Jörg Lorenz <hugybear@gmx.net> wrote:
On 23.07.24 at 05:35, Your Name wrote:
Obviously you have a source for that, right?
Apple had a system ready to go, but a pile of brainless morons
complained about their "privacy" being invaded (which it wasn't), so
Apple was forced to abandon it. All this report achieves is to
acknowledge that all those other companies listed above are less
stringent about their users' privacy.
Of course, anyone with more than one braincell couldn't care less about
this "privacy" nonsense when it means catching more of the illegal
scumbags who are doing wrong.
Brain dead idiot: Learn to think! When the pictures are on the devices
the damage to the children is irreversibly done.
Spying on users is undermining democracy and human rights.
This isn't spying so no it doesn't.
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved the
CSAM scanning plan. Apple said it opted for a different strategy
that “prioritizes the security and privacy of [its] users.” The
company told Wired in August 2022 that "children can be protected
without companies combing through personal data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved
the CSAM scanning plan. Apple said it opted for a different
strategy that “prioritizes the security and privacy of [its]
users.” The company told Wired in August 2022 that "children
can be protected without companies combing through personal
data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Jörg Lorenz <hugybear@gmx.net> wrote:
On 23.07.24 at 13:31, Chris wrote:
After being a bit skeptical of Apple's solution, I realised it was a pretty good and pragmatic balance between respecting people's privacy and
protecting vulnerable people. I was disappointed that the angry "muh
freedom" brigade scuppered it.
It was neither a good nor an acceptable solution. This being the reason
why Apple decided against it in the end.
It was sunk by reactionary know-it-alls. If anyone had bothered to look at the technology - which Apple published openly - they would have seen it was pretty elegant and privacy-preserving.
It could have been a really good tool to protect children, but no, people's non-rights were more important.
Alan wrote:
On 2024-07-23 09:21, badgolferman wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it
shelved the CSAM scanning plan. Apple said it opted for
a different strategy that “prioritizes the security and
privacy of [its] users.” The company told Wired in August
2022 that "children can be protected without companies
combing through personal data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
He said it poorly, but "privacy for everyone" IS important in that
when you create a system that purports only to take privacy away from
the bad actors, you are really taking it away from everyone.
If you're on the internet there is no pretense of privacy.
badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved the
CSAM scanning plan. Apple said it opted for a different
strategy that “prioritizes the security and privacy of [its]
users.” The company told Wired in August 2022 that "children
can be protected without companies combing through personal
data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Agreed.
No-one is forced to use icloud. If they didn't like the policy, they
could go elsewhere.
Like to google and meta, who are more than happy to share millions of
people's private photos with law enforcement, which apparently is just
fine.
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved the
CSAM scanning plan. Apple said it opted for a different strategy
that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected
without companies combing through personal data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Jörg Lorenz <hugybear@gmx.net> wrote:
On 23.07.24 at 13:31, Chris wrote:
After being a bit skeptical of Apple's solution, I realised it was a
pretty good and pragmatic balance between respecting people's
privacy and protecting vulnerable people. I was disappointed that
the angry "muh freedom" brigade scuppered it.
It was neither a good nor an acceptable solution. This being the
reason why Apple decided against it in the end.
It was sunk by reactionary know-it-alls. If anyone had bothered to look at
the technology - which Apple published openly - they would have seen
it was pretty elegant and privacy preserving.
It could have been a really good tool to protect children, but no,
people's non-rights were more important.
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation,
instead pointing The Guardian to a statement it made when
it shelved the CSAM scanning plan. Apple said it opted
for a different strategy that “prioritizes the security
and privacy of [its] users.” The company told Wired in
August 2022 that "children can be protected without
companies combing through personal data."
This is one reason many people choose Apple over
alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Your problem is you want to invade everyone's privacy regardless of
whether they are hurting anyone. That's the only way CSAM scanning can
work, and why Apple wisely withdrew their proposal even though it
worked harder to preserve privacy than any other solution.
No, I take exception to your statement that privacy is for *everyone*.
There are plenty of people in this world who should have their privacy
and freedom taken away forever. Many of them their lives too.
Everyone on this planet should have a right to basic privacy.
Your problem is you want to invade everyone's privacy regardless of
whether they are hurting anyone. That's the only way CSAM scanning can work, and why Apple wisely withdrew their proposal even though it
worked harder to preserve privacy than any other solution.
No, I take exception to your statement that privacy is for *everyone*.
There are plenty of people in this world who should have their privacy
and freedom taken away forever. Many of them their lives too.
Hint: When it comes to privacy, I'm in favor of Apple's strategic decision not to strip-search every single person who owns an Apple device.
But you're happy that Google, Meta et al are reporting millions of images
to law enforcement? I don't see you criticising them.
Everyone on this planet should have a right to basic privacy.
I agree with anyone who makes a logically sensible assessment of fact.
I fully agree with Jolly Roger (and I disagree with badgolferman).
We should never have to defend our right to privacy.
The object of my disagreement is not CSAM, it's the idea that everyone has the right to privacy. Prisoners, murderers, rapists, child molestors, etc.
do not deserve even basic privacy. Many of them are let off scot free for stupid technicalities while the families are left with no justice. Many of these people have a long history of harming others and should have been
under surveillance or even locked up long before they hurt another person.
Andrew <andrew@spam.net> wrote:
Jolly Roger wrote on 23 Jul 2024 17:55:55 GMT :
Everyone on this planet should have a right to basic privacy.
I agree with anyone who makes a logically sensible assessment of
fact.
I fully agree with Jolly Roger (and I disagree with badgolferman).
We should never have to defend our right to privacy.
The object of my disagreement is not CSAM, it’s the idea that everyone
has the right to privacy.
Prisoners, murderers, rapists, child molestors, etc.
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation,
instead pointing The Guardian to a statement it made when it
shelved the CSAM scanning plan. Apple said it opted for a
different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that
"children can be protected without companies combing through
personal data."
This is one reason many people choose Apple over
alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Your problem is you want to invade everyone's privacy regardless of
whether they are hurting anyone. That's the only way CSAM scanning can
work, and why Apple wisely withdrew their proposal even though it
worked harder to preserve privacy than any other solution.
No, I take exception to your statement that privacy is for *everyone*.
There are plenty of people in this world who should have their privacy
and freedom taken away forever. Many of them their lives too.
Chris wrote on Tue, 23 Jul 2024 11:26:41 -0000 (UTC) :
Hint: When it comes to privacy, I'm in favor of Apple's strategic
decision not to strip search every single person who owns an Apple
device.
But you're happy that Google, Meta et al are reporting millions of
images to law enforcement? I don't see you criticising them.
You religious zealots always jump to the conclusion that everyone who
is not an Apple nutcase must be an Android nutcase - which just isn't
true.
First off, I can't even spell CSAM
and I don't even know what it stands for
You religious zealots always jump to the conclusion that everyone who
is not an Apple nutcase must be an Android nutcase - which just isn't
true.
Yet what he said is true: We don't see you criticizing other companies
for it even though they do CSAM scanning and are much more invasive
about it than Apple proposed.
The NSPCC should really be complaining at how ineffectual the tech
companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.
Everyone on this planet should have a right to basic privacy.
And they do. That right is not absolute, however. Just like everyone has a right to freedom until they are convicted of a serious crime and are sent
to prison. Even *suspects* of serious crimes are held in prison before conviction.
First off, I can't even spell CSAM and I don't even know what it stands
for, although I'm sure I could look it up if I cared, just to find it's perhaps some well-marketed acronym for something related to child porn.
You're depraved if you genuinely believe CSAM is just marketing.
It's people like you - who don't care about the real harms being done -
that ruin perfectly good ideas because they have some tiny (probably theoretical) impact on your lives.
Jolly Roger <jollyroger@pobox.com> wrote:
True. Unlike others, Apple's proposal was to only scan images on
device that were being uploaded to Apple's servers, and only match
hashes of them to a database of hashes matching known CSAM images.
And only after multiple matches reached a threshold would Apple
investigate further.
All correct.
Yet even with those precautions, there was still a realistic chance
of false positives
The rate was deterministic and tunable.
and invasion of privacy, which is why they scrapped the proposal.
No.
They scrapped it because it wasn't worth pursuing. As a business it
was of no benefit to them and the noisy reaction was enough to put
them off. There wasn't any "invasion of privacy". At least no more
than there currently is in the US.
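For readers who never looked at what was actually proposed, a minimal sketch of the idea described above - threshold-gated matching of image hashes against a database of known hashes - follows. It is illustrative only: Apple's published design used NeuralHash perceptual hashes and private set intersection rather than a plain cryptographic digest, and every name and number in the sketch is made up.

# Minimal sketch of threshold-gated matching against known image hashes.
# Illustrative only: not Apple's actual implementation; all names and
# numbers here are hypothetical placeholders.
import hashlib

KNOWN_HASH_DB = {"0" * 64}  # placeholder digest, not a real entry
MATCH_THRESHOLD = 30  # tunable: matches required before any human review

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash; here just a cryptographic digest.
    return hashlib.sha256(data).hexdigest()

def count_matches(images: list[bytes]) -> int:
    # Count how many images being uploaded match the known-hash database.
    return sum(1 for img in images if image_hash(img) in KNOWN_HASH_DB)

def should_escalate(images: list[bytes]) -> bool:
    # Only accounts whose match count reaches the threshold get reviewed.
    return count_matches(images) >= MATCH_THRESHOLD

The threshold constant is what makes the false-positive rate "deterministic and tunable" in the sense used above: raise it and the chance of an innocent account ever being escalated drops accordingly.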
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech
companies are rather than complain at Apple for not sending millions
of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions
resulted.
Is it worth everyone's loss of privacy for maybe zero gain in child
safety?
Jolly Roger <jollyroger@pobox.com> wrote:
Apple's proposal was to match personal photos with *known* CSAM
images.
Correct.
It would do nothing to detect *new* CSAM images.
Also correct.
And it could not prevent false positive matches.
Incorrect.
It is designed to avoid false positives, although nothing is 100%
perfect.
Everyone on this planet should have a right to basic privacy.
And they do.
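To put a rough number on "designed to avoid false positives", here is a back-of-the-envelope calculation for the threshold scheme sketched earlier. It assumes independent per-image false matches under a simple binomial model, and the per-image rate and threshold are invented for illustration - they are not Apple's published figures.

# Rough sketch: probability that an innocent account crosses the match
# threshold purely by chance, under a simple binomial model with an
# assumed (made-up) per-image false-match probability.
from math import exp, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    # log of the binomial pmf, computed via lgamma to avoid overflow
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def prob_at_least(n: int, t: int, p: float) -> float:
    # P(at least t false matches among n images)
    return sum(exp(log_binom_pmf(n, k, p)) for k in range(t, n + 1))

# Example: 10,000 photos, a one-in-a-million per-image false-match rate,
# and a threshold of 30 matches before anything is escalated.
print(prob_at_least(10_000, 30, 1e-6))  # on the order of 1e-93

Individual false matches can still happen, which is the caveat above; the threshold only makes it astronomically unlikely that an account would be escalated on false matches alone.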
Chris wrote on Wed, 24 Jul 2024 07:05:03 -0000 (UTC) :
Everyone on this planet should have a right to basic privacy.
And they do. That right is not absolute, however. Just like everyone
has a right to freedom until they are convicted of a serious crime
and are sent to prison. Even *suspects* of serious crimes are held in
prison before conviction.
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
True. Unlike others, Apple's proposal was to only scan images on
device that were being uploaded to Apple's servers, and only match
hashes of them to a database of hashes matching known CSAM images.
And only after multiple matches reached a threshold would Apple
investigate further.
All correct.
Yet even with those precautions, there was still a realistic chance
of false positives
The rate was deterministic and tunable.
If the rate was anything other than ZERO, then people's privacy was at
risk.
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Nah, there are news articles all of the time about shit like this:
Tell that to the people whose private photos are scanned and are falsely
accused of a crime they didn't commit because an imperfect algorithm got
it wrong.
It's already happening. Is it better that 30m images were used to violate god knows how many people, or a method where a far smaller number of images, highly enriched for true positives, get flagged?
Chris. Please put your well-educated adult hat on when you read this.
Warning: There is math involved in sensibly logical thought processes.
You present no maths. Just baseless supposition.
1. There is NOTHING in the news articles about conviction rates.
2. Let's say that another way: The articles are complete bullshit!
The articles are yellow journalism clickbait, Chris.
Without specifying conviction rates, the number of reported images is
completely meaningless, Chris.
In fact, it could be Apple's conviction rate is far greater than that of
Google & Facebook for all we know, Chris.
Since the conviction rate isn't stated, I will assume it is zero.
Think about that.
You're only making that assumption because it suits your preference and
bias.
There might have been thousands of arrests and convictions. Or will be in
the future as the investigations can take a while.
For absolutely zero gain in child safety, all our privacy was compromised.
Simply a convenience for you to think like that.
For all that is in the news stories, it could be ZERO convictions
resulted.
It's nowhere near zero.
Is it worth everyone's loss of privacy for maybe zero gain in child
safety?
Your right to privacy shouldn't be violated because someone else might do something wrong.
Chris wrote on Wed, 24 Jul 2024 17:23:22 -0000 (UTC) :
Chris. Please put your well-educated adult hat on when you read this.
Warning: There is math involved in sensibly logical thought processes.
You present no maths. Just baseless supposition.
Chris,
When you earned that PhD, did you never take any logic classes?
Reporting images isn't the end goal.
The end goal is convictions.
Everyone knows that.
The fact the stories don't report the only metric that matters is a very clear indication the CSAM process is a complete sham. It's pure bullshit.
1. There is NOTHING in the news articles about conviction rates.
2. Let's say that another way: The articles are complete bullshit!
The articles are yellow journalism clickbait, Chris.
Without specifying conviction rates, the number of reported images is
completely meaningless, Chris.
In fact, it could be Apple's conviction rate is far greater than that of Google & Facebook for all we know, Chris.
Since the conviction rate isn't stated, I will assume it is zero.
Think about that.
You're only making that assumption because it suits your preference and
bias.
The only metric that matters is the conviction rate.
Not the number of images reported.
Did you ever take logic in college, Chris?
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech
companies are rather than complain at Apple for not sending millions of
photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in child safety?
Apple's solution wouldn't have resulted in any additional loss of privacy plus it only affected customers of icloud. Don't like it? Don't use icloud. Simple.
It's interesting that you decided to choose (male) homosexuality as your exemplar.
Something that has been oppressed and vilified by the church for centuries primarily because of the distaste (fetishisation even) for the sexual act - sodomy like you mention - for no good reason.
Curiously lesbianism was rarely so targeted and was often just accepted, if not mentioned.
Like you say, it is purely an opinion and choice in which the state has no right to have a say when it involves consenting adults in a private place with no harm being done.
You're, maybe unconsciously, making an equivalence here between
homosexuality and child sexual abuse because like many the focus is on the sex rather than the abuse. Homosexuality is now legal so maybe CSAM is just an opinion, right?
Wrong. Children can never consent to the abuse and are very vulnerable so need additional protections by the adults in the room. There is no way we should allow this to happen when we have ways to stop it in a healthy society. Impacting on someone's theoretical privacy - which is unproven -
is a reasonable balance.
The point of who has "rights" is up to the society that they live in.
Correct. No person's rights are more important than another's. Your right
to freedom doesn't overrule someone's right to life.
While I need not say I'm no lawyer, it's my understanding that, here, in
the USA, we are *all* presumed innocent until proven guilty - right?
In theory in the US. Not always in practice.
And, we have in our Constitution the fundamental right to not be subject to unreasonable search & seizure, right? Nor should our property be detained.
Key word here is "unreasonable". All your examples have clearly been unreasonable so very simplistic to defend.
We all know that the law allows for people's rights to be suspended, even constitutional ones. Companies have an obligation to uphold the law.
Bearing in mind that for all we know, exactly ZERO people may have been
convicted after all those Google, Meta (and yes, Apple) reports, the
article is clearly bullshit meant to be an unwarranted attack on Apple.
For now, I'm going to assume, for lack of data, that exactly zero people
were convicted - which means Google, Meta, and yes, Apple, broke the law.
Which law, exactly?
Apple just does it far less than Google & Meta did.
Without the conviction rate - we have no business lambasting Apple.
Right. So where are your multitude of posts attacking google on the android forum?
Jolly Roger wrote on 24 Jul 2024 15:39:57 GMT :
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Nah, there are news articles all of the time about shit like this:
Intelligent people have an understanding of math that you lack, JR.
What matters is the percentage
On 2024-07-24 08:47, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
True. Unlike others, Apple's proposal was to only scan images on
device that were being uploaded to Apple's servers, and only match
hashes of them to a database of hashes matching known CSAM images.
And only after multiple matches reached a threshold would Apple
investigate further.
All correct.
Yet even with those precautions, there was still a realistic chance
of false positives
The rate was deterministic and tunable.
If the rate was anything other than ZERO, then people's privacy was
at risk.
By that argument, we must also scrap the traditional system of issuing warrants to search people's homes, because there is a non-zero rate of warrants issued in error.
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech
companies are rather than complain at Apple for not sending millions
of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions
resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in child
safety?
Apple's solution wouldn't have resulted in any additional loss of
privacy
plus it only affected customers of icloud. Don't like it? Don't use
icloud. Simple.
Jolly Roger wrote on 24 Jul 2024 15:48:22 GMT :
For all that is in the news stories, it could be ZERO convictions
resulted.
It's nowhere near zero.
It's probably zero given it's the most important metric.
Without convictions, the reporting of CSAM images is meaningless.
The reason they didn't report it is likely because it's actually zero.
Is it worth everyone's loss of privacy for maybe zero gain in child
safety?
Your right to privacy shouldn't be violated because someone else
might do something wrong.
I agree with you.
They get zero convictions.
It's probably zero given it's the most important metric.
It's not zero. Not even close.
Without convictions, the reporting of CSAM images is meaningless.
There have been plenty of convictions.
The reason they didn't report it is likely because it's actually zero.
No.
Is it worth everyone's loss of privacy for maybe zero gain in child
safety?
Your right to privacy shouldn't be violated because someone else
might do something wrong.
I agree with you.
They get zero convictions.
That's a lie.
By that argument, we must also scrap the traditional system of issuing
warrants to search people's homes, because there is a non-zero rate of
warrants issued in error.
Nah. Most warrants meet probable cause standards before a judge will
sign them. CSAM scanning requires no such due process. They are nowhere
near the same thing.
What matters is the percentage
No, words have meanings, and zero means zero. And there is a
higher-than-zero number of pedophiles who have been caught due to CSAM scanning. Unfortunately, there is also a higher-than-zero number of
innocent people whose privacy was violated in the process.
Apple's solution wouldn't have resulted in any additional loss of
privacy
Actually, Apple could not guarantee that, and there was a non-zero
chance that false positive matches would result in privacy violations.
plus it only affected customers of icloud. Don't like it? Don't use
icloud. Simple.
That much is true. Only images uploaded to iCloud would have been
examined by the algorithm.
Jolly Roger wrote on 24 Jul 2024 21:30:33 GMT :
By that argument, we must also scrap the traditional system of
issuing warrants to search people's homes, because there is a
non-zero rate of warrants issued in error.
Nah. Most warrants meet probable cause standards before a judge will
sign them. CSAM scanning requires no such due process. They are
nowhere near the same thing.
When I assessed that Alan Baker's IQ was the lowest of all the low-IQ
Apple zealots, I didn't expect Jolly Roger to prove me right on that assessment.
For Alan Baker to compare a warrant which is issued by a judge only
after probable cause has been established on the public record (which
the defendant has every right to object to in court, which would
overturn his conviction) to CSAM scanning of every possible image
uploaded, is absurd.
That absurd thought process shows Alan Baker's IQ can't be above
around 40.
Jolly Roger wrote on 24 Jul 2024 21:33:12 GMT :
It's probably zero given it's the most important metric.
It's not zero. Not even close.
You're simply guessing. I'm using logic.
They're different logical algorithms.
Without convictions, the reporting of CSAM images is meaningless.
There have been plenty of convictions.
Logically, if they had appreciable convictions, they'd have mentioned
it. The fact they don't bother to mention it, means it's probably
almost zero.
Because it isn't likely that they simply forgot the only fact that
matters.
The reason they didn't report it is likely because it's actually
zero.
No.
It's the only fact that matters. And they forgot it?
No. They're not that stupid. They're bullshitting us on this CSAM
garbage.
Is it worth everyone's loss of privacy for maybe zero gain in
child safety?
Your right to privacy shouldn't be violated because someone else
might do something wrong.
I agree with you.
They get zero convictions.
That's a lie.
I realize you guess at everything in life, Jolly Roger.
I use logic.
Logically thinking, the only metric that matters is convictions.
Jolly Roger wrote on 24 Jul 2024 21:29:07 GMT :
What matters is the percentage
No, words have meanings, and zero means zero. And there is a
higher-than-zero number of pedophiles who have been caught due to
CSAM scanning. Unfortunately, there is also a higher-than-zero number
of innocent people whose privacy was violated in the process.
While I support blah blah blah
Nothing else has any meaning.
badgolferman wrote:
Apple has been accused of underreporting the prevalence of child sexual
abuse material (CSAM) on its platforms. The National Society for the
Prevention of Cruelty to Children (NSPCC), a child protection charity in
the UK, says that Apple reported just 267 worldwide cases of suspected CSAM
to the National Center for Missing & Exploited Children (NCMEC) last year.
That pales in comparison to the 1.47 million potential cases that Google
reported and 30.6 million reports from Meta. Other platforms that reported
more potential CSAM cases than Apple in 2023 include TikTok (590,376), X
(597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony
Interactive Entertainment (3,974). Every US-based tech company is required
to pass along any possible CSAM cases detected on their platforms to NCMEC,
which directs cases to relevant law enforcement agencies worldwide.
As The Guardian, which first reported on the NSPCC's claim, points out,
Apple services such as iMessage, FaceTime and iCloud all have end-to-end
encryption, which stops the company from viewing the contents of what users
share on them. However, WhatsApp has E2EE as well, and that service
reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.
“There is a concerning discrepancy between the number of UK child abuse
image crimes taking place on Apple’s services and the almost negligible
number of global reports of abuse content they make to authorities,”
Richard Collard, the NSPCC's head of child safety online policy, said.
“Apple is clearly behind many of their peers in tackling child sexual abuse
when all tech firms should be investing in safety and preparing for the
roll out of the Online Safety Act in the UK.”
Apple declined to comment on the NSPCC's accusation, instead pointing The
Guardian to a statement it made when it shelved the CSAM scanning plan.
Apple said it opted for a different strategy that “prioritizes the security
and privacy of [its] users.” The company told Wired in August 2022 that
"children can be protected without companies combing through personal
data."
https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html
Apple had a system ready to go, but a pile of brainless morons
complained about their "privacy" being invaded (which it wasn't), so
Apple was forced to abandon it. All this report achieves is to
acknowledge that all those other companies listed above are less
stringent about their users' privacy.
Jörg Lorenz <hugybear@gmx.net> wrote:
On 23.07.24 at 13:31, Chris wrote:
After being a bit skeptical of Apple's solution, I realised it was a pretty good and pragmatic balance between respecting people's privacy and
protecting vulnerable people. I was disappointed that the angry "muh
freedom" brigade scuppered it.
It was neither a good nor an acceptable solution. This being the reason
why Apple decided against it in the end.
It was sunk by reactionary know-it-alls. If anyone had bothered to look at the technology - which Apple published openly - they would have seen it was pretty elegant and privacy-preserving.
It could have been a really good tool to protect children, but no, people's non-rights were more important.
badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved
the CSAM scanning plan. Apple said it opted for a different
strategy that “prioritizes the security and privacy of [its]
users.” The company told Wired in August 2022 that "children
can be protected without companies combing through personal
data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Agreed.
No-one is forced to use icloud. If they didn't like the policy, they could
go elsewhere. Like to google and meta, who are more than happy to share millions of people's private photos with law enforcement, which apparently
is just fine.
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved the
CSAM scanning plan. Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected without companies combing through personal data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Your problem is you want to invade everyone's privacy regardless of
whether they are hurting anyone. That's the only way CSAM scanning can
work, and why Apple wisely withdrew their proposal even though it worked harder to preserve privacy than any other solution.
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com>
wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation,
instead pointing The Guardian to a statement it made when
it shelved the CSAM scanning plan. Apple said it opted
for a different strategy that “prioritizes the security
and privacy of [its] users.” The company told Wired in
August 2022 that "children can be protected without
companies combing through personal data."
This is one reason many people choose Apple over
alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Your problem is you want to invade everyone's privacy regardless of
whether they are hurting anyone. That's the only way CSAM scanning can work, and why Apple wisely withdrew their proposal even though it
worked harder to preserve privacy than any other solution.
No, I take exception to your statement that privacy is for *everyone*.
There are plenty of people in this world who should have their privacy
and freedom taken away forever. Many of them their lives too.
Are the tech companies simply swamping
authorities with data? How does anyone cope with 30m reports from meta?
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Jolly Roger wrote on 23 Jul 2024 17:55:55 GMT :
Everyone on this planet should have a right to basic privacy.
I agree with anyone who makes a logically sensible assessment of
fact.
I fully agree with Jolly Roger (and I disagree with badgolferman).
We should never have to defend our right to privacy.
The object of my disagreement is not CSAM, it’s the idea that everyone
has the right to privacy.
This entire thread is in the context of CSAM, so that's a cop out.
Prisoners, murderers, rapists, child molestors, etc.
Again, in the context of this thread, we're talking about people who
have never been convicted or suspected of such crimes.
On 23/07/2024 19:49 Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved the CSAM scanning plan. Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that "children can be protected without companies combing through personal data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molestors.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Your problem is you want to invade everyone's privacy regardless of
whether they are hurting anyone. That's the only way CSAM scanning can
work, and why Apple wisely withdrew their proposal even though it worked
harder to preserve privacy than any other solution.
You're wrong. Apple doesn't give a rat's ass about anyone's privacy.
Andrew <andrew@spam.net> wrote:
Jolly Roger wrote on 24 Jul 2024 21:33:12 GMT :
It's probably zero given it's the most important metric.
It's not zero. Not even close.
You're simply guessing. I'm using logic. They're different logical
algorithms.
Nope. You're guessing just as much as JR.
The facts - and we know you like them, but never look for them - are
that there are many convictions on a depressingly regular basis. Just
look at the press releases from the DoJ Project Safe Childhood: https://www.justice.gov/psc/press-room
Nothing you can say will change the fact that a greater-than-zero number
of people have been convicted from CSAM scanning - just like nothing you
can say will convince me that CSAM scanning can be done without
violating the privacy of innocent people. Things like this should not
happen:
And actually, the only metric that matters is the number of innocent
people whose privacy may be violated. If that number is greater than
zero, then you can count me (and a whole lot of others who reserve their right to privacy) out.
On 23/07/2024 05:35 Your Name <YourName@YourISP.com> wrote:
badgolferman wrote:
Apple has been accused of underreporting the prevalence of child sexual
abuse material (CSAM) on its platforms. The National Society for the
Prevention of Cruelty to Children (NSPCC), a child protection charity in
the UK, says that Apple reported just 267 worldwide cases of suspected CSAM
to the National Center for Missing & Exploited Children (NCMEC) last year.
That pales in comparison to the 1.47 million potential cases that Google
reported and 30.6 million reports from Meta. Other platforms that reported
more potential CSAM cases than Apple in 2023 include TikTok (590,376), X
(597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony
Interactive Entertainment (3,974). Every US-based tech company is required
to pass along any possible CSAM cases detected on their platforms to NCMEC,
which directs cases to relevant law enforcement agencies worldwide.
As The Guardian, which first reported on the NSPCC's claim, points out,
Apple services such as iMessage, FaceTime and iCloud all have end-to-end
encryption, which stops the company from viewing the contents of what users
share on them. However, WhatsApp has E2EE as well, and that service
reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.
“There is a concerning discrepancy between the number of UK child abuse
image crimes taking place on Apple’s services and the almost negligible
number of global reports of abuse content they make to authorities,”
Richard Collard, the NSPCC's head of child safety online policy, said.
“Apple is clearly behind many of their peers in tackling child sexual abuse
when all tech firms should be investing in safety and preparing for the
roll out of the Online Safety Act in the UK.”
Apple declined to comment on the NSPCC's accusation, instead pointing The
Guardian to a statement it made when it shelved the CSAM scanning plan.
Apple said it opted for a different strategy that “prioritizes the security
and privacy of [its] users.” The company told Wired in August 2022 that
"children can be protected without companies combing through personal
data."
https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html
Apple had a system ready to go, but a pile of brainless morons
complained about their "privacy" being invaded (which it wasn't), so
Apple was forced to abandon it. All this report achieves is to
acknowledge that all those other companies listed above are less
stringent about their users' privacy.
Many people threatened to throw away or even burn their iPhones if
Apple went ahead with the scheme. People don't want their actions
policed on THEIR phones.
This really scared Apple and they immediately did a 180 degree turn.
You're simply guessing. I'm using logic. They're different logical
algorithms.
Nope. You're guessing just as much as JR.
The facts - and we know you like them, but never look for them - are
that there are many convictions on a depressingly regular basis. Just
look at the press releases from the DoJ Project Safe Childhood:
https://www.justice.gov/psc/press-room
My statements aren't guesses. There have been plenty of convictions.
Unfortunately, there have also been privacy violations of innocent
people. And that's my primary concern when it comes to CSAM scanning.
That absurd thought process shows Alan Baker's IQ can't be above
around 40.
Your obsession with meaningless IQ numbers says way more about you than anyone else, little Arlen. And your continual drive to insult anyone who disagrees with you says more about you as well. You're a weak-minded
juvenile troll in a man's body.
Jolly Roger wrote on 25 Jul 2024 15:41:59 GMT :
Nothing you can say will change the fact that a greater-than-zero number
of people have been convicted from CSAM scanning - just like nothing you
can say will convince me that CSAM scanning can be done without
violating the privacy of innocent people. Things like this should not
happen:
I'm not disagreeing with you, Jolly Roger.
CSAM scanning is a violation of privacy.
The question isn't that it's a violation of privacy.
The question is whether it's worth that violation of privacy.
Jolly Roger wrote on 25 Jul 2024 19:22:24 GMT :
You're simply guessing. I'm using logic. They're different logical
algorithms.
Nope. You're guessing just as much as JR.
The facts - and we know you like them, but never look for them - are
that there are many convictions on a depressingly regular basis.
Just look at the press releases from the DoJ Project Safe Childhood:
https://www.justice.gov/psc/press-room
My statements aren't guesses. There have been plenty of convictions.
Unfortunately, there have also been privacy violations of innocent
people. And that's my primary concern when it comes to CSAM scanning.
The cite Chris listed said NOTHING whatsoever about the conviction
rate.
Jolly Roger wrote on 25 Jul 2024 15:38:54 GMT :
That absurd thought process shows Alan Baker's IQ can't be above
around 40.
Your obsession with meaningless IQ numbers says way more about you
than anyone else, little Arlen. And your continual drive to insult
anyone who disagrees with you says more about you as well. You're a
weak-minded juvenile troll in a man's body.
The only reason the IQ of the strange religious zealots matters is
that the low-IQ is partly why you zealots believe things that have no
basis in fact.
Jolly Roger wrote on 25 Jul 2024 15:36:48 GMT :
And actually, the only metric that matters is the number of innocent
people whose privacy may be violated. If that number is greater than
zero, then you can count me (and a whole lot of others who reserve their
right to privacy) out.
I'm agreeing with you that millions, and maybe even billions of innocent people are being violated by CSAM scanning by Apple, Google & Facebook.
Apple is harming us less than Google & Facebook, but harm is still harm.
The cite Chris listed said NOTHING whatsoever about the conviction
rate.
It certainly shows the conviction rate is higher than your claimed
"absolute zero". So it seems it was you who lied first. I realize you
want us to ignore this, but your focus on who lied demands we recognize
it.
That absurd thought process shows Alan Baker's IQ can't be above
around 40.
Your obsession with meaningless IQ numbers says way more about you
than anyone else, little Arlen. And your continual drive to insult
anyone who disagrees with you says more about you as well. You're a
weak-minded juvenile troll in a man's body.
The only reason the IQ of the strange religious zealots matters is
that the low-IQ is partly why you zealots believe things that have no
basis in fact.
*YAWN* You're a hopeless broken record, little Arlen.
The question isn't that it's a violation of privacy.
The question is whether it's worth that violation of privacy.
Ask the innocent people who have their privacy violated whether it was
worth it, and see what they tell you.
Jolly Roger wrote on 25 Jul 2024 21:07:20 GMT :
The question isn't that it's a violation of privacy. The question
is whether it's worth that violation of privacy.
Ask the innocent people who have their privacy violated whether it
was worth it, and see what they tell you.
Well, everyone who has photos that Google, Apple and Facebook can
'see', has been violated
Since you lack a normal IQ
Which is my point about CSAM.
The whole CSAM thing is pure bullshit.
a. Nobody is protected.
b. Everyone is harmed.
Repeating lies doesn't make them true.
On 2024-07-26 07:53, Chris wrote:
Which is my point about CSAM.
The whole CSAM thing is pure bullshit.
a. Nobody is protected.
b. Everyone is harmed.
Repeating lies doesn't make them true.
But that is quite literally the only tool he has!
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech
companies are rather than complain at Apple for not sending
millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions
resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in child
safety?
Apple's solution wouldn't have resulted in any additional loss of
privacy
Actually, Apple could not guarantee that, and there was a non-zero
chance that false positive matches would result in privacy
violations.
True. The balance of risk was proportionate, however. Much moreso than
the current system.
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech companies are rather than complain at Apple for not sending
millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions
resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in child
safety?
Apple's solution wouldn't have resulted in any additional loss of
privacy
Actually, Apple could not guarantee that, and there was a non-zero
chance that false positive matches would result in privacy
violations.
True. The balance of risk was proportionate, however. Much moreso than
the current system.
Absolutely. I'm just of the opinion if one innocent person is harmed,
that's one too many. Would you want to be that unlucky innocent person
who has to deal with charges, a potential criminal sexual violation on
your record, and all that comes with it? I certainly wouldn't.
On 2024-07-26, Andrew <andrew@spam.net> wrote:
Jolly Roger wrote on 25 Jul 2024 21:07:20 GMT :
The question isn't that it's a violation of privacy. The question
is whether it's worth that violation of privacy.
Ask the innocent people who have their privacy violated whether it
was worth it, and see what they tell you.
Well, everyone who has photos that Google, Apple and Facebook can
'see', has been violated
Apple doesn't scan their users' photos for CSAM.
Since you lack a normal IQ
Straight to the insults
On 2024-07-26 09:11, Jolly Roger wrote:
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the
tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions
resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in
child safety?
Apple's solution wouldn't have resulted in any additional loss of
privacy
Actually, Apple could not guarantee that, and there was a non-zero
chance that false positive matches would result in privacy
violations.
True. The balance of risk was proportionate, however. Much moreso
than the current system.
Absolutely. I'm just of the opinion if one innocent person is harmed,
that's one too many. Would you want to be that unlucky innocent
person who has to deal with charges, a potential criminal sexual
violation on your record, and all that comes with it? I certainly
wouldn't.
Except that Apple's system wouldn't automatically trigger charges.
An actual human would review the images in question...
...AND since they were comparing images against KNOWN CSAM, false
positives would naturally be very few to begin with.
Jolly Roger wrote on 25 Jul 2024 21:12:36 GMT :
The cite Chris listed said NOTHING whatsoever about the conviction
rate.
It certainly shows the conviction rate is higher than your claimed
"absolute zero". So it seems it was you who lied first. I realize you
want us to ignore this, but your focus on who lied demands we
recognize it.
zealots lack a normal IQ
You both fabricated imaginary convictions
You lied.
Just showing convictions is completely meaningless to this point.
On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
On 2024-07-26 09:11, Jolly Roger wrote:
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the
tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in
child safety?
Apple's solution wouldn't have resulted in any additional loss of
privacy
Actually, Apple could not guarantee that, and there was a non-zero
chance that false positive matches would result in privacy
violations.
True. The balance of risk was proportionate, however. Much moreso
than the current system.
Absolutely. I'm just of the opinion if one innocent person is harmed,
that's one too many. Would you want to be that unlucky innocent
person who has to deal with charges, a potential criminal sexual
violation on your record, and all that comes with it? I certainly
wouldn't.
Except that Apple's system wouldn't automatically trigger charges.
An actual human would review the images in question...
And at that point, someone's privacy may be violated. Do you want a
stranger looking at photos of your sick child? What if that stranger
came to the conclusion that those photos are somehow classifiable as
sexual or abusive in some way? Would you want to have to argue your case
in court because of it?
...AND since they were comparing images against KNOWN CSAM, false
positives would naturally be very few to begin with.
Yes, but one is one too many in my book.
On 2024-07-26 15:14, Jolly Roger wrote:
On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
On 2024-07-26 09:11, Jolly Roger wrote:
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in
child safety?
Apple's solution wouldn't have resulted in any additional loss of privacy
Actually, Apple could not guarantee that, and there was a non-zero chance that false positive matches would result in privacy
violations.
True. The balance of risk was proportionate, however. Much moreso
than the current system.
Absolutely. I'm just of the opinion if one innocent person is harmed,
that's one too many. Would you want to be that unlucky innocent
person who has to deal with charges, a potential criminal sexual
violation on your record, and all that comes with it? I certainly
wouldn't.
Except that Apple's system wouldn't automatically trigger charges.
An actual human would review the images in question...
And at that point, someone's privacy may be violated. Do you want a
stranger looking at photos of your sick child? What if that stranger
came to the conclusion that those photos are somehow classifiable as
sexual or abusive in some way? Would you want to have to argue your case
in court because of it?
Yes. At that point...
...if and only if the person is INNOCENT...
...someone's privacy is unnecessarily violated.
And it's a stretch to imagine that:
1. Innocent pictures would be matched with KNOWN CSAM images, AND;
(the logical AND)
2. A person reviewing those images after they've been flagged wouldn't
notice they don't actually match; AND
3. The owner of those images at that point would be charged when they
could then show that they were in fact innocent images.
Yes, but one is one too many in my book.
And yet you are fine with innocent people's privacy being violated
when a search warrant is issued erroneously.
Actually, both Chris and I provided actual documentation of actual convictions.
Chris just lied about convictions.
Why did Chris lie?
Given I was responding to your claim of "ZERO convictions"...
Jolly Roger wrote on 26 Jul 2024 22:24:28 GMT :
Actually, both Chris and I provided actual documentation of actual
convictions.
Oh Jesus. Everyone, yes, even you zealots, knows there are convictions. Nobody said otherwise.
Chris wrote on Fri, 26 Jul 2024 15:23:16 +0100 :
Chris just lied about convictions.
Why did Chris lie?
Given I was responding to your claim of "ZERO convictions"...
Do you realize the number of convictions was never in dispute, Chris?
What is in dispute is your claims that Apple/Google/Facebook CSAM scanning has a 100% conviction rate (which is essentially your claim, Chris).
On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
On 2024-07-26 15:14, Jolly Roger wrote:
On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
On 2024-07-26 09:11, Jolly Roger wrote:
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in child safety?
Apple's solution wouldn't have resulted in any additional loss of privacy
Actually, Apple could not guarantee that, and there was a non-zero chance that false positive matches would result in privacy
violations.
True. The balance of risk was proportionate, however. Much moreso
than the current system.
Absolutely. I'm just of the opinion if one innocent person is harmed, that's one too many. Would you want to be that unlucky innocent
person who has to deal with charges, a potential criminal sexual
violation on your record, and all that comes with it? I certainly
wouldn't.
Except that Apple's system wouldn't automatically trigger charges.
An actual human would review the images in question...
And at that point, someone's privacy may be violated. Do you want a
stranger looking at photos of your sick child? What if that stranger
came to the conclusion that those photos are somehow classifiable as
sexual or abusive in some way? Would you want to have to argue your case in court because of it?
Yes. At that point...
...if and only if the person is INNOCENT...
...someone's privacy is unnecessarily violated.
And it's a stretch to imagine that:
1. Innocent pictures would be matched with KNOWN CSAM images, AND;
No, it's not. There was a margin of error in the proposed matching algorithms.
(the logical AND)
2. A person reviewing those images after they've been flagged wouldn't
notice they don't actually match; AND
That decision is a human one, and humans make mistakes and have biased beliefs that can lead them to make faulty decisions.
3. The owner of those images at that point would be charged when they
could then show that they were in fact innocent images.
Innocent people shouldn't have to prove anything to anyone.
Yes, but one is one too many in my book.
And yet you are fine with innocent people's privacy being violated
when a search warrant is issued erroneously.
Search warrants require probable cause and are signed by a judge.
Totally different scenario.
But that is quite literally the only tool he has!
It really is. When his lies are laid bare and shown to be false, he inevitably just doubles down and tells them over and over again. It's extremely childish.
Jolly Roger wrote on 26 Jul 2024 16:12:58 GMT :
But that is quite literally the only tool he has!
It really is. When his lies are laid bare and shown to be false, he
inevitably just doubles down and tells them over and over again. It's
extremely childish.
I find it rather interesting that the zealots claim an essentially 100% conviction rate based on the Apple/Google/Facebook CSAM reports...
Since you lack a normal IQ
Straight to the insults
On 2024-07-27 11:38, Andrew wrote:
Jolly Roger wrote on 26 Jul 2024 22:24:28 GMT :
Actually, both Chris and I provided actual documentation of actual
convictions.
Oh Jesus. Everyone, yes, even you zealots, knows there are
convictions. Nobody said otherwise.
Actually:
'It's probably zero given it's the most important metric. And it's
missing from the story.'
Jolly Roger wrote on 26 Jul 2024 22:24:28 GMT :
Actually, both Chris and I provided actual documentation of actual
convictions.
Oh Jesus. Everyone, yes, even you zealots, knows there are
convictions. Nobody said otherwise.
All your strongly held beliefs are based on exactly zero facts, JR.
'It's probably zero given it's the most important metric. And it's
missing from the story.'
And before that he literally said "absolutely zero" convictions have
resulted from CSAM scanning - right here in this thread. Arlen is a
clown.
On 2024-07-27 11:58, Andrew wrote:
Jolly Roger wrote on 26 Jul 2024 16:12:58 GMT :
But that is quite literally the only tool he has!
It really is. When his lies are laid bare and shown to be false, he
inevitably just doubles down and tells them over and over again. It's
extremely childish.
I find it rather interesting that the zealots claim an essentially 100%
conviction rate based on the Apple/Google/Facebook CSAM reports...
Arlen...
...why must you lie?
Literally no one has made that claim.
Oh Jesus. Everyone, yes, even you zealots, knows there are
convictions. Nobody said otherwise.
Liar. *YOU* said there were "absolutely zero" convictions. You're trying
to pretend you didn't say that, but it's on record right here in this
thread.
Jolly Roger wrote on 26 Jul 2024 16:08:29 GMT :
Since you lack a normal IQ
Straight to the insults
Hi Jolly Roger,
To put it bluntly, every assessment that you make is based on your low
IQ not being able to discern between convictions and conviction rates
due to the reporting by Apple, Google & Facebook.
They're not the same thing.
First he says there have been "absolutely zero" convictions as a result
of CSAM scanning.
You're the one who started out with "absolutely zero" convictions, you complete asshole. Then you walked it back to "probably zero", and now
you are fabricating things nobody here has said - all while desperately trying to move the goal post to conviction rates. And the fact that you actually think you are fooling anyone here reflects more on your own IQ
than anyone else's. You're a pathetic weak-minded fool, little Arlen.
Take your childish insults and shove them where the sun don't shine.
Jolly Roger wrote on 28 Jul 2024 01:56:24 GMT :
Oh Jesus. Everyone, yes, even you zealots, knows there are
convictions. Nobody said otherwise.
Liar. *YOU* said there were "absolutely zero" convictions. You're
trying to pretend you didn't say that, but it's on record right here
in this thread.
Oh my God, Jolly Roger...
You still don't get it that nobody said people aren't convicted of the
crime.
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Jolly Roger wrote on 28 Jul 2024 01:57:26 GMT :
'It's probably zero given it's the most important metric. And it's
missing from the story.'
And before that he literally said "absolutely zero" convictions have
resulted from CSAM scanning - right here in this thread. Arlen is a
clown.
Jesus Christ, Jolly Roger.
You can't tell the difference between the number of convictions from
all sources and the percentage of convictions from Apple, Google or FB reports?
Who is that stupid?
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Jolly Roger wrote on 28 Jul 2024 02:36:58 GMT :
First he says there have been "absolutely zero" convictions as a result
of CSAM scanning.
It's no longer shocking that your IQ is so dismally low that you can't tell the difference between the number of convictions from all sources, and the percentage (or number) of convictions from just Apple.
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Jolly Roger wrote on 28 Jul 2024 02:39:40 GMT :
You're the one who started out with "absolutely zero" convictions, you
complete asshole. Then you walked it back to "probably zero", and now
you are fabricating things nobody here has said - all while desperately
trying to move the goal post to conviction rates. And the fact that you
actually think you are fooling anyone here reflects more on your own IQ
than anyone else's. You're a pathetic weak-minded fool, little Arlen.
Take your childish insults and shove them where the sun don't shine.
For Christ's sake. It's no longer surprising you can't tell the difference between the number of convictions versus the number from Apple's reporting.
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
That you can't tell the difference is how I know your IQ is below normal.
Jolly Roger wrote on 28 Jul 2024 16:16:59 GMT :
Liar - here are your own words, little Arlen, where you say "absolutely
zero" were caught and convicted:
You not comprehending
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Jolly Roger wrote on 28 Jul 2024 16:21:06 GMT :
You don't remember your own words, where you claimed there have been
"absolutely zero" pedophiles caught and convicted as a result of CSAM
scanning? Here, let's refresh your rotten memory:
You not comprehending the difference between zero percent of Apple
reports versus zero total convictions is how I know you zealots own
subnormal IQs.
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
On 2024-07-26 09:11, Jolly Roger wrote:
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in
child safety?
Apple's solution wouldn't have resulted in any additional loss
of privacy
Actually, Apple could not guarantee that, and there was a
non-zero chance that false positive matches would result in
privacy violations.
True. The balance of risk was proportionate, however. Much moreso
than the current system.
Absolutely. I'm just of the opinion if one innocent person is
harmed, that's one too many. Would you want to be that unlucky
innocent person who has to deal with charges, a potential criminal
sexual violation on your record, and all that comes with it? I
certainly wouldn't.
Except that Apple's system wouldn't automatically trigger charges.
An actual human would review the images in question...
And at that point, someone's privacy may be violated.
You're entering into Confucius territory: if nothing is triggered, is anyone's privacy infringed?
Do you want a stranger looking at photos of your sick child?
That wouldn't happen with Apple's method.
What if that stranger came to the conclusion that those photos are
somehow classifiable as sexual or abusive in some way? Would you want
to have to argue your case in court because of it?
That's a lot of ifs and steps.
No-one is going to be charged for a dubious photo of their own child. There are much bigger fish to fry and put in jail.
...AND since they were comparing images against KNOWN CSAM, false
positives would naturally be very few to begin with.
Yes, but one is one too many in my book.
How many children are you prepared to let be abused to protect YOUR
privacy?
Apple was wise to shelve this proposal
I think you need to have a lie down. You literally are making no sense anymore.
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one
you're arguing for. lol.
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:
No-one is going to be charged for a dubious photo of their own
child. There are much bigger fish to fry and get into jail.
You're wrong. It has already happened:
A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
Him as a Criminal <https://archive.is/78Pla#selection-563.0-1075.217>
I explicitly said "charged". No-one got charged. The law is working
just fine. It's the tech, as I've been arguing all along, that's the
problem.
Read the whole article to get a glimpse of what innocent people go
through who fall victim to this invasive scanning.
Do you think these parents and their child consider their privacy to
be violated? How would you feel if your intimate photos were added to
the PhotoDNA CSAM database because they were incorrectly flagged?
This wasn't PhotoDNA, which is what Apple's approach resembled. It was
Google's AI method that is designed to "recognize never-before-seen
exploitative images of children", which is where the real danger sits.
It is designed to identify new abuse images based only on the pixel
data, so all hits will be massively enriched for things that look like
abuse. A human reviewer won't have the ability to accurately identify
the (likely innocent) motivation for taking the photo and, "to be safe",
will pass it on to someone else to make the decision, i.e. law
enforcement. Law enforcement will have access to much more information
and see it's an obvious mistake, as seen in your article.
Apple's system was more like hashing the image data and comparing
hashes, where false positives are due to algorithmic randomness. The
pixel data, when viewed by a human, won't look anything like CSAM, so an
easy decision can be made.
What's crucial here is that Google are looking for new stuff - which
is always problematic - whereas Apple's was not. The search space when
looking for existing images is much tinier and the impact of false
positives much, much smaller.
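To make that distinction concrete, here is a minimal sketch of matching images against a fixed set of known-image fingerprints. It is purely illustrative: it uses a simple average hash (via the Pillow library) rather than Apple's NeuralHash or Microsoft's PhotoDNA, and the KNOWN_HASHES values, the matches_known() helper and the Hamming-distance threshold are all hypothetical.

# Illustrative only - NOT Apple's NeuralHash or Microsoft's PhotoDNA.
# Shows the general shape of matching images against a fixed set of
# *known* fingerprints, as opposed to a classifier scoring novel images.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Compute a simple 64-bit perceptual (average) hash of an image."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known images (placeholder values).
KNOWN_HASHES = {0x8F3B1C2A44D5E6F7, 0x0123456789ABCDEF}

def matches_known(path: str, max_distance: int = 4) -> bool:
    """Flag only near-duplicates of already-known images; genuinely new
    images never match, which is why the search space (and the false
    positive surface) is so much smaller than with a classifier trained
    to spot never-before-seen material."""
    h = average_hash(path)
    return any(hamming(h, k) <= max_distance for k in KNOWN_HASHES)

A classifier, by contrast, assigns a score to every image it is shown, so every photo is a candidate for a false positive rather than only the tiny set that happens to collide with a known fingerprint.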
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on
the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction
rate is high (based on the number of reports), then they are NOT
underreporting images.
Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
Actually, a human being does review it with Google's system:
I was unclear. I'm not saying a human doesn't review, I'm saying that given the dozens/hundreds of suspected abuse images they review a day
they won't have the ability to make informed decisions.
---
A human content moderator for Google would have reviewed the photos
after they were flagged by the artificial intelligence to confirm they
met the federal definition of child sexual abuse material.
What kind of a person would want this job?
On 29/07/2024 21:04, badgolferman wrote:
Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
Actually, a human being does review it with Google's system:
I was unclear. I'm not saying a human doesn't review, I'm saying that given the dozens/hundreds of suspected abuse images they review a day they won't have the ability to make informed decisions.
---
A human content moderator for Google would have reviewed the photos
after they were flagged by the artificial intelligence to confirm they met the federal definition of child sexual abuse material.
What kind of a person would want this job?
I read an article a couple of years ago on the Facebook content
moderators. Many ended up traumatised and got no support. God it was a
grim read.
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction
rate is high (based on the number of reports), then they are NOT
underreporting images.
Apple's reporting rate is ZERO, because they're not doing scanning of
images of any kind.
Alan wrote:
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction
rate is high (based on the number of reports), then they are NOT
underreporting images.
Apple's reporting rate is ZERO, because they're not doing scanning of
images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
In August 2021, Apple announced a plan to scan photos that users stored
in iCloud for child sexual abuse material (CSAM). The tool was meant to
be privacy-preserving and allow the company to flag potentially
problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were
concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At
the beginning of September 2021, Apple said it would pause the rollout
of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a
launch was still coming.
Parents and caregivers can opt into the protections through family
iCloud accounts. The features work in Siri, Apple’s Spotlight search,
and Safari Search to warn if someone is looking at or searching for
child sexual abuse materials and provide resources on the spot to report
the content and seek help.
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I
use macOS software on Apple hardware.
Today, I was browsing some local images in a subfolder of my Documents folder, some HEIC files taken with an iPhone and copied to the Mac using
the Image Capture program (used for dumping photos from an iOS device attached with an USB cable).
I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. I have all network access denied for a
lot of Apple OS-level apps because I’m not interested in transmitting
any of my data whatsoever to Apple over the network - mostly because
Apple turns over customer data on over 30,000 customers per year to US federal police without any search warrant per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.
Imagine my surprise when browsing these images in the Finder, Little
Snitch told me that macOS is now connecting to Apple APIs via a program
named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files).
...
Integrate this data and remember it: macOS now contains network-based
spyware even with all Apple services disabled. It cannot be disabled via controls within the OS: you must use third party network filtering
software (or external devices) to prevent it.
This was observed on the current version of macOS, macOS Ventura 13.1.
Jolly Roger <jollyroger@pobox.com> wrote:
Yes, but even in Apple's case, there's a small chance of a false
positive match. And were that to happen, there is a danger of an
innocent person's privacy being violated.
In every case there's a chance of FPs. Apple would have had a lower FPR than *the current* system.
Given the choice I'm in favour of the better, evidence-based method.
You're in favour of the worse system
Nope. I don't support any scanning of private content.
Yet it's already happening so why not support the better method?
I agree - still not good enough for me though.
"Perfect is the enemy of the good"
By seeking perfection you and others are allowing and enabling child
abuse.
Apple only shelved it for PR reasons, which is a real shame.
You don't know all of Apple's motivations. What we know is Apple shelved
it after gathering feedback from industry experts. And many of those
experts were of the opinion that even with Apple's precautions, the risk
of violating people's privacy was too great.
That wasn't the consensus. The noisy tin-foil brigade drowned out any possible discussion.
Apple should have sim
Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
Actually, a human being does review it with Google's system:
I was unclear. I'm not saying a human doesn't review, I'm saying that
given the dozens/hundreds of images of suspected abuse images they
review a day they won't have the ability to make informed decisions.
--- A human content moderator for Google would have reviewed the
photos after they were flagged by the artificial intelligence to
confirm they met the federal definition of child sexual abuse
material.
What kind of a person would want this job?
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never
changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on
the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction
rate is high (based on the number of reports), then they are NOT
underreporting images.
Apple's reporting rate is ZERO, because they're not doing scanning of
images of any kind.
On 2024-07-29 15:11, Chips Loral wrote:
Alan wrote:
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction rate is high (based on the number of reports), then they are NOT underreporting images.
Apple's reporting rate is ZERO, because they're not doing scanning of
images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
In August 2021, Apple announced a plan to scan photos that users
stored in iCloud for child sexual abuse material (CSAM). The tool was
meant to be privacy-preserving and allow the company to flag
potentially problematic and abusive content without revealing anything
else. But the initiative was controversial, and it soon drew
widespread criticism from privacy and security researchers and digital
rights groups who were concerned that the surveillance capability
itself could be abused to undermine the privacy and security of iCloud
users around the world. At the beginning of September 2021, Apple said
it would pause the rollout of the feature to “collect input and make
improvements before releasing these critically important child safety
features.” In other words, a launch was still coming.
Parents and caregivers can opt into the protections through family
iCloud accounts. The features work in Siri, Apple’s Spotlight search,
and Safari Search to warn if someone is looking at or searching for
child sexual abuse materials and provide resources on the spot to
report the content and seek help.
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind -
I use macOS software on Apple hardware.
Today, I was browsing some local images in a subfolder of my Documents
folder, some HEIC files taken with an iPhone and copied to the Mac
using the Image Capture program (used for dumping photos from an iOS
device attached with an USB cable).
I use a program called Little Snitch which alerts me to network
traffic attempted by the programs I use. I have all network access
denied for a lot of Apple OS-level apps because I’m not interested in
transmitting any of my data whatsoever to Apple over the network -
mostly because Apple turns over customer data on over 30,000 customers
per year to US federal police without any search warrant per Apple’s
own self-published transparency report. I’m good without any of that
nonsense, thank you.
Imagine my surprise when browsing these images in the Finder, Little
Snitch told me that macOS is now connecting to Apple APIs via a
program named mediaanalysisd (Media Analysis Daemon - a background
process for analyzing media files).
...
Integrate this data and remember it: macOS now contains network-based
spyware even with all Apple services disabled. It cannot be disabled
via controls within the OS: you must use third party network
filtering software (or external devices) to prevent it.
This was observed on the current version of macOS, macOS Ventura 13.1.
'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
results to an Apple server. This claim was made by a cybersecurity
researcher named Jeffrey Paul. However, after conducting a thorough
analysis of the process, it has been determined that this is not the case.'
Alan wrote:
On 2024-07-29 15:11, Chips Loral wrote:
Alan wrote:
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction rate is high (based on the number of reports), then they are NOT underreporting images.
Apple's reporting rate is ZERO, because they're not doing scanning
of images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
In August 2021, Apple announced a plan to scan photos that users
stored in iCloud for child sexual abuse material (CSAM). The tool was
meant to be privacy-preserving and allow the company to flag
potentially problematic and abusive content without revealing
anything else. But the initiative was controversial, and it soon drew
widespread criticism from privacy and security researchers and
digital rights groups who were concerned that the surveillance
capability itself could be abused to undermine the privacy and
security of iCloud users around the world. At the beginning of
September 2021, Apple said it would pause the rollout of the feature
to “collect input and make improvements before releasing these
critically important child safety features.” In other words, a launch
was still coming.
Parents and caregivers can opt into the protections through family
iCloud accounts. The features work in Siri, Apple’s Spotlight search,
and Safari Search to warn if someone is looking at or searching for
child sexual abuse materials and provide resources on the spot to
report the content and seek help.
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos”
application, even locally. I never opted in to Apple network services
of any kind - I use macOS software on Apple hardware.
Today, I was browsing some local images in a subfolder of my
Documents folder, some HEIC files taken with an iPhone and copied to
the Mac using the Image Capture program (used for dumping photos from
an iOS device attached with an USB cable).
I use a program called Little Snitch which alerts me to network
traffic attempted by the programs I use. I have all network access
denied for a lot of Apple OS-level apps because I’m not interested in
transmitting any of my data whatsoever to Apple over the network -
mostly because Apple turns over customer data on over 30,000
customers per year to US federal police without any search warrant
per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.
Imagine my surprise when browsing these images in the Finder, Little
Snitch told me that macOS is now connecting to Apple APIs via a
program named mediaanalysisd (Media Analysis Daemon - a background
process for analyzing media files).
...
Integrate this data and remember it: macOS now contains network-based
spyware even with all Apple services disabled. It cannot be disabled
via controls within the OS: you must use third party network
filtering software (or external devices) to prevent it.
This was observed on the current version of macOS, macOS Ventura 13.1.
'A recent thread on Twitter raised concerns that the macOS process
mediaanalysisd, which scans local photos, was secretly sending the
results to an Apple server. This claim was made by a cybersecurity
researcher named Jeffrey Paul. However, after conducting a thorough
analysis of the process, it has been determined that this is not the
case.'
Bullshit.
https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html
Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.
Apple's new photo-scanning feature will scan photos stored in iCloud to
see whether they match known Child Sexual Abuse Material (CSAM). The
problem with this, like many others, is that we often have hundreds of
photos of our children and grandchildren, and who knows how good or bad
the new software scanning technology is? Apple claims false positives
are one trillion to one, and there is an appeals process in place. That
said, one mistake from this AI, just one, could have an innocent person
sent to jail and their lives destroyed.
Apple has many other features as part of these upgrades to protect
children, and we like them all, but photo-scanning sounds like a problem waiting to happen.
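As a rough sanity check on how an account-level figure like the "one trillion to one" claim quoted above could end up many orders of magnitude smaller than any per-image error rate, here is a minimal back-of-the-envelope sketch. It assumes a hypothetical per-image false-match probability and a hypothetical match threshold before human review; none of these numbers are Apple's published parameters.

import math

def p_account_flagged(n_images: int, p_fp: float, threshold: int,
                      extra_terms: int = 200) -> float:
    """Upper-tail binomial probability P(X >= threshold), where X is the
    number of false hash matches among n_images independent images, each
    matching falsely with probability p_fp. Terms are summed in log space
    to avoid overflow; later terms shrink so fast that a couple of hundred
    of them are plenty for the tiny p_fp values considered here."""
    total = 0.0
    last_k = min(threshold + extra_terms, n_images)
    for k in range(threshold, last_k + 1):
        log_term = (math.lgamma(n_images + 1)
                    - math.lgamma(k + 1)
                    - math.lgamma(n_images - k + 1)
                    + k * math.log(p_fp)
                    + (n_images - k) * math.log(1.0 - p_fp))
        total += math.exp(log_term)
    return total

# Example with made-up inputs: 10,000 photos in a library, a 1-in-a-million
# per-image false match rate, and a 30-match threshold before any review.
print(p_account_flagged(10_000, 1e-6, 30))  # roughly 4e-93 - vanishingly small

The only point of the sketch is that requiring many independent matches before anything is flagged drives the per-account probability down dramatically; it says nothing about whether the real per-image error rate, the real threshold, or the independence assumption match Apple's actual system.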
Here are all of the "features" that come with anti-CSAM, expected to
roll out with iOS 15 in the fall of 2021.
Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.
iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
Siri and Search: Siri and Search will provide additional resources to
help children and parents stay safe online and get help with unsafe situations.
Now that you understand how anti-CSAM works, the only way to avoid
having your photos scanned by this system is to disable iCloud Photos.
Your photos are scanned when you automatically upload your photos to the cloud, so the only current way to avoid having them scanned is not to
upload them.
This adds an interesting problem. The majority of iPhone users use
iCloud to back up their photos (and everything else). If you disable
iCloud, you will need to back up your photos manually. If you have a PC
or Mac, you can always copy them to your computer and back them up. You
can also consider using another cloud service for backups.
Let's talk about disabling iCloud and also removing any photos you
already have uploaded. You will have 30 days to recover your photos if
you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.
You'll want to backup and disable iCloud, then verify that no photos
were left on their servers.
Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable iCloud Photos
First, we can disable the uploading of iCloud photos while keeping all
other backups, including your contacts, calendars, notes, and more.
Click on Settings.
At the top, click on your name.
Click on iCloud.
Click on Photos.
Uncheck iCloud Photos.
You will be prompted to decide what to do with your current photos.
If you have the space on your phone, you can click on Download Photos & Videos, and your photos will all be on your iPhone, ready to back up somewhere else.
Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server
While all of your photos should be deleted from Apple's server, we
should verify that.
Click on Settings.
At the top, click on your name.
Click on iCloud.
Click on Manage Storage.
Click on Photos.
Click on Disable & Delete
https://discussions.apple.com/thread/254538081?sortBy=rank
https://www.youtube.com/watch?v=K_i8rTiXTd8
How to disable Apple scanning your photos in iCloud and on device. The
new iOS 15 update will scan iPhone photos and alert authorities if any
of them contain CSAM. Apple Messages also gets an update to scan and
warn parents if it detects an explicit image being sent or received.
This video discusses the new Apple update, privacy implications, how to disable iPhone photo scanning, and offers a commentary on tech companies
and the issue of privacy and electronic surveillance.
On 2024-07-29 17:10, Chips Loral wrote:
Bullshit.
That discusses a system that Apple disabled.
And doesn't support your first source AT ALL.
'Mysk:
No, macOS doesn’t send info about your local photos to Apple We
analyzed mediaanalysisd after an extraordinary claim by Jeffrey Paul
that it scans local photos and secretly sends the results to an Apple
server.
[…]
We analyzed the network traffic sent and received by mediaanalysisd.
Well, the call is literally empty. We decrypted it. No headers, no
IDs, nothing. Just a simple GET request to this endpoint that returns nothing. Honestly, it looks like it is a bug.
Mysk:
The issue was indeed a bug and it has been fixed in macOS 13.2. The
process no longer makes calls to Apple servers.'
<https://mjtsai.com/blog/2023/01/25/network-connections-from-mediaanalysisd/>
On 2024-07-29 15:11, Chips Loral wrote:
Alan wrote:
Apple's reporting rate is ZERO, because they're not doing scanning
of images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
results to an Apple server. This claim was made by a cybersecurity
researcher named Jeffrey Paul. However, after conducting a thorough
analysis of the process, it has been determined that this is not the
case.'
<https://pawisoon.medium.com/debunked-the-truth-about-mediaanalysisd-and-apples-access-to-your-local-photos-on-macos-a42215e713d1>
'The mediaanalysisd process is a background task that starts every
time an image file is previewed in Finder, and then calls an Apple
service. The process is designed to run machine learning algorithms
to detect objects in photos and make object-based search possible in
the Photos app. It also helps Finder to detect text and QR codes in
photos. Even if a user does not use the Photos app or have an iCloud
account, the process will still run.'
Apple is not scanning your photos for CSAM
Alan wrote:
The issue was indeed a bug
https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html
Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.
The issue was indeed a bug