Apple is officially taking on child predators with new safety features for iPhone and iPad.
One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.
So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.
Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).
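To make that flow concrete, here is a minimal sketch of on-device matching with a client-side threshold. It is not Apple's implementation: NeuralHash is proprietary, so a plain SHA-256 digest stands in for the perceptual hash, and the SafetyScanner type, its method names, the known-hash set, and the threshold value are all illustrative assumptions.

import Foundation
import CryptoKit

struct SafetyScanner {
    let knownHashes: Set<String>   // hypothetical on-device copy of known CSAM hashes
    let threshold: Int             // nothing is surfaced until this many matches accumulate
    private(set) var matchCount = 0

    // Placeholder for the proprietary perceptual hash (NeuralHash); SHA-256 used only for illustration.
    func imageHash(_ data: Data) -> String {
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    // Runs before a photo is uploaded to iCloud Photos in the described design.
    mutating func checkBeforeUpload(_ data: Data) -> Bool {
        if knownHashes.contains(imageHash(data)) {
            matchCount += 1
        }
        // Below the threshold the server learns nothing; the real design
        // enforces this cryptographically with threshold secret sharing.
        return matchCount >= threshold
    }
}

// Example usage with made-up values:
var scanner = SafetyScanner(knownHashes: ["<known hash>"], threshold: 30)
let shouldFlag = scanner.checkBeforeUpload(Data("example image bytes".utf8))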
It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.
“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
We've reached out to Apple for comment and will update this story when we hear back.
Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Once a device crosses that threshold, the report is manually reviewed. If Apple finds a match, it disables the user's account and a report is sent to NCMEC. Users who think their account has been flagged by mistake will have to file an appeal in order to get it back.
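As a rough illustration of that escalation path (the names below are hypothetical, not Apple API):

enum FlagOutcome {
    case noAction          // threshold not crossed; nothing is reviewed
    case dismissed         // human reviewer judged it a false positive
    case accountDisabled   // match confirmed: account disabled, report sent to NCMEC, user may appeal
}

func resolve(thresholdCrossed: Bool, reviewerConfirmsMatch: () -> Bool) -> FlagOutcome {
    guard thresholdCrossed else { return .noAction }
    return reviewerConfirmsMatch() ? .accountDisabled : .dismissed
}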
While it's tough to criticize a company for wanting to crack down on child pornography, the fact that Apple has the ability to scan someone's photos in general is concerning. It's even worse to think that an actual human being might look through private images only to realize an account was mistakenly identified.
SEE ALSO: Apple addresses AirTags security flaw with minor privacy update
It's also ironic that Apple, the company that brags about its privacy initiatives, specifically its privacy Nutrition Labels and App Tracking Transparency, has taken this step.
Apple assures users that its CSAM detection "is designed with user privacy in mind," which is why it matches an image on-device before it's sent to iCloud Photos. But Apple said the same thing about AirTags, and, well, those turned out to be a privacy nightmare.
Topics: Cybersecurity, iPhone, Privacy