A few days ago, there was a wave of panic following a report that macOS had scanned a QR code in an image and accessed the link encoded within it. A few hours later, it turned out the whole thing was a false alarm, but by then it was clear that many Mac users believe that Apple scans images on your Mac.

Much of the current climate of suspicion over Apple and images stems from what in retrospect was a misguided attempt to consult openly about changes Apple had intended to introduce a year ago in Monterey, to scan for images containing CSAM (Child Sexual Abuse Material). Like so many contentious issues, this was widely misunderstood and misinterpreted, and a lot of people came away with the belief that macOS would soon (if not already) be scanning images stored on their Mac in a bid to detect those containing CSAM.

This article looks at the reality behind that fear.