Negosyante News

November 22, 2024 2:06 pm

Apple to include detection of child sexual abuse material in future iOS update

Apple has announced that iOS, the operating system for its mobile devices such as the iPhone and iPad, will soon start detecting images containing child sexual abuse material (CSAM) and reporting them as they are uploaded to iCloud.

The update to Apple’s operating system will scan photos as they are uploaded to iCloud, allowing Apple to report its findings to the National Center for Missing & Exploited Children (NCMEC).

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of CSAM,” said Apple.

The new feature matches photos on a user’s device against a database of known CSAM images provided by child safety organizations, then flags matching images as they are uploaded to iCloud.
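Purely as an illustration (Apple has not published its matching code; its system is reported to rely on a technology it calls NeuralHash), a minimal sketch of this kind of database matching using a simple perceptual hash might look like the Python below. The hash function, the distance threshold, and the KNOWN_HASHES database are all hypothetical stand-ins.

```python
# Illustrative sketch only; not Apple's actual algorithm.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to grayscale, then set one bit per pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of hashes of known CSAM supplied by child safety groups.
KNOWN_HASHES: set[int] = set()


def should_flag(path: str, threshold: int = 5) -> bool:
    """Flag an upload if its hash is close to any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```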

The feature is part of a series of new tools heading to Apple mobile devices.

Apple’s Messages app on the iPhone will also use machine learning to recognize sexually explicit photos and warn children, as well as their parents, when such photos are received or sent.
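Again only as a sketch, and assuming the feature behaves like a thresholded on-device classifier (Apple has not published its model), the warning flow could be imagined roughly as follows; explicit_probability, handle_photo, and the 0.9 threshold are hypothetical names and values.

```python
# Hypothetical sketch of a warning flow; Apple's real implementation is not public.

def explicit_probability(image_bytes: bytes) -> float:
    """Stand-in for an on-device model that scores how likely an image is
    sexually explicit (0.0 to 1.0)."""
    return 0.0  # a real implementation would run an image classifier here


def handle_photo(image_bytes: bytes, is_child_account: bool,
                 threshold: float = 0.9) -> str:
    """Blur the photo and warn the user when the score crosses the threshold
    on a child account; otherwise deliver it normally."""
    if is_child_account and explicit_probability(image_bytes) >= threshold:
        return "blurred_with_warning"  # parents can also be notified
    return "delivered"
```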

Furthermore, its personal assistant, Siri, will be taught to “intervene” when users try to search for topics related to child sexual abuse.

About 7,000,000 children are sexually abused every year in the Philippines.

More than 70% of sexually abused children are between 10 and 18 years old, while 20% of victims are under 6 years old.

Currently, Apple holds only 15% of the mobile market share in the Philippines. Hopefully, Android, which holds the remaining 85%, finds a way to replicate this technology on its devices.

SOURCE: ABS-CBN, Cameleon Association, Statista
