Report lists unsafe online platforms for youth

Child rights organization Plan International Pilipinas has released an assessment that provides a comprehensive overview of online platform usage, preferences, behaviors, and vulnerabilities of Filipino children and youth, along with recommendations to strengthen child protection mechanisms across these platforms.
Presented to the public in celebration of Safer Internet Day on Feb. 26, the 2024 assessment covers Cavite, Manila, Baguio, Iloilo, Guimaras, Leyte, and Davao.
Findings show that children and young people primarily use digital platforms for communication (64.3 percent), education (47.7 percent), leisure (37.2 percent), and stress reduction (34.3 percent). Among mainstream platforms, Facebook (33.2 percent) and Messenger (23.3 percent) remain the most widely used.
For nonmainstream platforms, the anime streaming and video-sharing site BiliBili had the highest usage among respondents at 52.8 percent. This signals a significant shift: young users are migrating from mainstream platforms toward lesser-known, less-regulated digital spaces that offer more privacy and autonomy, away from the scrutiny of parents and other adults.
This change in digital behavior comes with increased risks, including the overexposure of personal information, as young users share details such as their location, age, or photos without understanding the consequences. Additionally, many bypass age restrictions using “alter” accounts, adult profiles, or borrowed devices, exposing them to inappropriate content, targeted ads, and online threats.
Economic barriers also contribute to this shift, as children and youth from lower-income backgrounds rely on free platforms with weaker safety measures, while their peers access safer, paid options.
In terms of safety, respondents identified TikTok, Facebook, and Messenger as relatively child-friendly due to their accessibility and ease of use. However, they also raised concerns about online risks, including exposure to harmful content, cyberbullying, and potential exploitation.
They flagged certain platforms as non-child-friendly, including online games with violent themes (Valorant, Call of Duty), messaging apps and chat services (Telegram, Discord, OmeTV), and livestreaming services (BigoLive).
Apps such as Roblox and Fortnite, among others, were considered unsafe due to explicit content, toxic behavior among their users, and the risk of online addiction. Additionally, mobile wallet apps and platforms with payment features (GCash, PayPal) were seen as unsuitable for children, with respondents highlighting concerns about scams and other security risks.
The assessment revealed distinct patterns in online usage across age groups and genders, highlighting the need for age-appropriate safeguards. Children ages 10 to 14 primarily used Roblox, Mobile Legends, Facebook, and YouTube, while those ages 15 to 17 were more active on social media and messaging apps for communication and leisure. Young adults ages 18 to 24 frequently engaged with Facebook, TikTok, and online shopping applications.
Online platforms likewise played a role in social interactions, with some young women ages 18 to 24 admitting to using them for dating. Alarmingly, the assessment found that children as young as 10 have been using dating and networking apps like OmeTV, Tinder, and Litmatch, which are primarily designed for adults.
To address these challenges, the assessment offers targeted recommendations to tech companies, including simplifying terms of service with video tutorials and multilingual options, implementing enforceable age-based content ratings (e.g., “Rated 13+”), and scaling up safety features on their platforms. It also calls for strengthened digital literacy programs targeting children, parents, and educators.
For the government, it recommends implementing and enforcing stricter regulations and monitoring mechanisms.