Facebook Implementing New Safety Feature To Protect Kids

Facebook is taking big steps in its efforts to safeguard kids, find out how.

By Kristi Eckert | Published

Thanks to the courage of numerous whistleblowers combined with increased regulatory awareness, Facebook and platforms like it have come under greater scrutiny over the potentially negative effects that social media can have on individual users. Children, appropriately, have been at the forefront of many inquiries into these potentially hazardous effects. NPR reported that Facebook parent company Meta is now planning to roll out a parental control feature meant to prevent kids from inadvertently downloading virtual reality apps intended for older audiences on the Facebook-owned virtual reality device, the Oculus Quest.

Facebook’s new child-focused move comes alongside the realization that the Oculus Quest, although marketed toward a 13-and-older demographic, is also popular with younger kids. Because the Quest was developed for an older audience, much of the content available for it reflects that. A child using the device is therefore likely to come across content inappropriate for their age. Facebook’s new parental controls are meant to serve as a safeguard against a child accidentally being exposed to such content.

Facebook parent company Meta detailed that parents will be able to begin taking advantage of the new feature in April. When the parental controls roll out, parents will easily be able to place blocks and restrictions on apps that they don’t want their child to engage with. As an added resource, parents will also get a dashboard from which they can monitor their child’s activities, including keeping tabs on exactly what software their child has downloaded onto Facebook’s Oculus Quest.

Facebook didn’t leave parents of older children without resources, either. Its child-protecting measures extend to teenagers, too. Parents of teens can take solace in the fact that, starting in May, Facebook will automatically prevent teens from downloading content rated as too mature for them under the standard outlined by the International Age Rating Coalition.

Furthermore, Facebook’s child-protective efforts are not limited to its VR headset and the software made for it. The social media giant’s parent company is creating what it calls a “Family Hub,” a place where parents can monitor their child’s activities across all of Meta’s social media platforms. All eyes are on Instagram, in particular, in relation to this Family Hub rollout, largely because Instagram took center stage in the media after evidence surfaced suggesting it could play a role in young girls developing eating disorders. Overall, though, the hub is meant to be a resource parents can turn to for helpful strategies to use in conversations with their kids about proper internet engagement.

Considering how widespread social media use and the consumption of content on those platforms have become, it is reassuring to know that Facebook is beginning to take actions intended to safeguard the most vulnerable populations. There is certainly still a long way to go, but a step in the right direction is a step in the right direction nonetheless.