In a tragic and deeply concerning incident, children aged between 7 and 12 in Muzaffarpur, Bihar, were trying to make a bomb after watching a YouTube video when it accidentally exploded, injuring five of them.
The incident has sparked widespread concern and has gone viral on social media.
These children from BIHAR were making Bombs,learning via Youtube. while making it exploded but only caused minor injuries to them they are safe Now..@YouTube should not encourage such videos that harm people. pic.twitter.com/vNV9Dgeknb
— Ashwin.. (@EvilTweetX) August 8, 2024
Local authorities have emphasized the need for parents to monitor their children's online activities more closely, and have highlighted the importance of educating children about the dangers of attempting to replicate hazardous activities seen online. The incident underscores the critical need for better digital literacy and safety measures, and for ensuring that children are aware of what they are watching and learning on the internet.
Social media platforms can play a crucial role in preventing incidents like the one in Muzaffarpur by implementing several key measures. Chief among these is proactive monitoring, in which video platforms use advanced algorithms and AI to detect and remove harmful content, such as videos showing how to make dangerous items like bombs.
These platforms must also encourage users to report inappropriate or dangerous content and ensure that reported content is reviewed and acted upon promptly.
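To make the idea concrete, here is a minimal sketch of how such a triage pipeline could be structured, combining automated scoring with a human-review queue. This is purely illustrative Python: the `risk_score` heuristic, the thresholds, and the data types are assumptions made for the example, not a description of YouTube's actual systems.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative thresholds; a real platform would tune these per policy area.
AUTO_REMOVE_THRESHOLD = 0.9   # near-certain violations are removed automatically
HUMAN_REVIEW_THRESHOLD = 0.4  # uncertain cases go to trained human reviewers

@dataclass
class Video:
    video_id: str
    title: str

@dataclass
class ModerationQueues:
    removed: List[Video] = field(default_factory=list)
    human_review: List[Video] = field(default_factory=list)

def risk_score(video: Video) -> float:
    """Stand-in for an ML classifier scoring how likely a video is to
    show a harmful or dangerous act (0.0 = safe, 1.0 = violating).
    A real system would use trained models, not keyword matching."""
    dangerous_terms = ("bomb", "explosive", "detonator")
    hits = sum(term in video.title.lower() for term in dangerous_terms)
    return min(1.0, 0.5 * hits)

def triage(video: Video, queues: ModerationQueues) -> str:
    """Route a video: auto-remove clear violations, queue borderline
    cases for human review, and leave everything else up."""
    score = risk_score(video)
    if score >= AUTO_REMOVE_THRESHOLD:
        queues.removed.append(video)
        return "removed"
    if score >= HUMAN_REVIEW_THRESHOLD:
        queues.human_review.append(video)
        return "human_review"
    return "allowed"

if __name__ == "__main__":
    queues = ModerationQueues()
    print(triage(Video("v1", "How to make a bomb with a detonator"), queues))  # removed
    print(triage(Video("v2", "Bomb defusal scene breakdown"), queues))         # human_review
    print(triage(Video("v3", "Baking a birthday cake"), queues))               # allowed
```

The design point the sketch captures is that automation alone is not enough: clear-cut violations can be removed instantly, while ambiguous content is escalated to human reviewers, which mirrors the balance between automated flagging and human judgment described later in this article.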
Parents, too, should keep an eye on what their children watch online. They can use parental control settings on devices and on platforms like YouTube to restrict access to inappropriate content. Parents should also teach children about the dangers of attempting to replicate risky activities seen online, and encourage open communication so that children feel comfortable discussing what they watch.
Need for School and Community Programs:
Safety Workshops: Schools and community centers can organize workshops to educate children about online safety and the potential risks of imitating dangerous activities.
Digital Literacy: Incorporate digital literacy into the curriculum to help children understand how to use the internet safely and responsibly.
Flagging Dangerous Content
From April to June 2024, YouTube received over 23 million flags from human users, indicating a high level of community engagement in reporting inappropriate content. The most common reasons for flagging videos include spam or misleading content, hateful or abusive content, sexual content, violent or repulsive content, and harmful or dangerous acts.
A significant portion of flagged content is reviewed and acted upon. For instance, from October to December 2017, 75.9% of all automatically flagged videos were removed before they received any views.
Improvements in automated flagging systems have also helped detect and review content even before it is flagged by the community. More than 80% of auto-flagged videos were removed before receiving a single view in the second quarter of 2019.
YouTube reportedly employs trained human reviewers who evaluate flagged content around the clock to verify whether it violates the Community Guidelines before action is taken. This helps maintain a balance between automated systems and human judgment.
If both parents and social media platforms take the required steps, a safer online environment can be created, helping to prevent such tragic incidents.