Posted September 26, 2018 08:15:50 An investigation by BuzzFeed News into videos uploaded over the past year has uncovered content that is inappropriate, harmful, or potentially harmful to children, posted by adults who may be using the Internet to exploit minors.
The investigation found videos of exploited mothers, exploited teens, and others — videos that have since been deleted but were re-uploaded to YouTube by adults, in some cases using browser extensions.
BuzzFeed News identified approximately 800 videos that are potentially harmful, unsafe, or illegal to view, and that could be used by child-exploitation websites, child pornography sites, or adult websites to lure children into sex acts.
The videos BuzzFeed News uncovered include content that could be sexually explicit, disturbing, abusive, or exploitative.
BuzzFeed is publishing this story to raise awareness about how these videos can be harmful to kids.
“We’ve been tracking these videos for months and we’ve found that at least 70 percent of them are being uploaded to adult websites, but we don’t know if the sites are actually taking down the content or whether they’re just uploading it on purpose,” said Emma Giddens, senior director of digital media and technology at the Center for Digital Democracy, a nonprofit that works to make the Internet safer for children and teens.
Giddens and BuzzFeed News conducted a search of YouTube and YouTube’s Content ID feature to see which of the more than 300 million videos uploaded in the last year had been removed.
They found that a majority of the affected videos have not been removed from YouTube, despite concerns about their content.
The majority of these videos are being posted by individuals who may not be authorized to upload them.
“This is a dangerous area, and it’s just becoming more so with every day,” Giddens said.
“These videos are very similar to other content that people have been posting on YouTube.
There are a number of reasons why people might post videos, but it could be a porn video, it could also be a prank video or it could just be an adult video.”
Many of the people who upload these videos do so without knowledge of their content, and the uploads themselves are not illegal, according to YouTube’s Content ID policies.
Some of these websites have posted disclaimers, including one from Disney-owned Maker Studios stating that the content is not authorized or endorsed by Maker Studios.
The videos are posted by a handful of people who have been flagged for violations of child protection laws.
Some have had their videos removed or have had the videos removed for violations related to child exploitation.
Others have posted their videos to private channels that are accessible only to people who are authorized to view them.
In some cases, these channels have already been taken down.
YouTube also allows people to flag content for removal, but in many cases flagged videos have not been taken down because they were not deemed to depict child exploitation or anything illegal.
BuzzFeed’s investigation has been funded by the Center on Wrongful Deception, a non-profit advocacy organization that works against child exploitation on YouTube and other platforms.
In its report, the center said that “some of the accounts that appear to have uploaded videos to these channels are not legitimate and do not belong on YouTube.”
BuzzFeed News also spoke with several parents who said their kids have been the target of exploitation videos, and who told BuzzFeed News that the videos are usually made by people who want to share the content anonymously.
BuzzFeed has reached out to Maker Studios for comment and will update this story if and when we hear back.
The Center on Wrongful Deception also told BuzzFeed that it has contacted YouTube to make sure these videos and associated content are removed from the platform.
BuzzFeed reached out on Tuesday to YouTube, Maker Studios, and Maker Studios’ parent company, Disney, but has not received a response.
The center also contacted YouTube’s Content ID team and has not heard back.
“YouTube has done a good job removing these videos, and we’re glad that we’re getting this information out there, but there’s still a lot more work to do,” Gynne Besser, director of public policy for Maker Studios, told BuzzFeed.
“What we want YouTube to do is take care of the most common harmful content, not just the videos that are most harmful to kids.”
In the report, BuzzFeed News identified several videos that appeared to have been removed for violating child protection laws.
In one of the cases that we identified, the videos included explicit depictions of child sexual abuse.
In other cases, a video featured an adult holding a child who was being raped. In all of