As the US election approaches, Facebook announces a crackdown on deepfake videos

Category: Internet
On January 7, Reuters reported that Facebook had announced it would remove deepfake videos (i.e., face-swapping videos) and other artificially manipulated media to curb the spread of false information ahead of the US election. Videos intended as parody or satire, however, will not be removed.

Facebook announced the policy on its company blog on January 6. It said video manipulation can be done with simple tools such as Photoshop or with more sophisticated ones built on artificial intelligence and deep learning, the latter typically producing deepfakes, that is, face-swapping videos.

In the post, Facebook described its approach to deepfakes, which includes investigating AI-generated content and deceptive behavior such as fake accounts, and working with academia, government and industry.

Specifically, a video will be removed if it meets two criteria: first, it has been edited in a way that would mislead viewers into thinking a person in the video said words they did not actually say; second, it is the product of artificial intelligence or machine learning that merges, replaces or superimposes content onto the video, making it appear authentic.

Facebook said that, in line with its existing policies, it would also remove audio, photos or videos from the platform if they violate its community standards, including those covering nudity, graphic violence and hate speech.

Videos that do not meet the removal criteria can still be reviewed by independent third-party fact-checkers, and Facebook will reduce their distribution in the News Feed if the photos or videos are rated false or partly false.

Facebook said it is also working on identifying manipulated content, of which deepfakes are the most challenging type. In September 2019, Facebook launched the Deepfake Detection Challenge to encourage researchers around the world to study deepfake detection and build more open-source tools.

The Wall Street Journal noted that with the help of deep learning and artificial intelligence, computer-generated videos are becoming increasingly lifelike and ever harder to detect. These face-swapping videos can be amusing, but they can also have devastating consequences for people's lives. Ahead of this year's US presidential election, social media companies are under growing pressure to rid their platforms of false or misleading content.

Reuters noted that Facebook has long been criticized by politicians over its content policies: Democrats fault its refusal to fact-check political advertising, while Republicans accuse it of bias against conservatives.

Facebook's move to remove face-swapping videos has also been criticized for its loopholes. Facebook said the ban on deepfakes would not extend to parody or satire, or to videos edited solely to omit words or change their order. According to the Wall Street Journal, this leaves the company having to decide which videos count as satire and which do not.

One example is an edited video of Nancy Pelosi, the Speaker of the US House of Representatives, that was not removed. According to the Wall Street Journal, the video was widely shared last year; it had been slowed down and its pitch altered, making her speech sound slurred.

Facebook said the video did not qualify as a deepfake because it was made with conventional editing, but the company still reduced its distribution because it had been deliberately manipulated. That decision was widely criticized, and Facebook's new policy remains controversial.

Source: The Paper (Pengpai News); editor in charge: Qiao Junjing