Research: Facebook's friend recommendations help extremists build terrorist networks



Social media is often praised for helping people connect with one another, but is that always a good thing? According to The Daily Telegraph on May 5, a new study shows that Facebook, the well-known social networking site, has been linking Islamic State extremists to one another through its friend-recommendation feature.

The Daily Telegraph reported that the Counter Extremism Project, a non-profit organization based in New York and London, will release a detailed report in May arguing that Facebook's "suggested friends" feature is helping extremists find one another. The researchers analyzed the Facebook activity of 1,000 Islamic State supporters in 96 countries and found that users who sympathized with extremism were introduced to other extremists by the site's recommendations, allowing them to build new terrorist networks and even recruit new members.

Facebook uses complex algorithms to connect people with common interests. It automatically collects large amounts of user data, uses it for precisely targeted advertising, and recommends other users who may be like-minded. In the absence of constraints, however, terrorists can also use this feature to find other supporters.

One of the report's authors, Gregory Waters, said that after adding an active extremist user as a friend on Facebook, he received a large number of recommendations for users who support the Islamic State. Another researcher, Robert Postings, said he clicked on several news stories about Philippine extremists on Facebook; the stories themselves had no extremist slant, yet within a few hours he had received dozens of friend recommendations for Philippine extremists.

Their report says that once these users are connected, Facebook's weak monitoring of extremist content on the site allows extremists to quickly reach vulnerable target populations. For example, in March 2017 an Indonesian supporter of the Islamic State sent a friend request to a user in New York. After the two chatted, the American user said he was not religious but was interested in Islam. Over the following months the Indonesian user sent increasingly radical messages and propaganda links, and his American friend "liked" them. Within six months the American user had also been radicalized and had begun supporting terrorist organizations.

Postings said that it is essential to remove extremist content and incitement to attack, but that Facebook's failure to ban most of these users outright is particularly worrying. Even when accounts are closed, the site reacts so slowly that extremist content can still spread widely.

J.M. Berger, a researcher on extremism quoted by the American broadcaster CBS on May 6, said people have been aware of this problem with Facebook for some time; it is cause for concern but needs further analysis. Berger said the online environment for extremists is already much harsher than it was a few years ago. He added that if an extremist supporter posts no banned content on social networks, he will not be found; but if that user is part of a network of supporters of an extremist organization, Facebook's algorithm will recommend that he connect with all of the extremists in it as long as he is friends with just one of them.
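The mechanism Berger describes is essentially friend-of-friend recommendation over the social graph. The Python sketch below is only an illustrative assumption, not Facebook's actual system: it ranks non-friends by the number of mutual friends, which is already enough to show why a single link into a tightly connected cluster surfaces the rest of that cluster as suggestions.

```python
# Minimal sketch of a mutual-friends ("people you may know") recommender.
# Illustrative only: ranks candidates by how many friends they share with
# the user, the simplest form of friend-of-friend recommendation.
from collections import Counter


def suggest_friends(graph, user, top_n=10):
    """graph: dict mapping each user to a set of their friends."""
    friends = graph.get(user, set())
    mutual_counts = Counter()
    for friend in friends:
        for candidate in graph.get(friend, set()):
            if candidate != user and candidate not in friends:
                mutual_counts[candidate] += 1  # one shared friend found
    return [name for name, _ in mutual_counts.most_common(top_n)]


# Toy example: a densely connected cluster plus one outside user who
# befriends a single member of it.
cluster = {"a", "b", "c", "d"}
graph = {member: cluster - {member} for member in cluster}
graph["outsider"] = {"a"}
graph["a"].add("outsider")

# One connection into the cluster is enough to suggest everyone else in it.
print(sorted(suggest_friends(graph, "outsider")))  # ['b', 'c', 'd']
```

Under this simple heuristic, the recommendations mirror the structure of the existing network, which is the dynamic the researchers say extremist clusters exploit.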
Regulation has since been tightened, but it cannot guarantee that extremist content is removed 100 percent of the time. A Facebook spokesperson told The Daily Telegraph that terrorists have no place on Facebook: the company works aggressively to ensure that no terrorists or terrorist organizations use the site, and it also removes any content that praises or supports terrorism. The spokesperson added that 99 percent of the Islamic State and Al Qaeda-related extremist content Facebook now removes is detected by its automated systems, but that there is no simple technical solution to online extremism, and Facebook will continue to invest more money in both human reviewers and technology to find and remove terrorist content.