Facebook on Wednesday announced a ban on praise, support, and representation of white nationalism and white separatism.
The move by the company came weeks after a white supremacist terrorist killed 50 people in the New Zealand mosque massacre, which he live-streamed on Facebook.
Facebook has more than two billion users, but can the new ban make any difference to rising hate crime? Will it help to reduce violence?
Facebook has long banned white supremacist content under its rules on "hateful" content, but did not previously consider white nationalist or separatist content to be explicitly racist.
The company has been severely criticised by governments and users for failing to tackle the spread of hate speech.
Facebook said it had initially been wary of infringing on broader concepts of nationalism and separatism, which it said are "an important part of people's identity."
"But over the past three months our conversations with members of civil society and academics who are experts in race relations around the world have confirmed that white nationalism and separatism cannot be meaningfully separated from white supremacy and organised hate groups," Facebook said.
"Going forward, while people will still be able to demonstrate pride in their ethnic heritage, we will not tolerate praise or support for white nationalism and separatism," the company said.
Under the new policy, which takes effect next week, Facebook will add members to its content monitoring teams, which have already removed event pages used by white supremacists.
Facebook said it would also start connecting people who search for terms associated with white supremacy to an organisation called Life After Hate, focused on helping people leave hate groups.
Former US president Barack Obama's administration had awarded Life After Hate $400,000 in annual funding for its efforts to reduce violence, but President Donald Trump's administration cut the funding after he took office.
Although Facebook intends to remove posts supporting white nationalism and white separatism, some argue this may not be an easy task.
Apar Gupta, Executive Director of the Internet Freedom Foundation, told TRT World that it could be difficult for moderators to differentiate between conversations supporting white nationalism and those criticising it, as Facebook never reveals its standards for evaluating users' posts.
This is not the first time Facebook has taken action against violence linked to its platforms.
Facebook-owned WhatsApp in January limited the number of times a user in India could forward a message from 20 to five, after the spread of rumours on social media led to killings and lynching attempts.
The policy on white nationalism and white separatism will be enforced next week. Facebook said in a blog post that the change will apply to both its core Facebook app and Instagram.