Twitter on Wednesday said that it will remove the coveted "verified" badge from users who violate its rules and post racist and hateful material on the micro-blogging site.

The company said it is now reviewing an unspecified number of verified accounts and will remove the badge from users found to have violated its new guidelines, which prohibit, among other things, tweets that promote hate or violence against people based on their race, sexual orientation or religion.


The announcement was made on the official @TwitterSupport account.


"Verification has long been perceived as an endorsement," Twitter said in a tweet. "We gave verified accounts visual prominence on the service which deepened this perception. We should have addressed this earlier but did not prioritize the work as we should have."

Twitter's decision to remove the verified badges, which are displayed as a blue check mark on users' profiles, comes amid criticism over the company's regulation of posts on its platform.

In recent days, the social media company has faced flak from critics for verifying the account of white supremacist Jason Kessler, who helped organise a high-profile white nationalist rally during which three people were killed amid widespread violence.

Twitter said the problem worsened after its July 2016 decision that allowed anyone to request a verified account.


"This perception became worse when we opened up verification for public submissions and verified people who we in no way endorse," it said in a tweet.

Twitter said it is working on "a new authentication and verification program," but gave no details on when that process will debut. In the meantime, it is not verifying any accounts.