Twitter is working on a new feature that warns users to rethink offensive replies
Twitter is experimenting with a new feature that will warn users before they post abusive or harmful replies that could get reported.
“When things get heated, you may say things you don’t mean. To let you rethink a reply, we’re running a limited experiment on iOS with a prompt that gives you the option to revise your reply before it’s published if it uses language that could be harmful,” Twitter’s official Twitter Support account posted on Tuesday.
The feature will prompt users to revise their replies if the text they have typed uses language that could be deemed “harmful.”