Twitter’s reply prompts for offensive tweets improved to detect friendly banter

Twitter started testing prompts last year to help people reconsider tweets that contain a potentially harmful or offensive reply. Twitter has since improved its prompt system and is rolling it out to Android and iOS users who have English set as their default language.

Twitter doesn’t stop users from tweeting what they want, but it shows a prompt asking them to review what they’ve written, highlighting the words that may be harmful. Twitter showed these prompts for replies containing insults, strong language, or hateful remarks. During its tests, however, Twitter found that some of these prompts were unnecessary — for example, when the flagged language was friendly banter between people who know each other.
