Myeisha Essex is in love with all things pop culture, thanks in large part to her hometown. This Los Angeles native has an encyclopedic knowledge of the entertainment industry, and she loves a good trivia game. She received her bachelor’s degree in journalism and media studies from Bennett College for Women and her master’s from the Columbia University Graduate School of Journalism. Her work can be found in Sister 2 Sister, Harlem World Magazine, Clutch and on Essence.com. When she’s not keeping up with the news or learning Beyoncé’s latest dance moves, she enjoys watching stand-up comedy on YouTube! Follow her on Instagram @more_about_me.
Twitter averages 58 million tweets per day, and if you’re one of its 600 million active registered users, you know a lot of that content contains something racially insensitive. Whether it’s the use of the N-word, a racist joke or good old-fashioned in-your-face bigotry, it’s no surprise that intolerant people love to hide behind their computer screens.
To show social media users the depth of the problem, U.K.-based think tank Demos conducted a study to count the number of racially insensitive tweets posted each day. According to its research, about 10,000 tweets, or one in every 15,000 posts, contain a “racist or ethnic slur.” That’s roughly 416 tweets each hour.
NBC News reports that “anywhere between 47.5 percent and 70 percent were ‘non-derogatory’ or used to ‘express in-group solidarity.’”
“That still leaves the tricky question of how many tweets would be considered, by most people, as racist. A program can’t parse the racist subtext of a tweet, and human analysts have different ideas of what counts as racism,” the news source points out. “Ultimately, the study found that one in 55,000 tweets (around 0.0018 percent) was indicative of racial prejudice. That includes up to 10 percent of the tweets that were considered ‘casual’ racial slurs — meaning they weren’t explicitly racist, but would probably be considered offensive by some people — and the estimated 100 tweets a day that threatened violence.”
In response, the report’s author said, “Even though racist, religious and ethnic slurs tend to be used in a non-derogatory way on Twitter, this does not mean that hate speech is not being used on this platform.”
“Language does not require the use of slurs in order to be hateful,” he concluded.
To check out the study in its entirety, click here.