Are there ANY benefits to America that have been brought about by “woke culture”?

ATLANTA – Let’s think about this for a moment: Has America gained any benefit from woke culture?

While on the surface that may seem a ludicrous question to ask, objectively speaking, woke culture has done a pretty good job of exposing racism…from the left!

Today, Janelle King examines how woke culture has had the unintended benefit of exposing the bigotry and racism of the very people who pushed wokeness on society.

Let’s talk about it! Smash that play button!
