First things first: I know there are many bothered white people who think they are entitled to everything on this earth, including people, but they are wrong. Those actions are starting to backfire on the U.S. Everyone loves the United States, but recently it has become more than evident that the racism they claim doesn't exist is very much present. With that being said, that is not a good look for tourists.
Just got wind that the people of the Bahamas are not in agreement with all the headlines of racial police brutality. They have officially issued a travel advisory for their citizens traveling to the United States. For anyone who doesn't know what that means: because of the racial tension that has been in the news recently, they warned their people not to go to the United States until it is taken care of.
Now, I think this is actually a great thing, because if more countries do the same, then these ignorant people will realize how much we need tourists and "immigrants." I'm so tired of hearing people talk about "real America" as if that means white America, because contrary to what is being portrayed, that is a lie.
Never forget that America was built on the backs of slaves and Native Americans. That is and always was the ONLY "real America," if we want to be honest here. Know the facts!