What happened to the United States?
Words hold little meaning when actions differ sharply from what is spoken or written. One can still hear America called the Land of the Free and the Home of the Brave, but do these words accurately describe the United States of America as it is today?
While our rights are being whittled down toward non-existence, can Americans still claim to live in a free country? Calling the sky purple does not make it so, and declaring a nation free does not make it free!
It appears that the American public is content to live in a constant state of denial, a complete refusal to accept reality. The fact is that America has been transformed: a once magnificent republic is becoming more authoritarian and tyrannical on what seems to be a daily basis. The United States of America is no longer the bastion of freedom it once was. Truly heartbreaking!