OT: Worldview of USA?
I've just watched a political documentary that I won't name, since this is an election year here in the US and I don't want to start a political argument. But the film discussed the worldwide influence of the United States.
In my opinion, I would prefer the USA to pull out of the public eye somewhat. We send millions of dollars to different countries, some of which seem to hate the US. We have troops in Afghanistan and Iraq, both of which seem not to like us. I think we (the U.S.) should pull out of countries where we are not invited, and maybe even negotiate treaties so that the countries that ask us to be there pay us in either money or trade status. I also think the US should stop sending money to other nations until we settle our own debt and have a surplus to give away. I'm not saying the US should put its head in the sand or be xenophobic, but it should withdraw its influence from countries where we're not welcome. The film I watched implied that a lot of nations felt they were being "colonized" by America, its capitalism, and its way of life. What are some of the other members' opinions here?
__________________
Just because I'm on the side of angels doesn't mean I am one.