Is America an imperialist nation? I would love to debate you lefties on this issue, since I believe many of you would answer that question YES.
Please make your case.
Guess it would depend on how you are defining "imperialism", or whether you mean "colonialism". Does your question relate to internal struggles within the US, or are you referring to a global strategy?
To the former, an internal struggle, the answer could be, and probably is, yes. One definition of imperialism, from The Dictionary of Human Geography, is "the creation and maintenance of an unequal economic, cultural and territorial relationship, usually between states and often in the form of an empire, based on domination and subordination."
This could be true if you are referring to the power of government over its people, which we have seen in the US. It could also be said to be true if one is referring to corporate control over government and the growth of corporatism. This is the basis for the economic disparity we see growing today, and the widening gap in control of wealth.
On a global scale, the US has not shown "imperialistic" tendencies, and it could be said that China is far more "imperialistic" than most nations, especially economically, and is gaining military strength due to the disparity in trade.