Extra

Links

phil
Interesting
That first article makes me wonder why our society of today looks down upon the practice of forcibly taking over another nation.

For hundreds, or even thousands of years, countries have been "conquering" weaker nations or nations that they disagree with. Only recently has the general opinion been that to militarily dominate another country is wrong.

Why is that? Who is to say whether or not the world would be a better place if the United States controlled Iraq? Perhaps it wouldn't be, at least for the rest of the world.

My point is, I'm a citizen of the United States, and maybe it's in *my* best interest if we bomb the hell out of Saddam and then move on in.

Now, don't get me wrong, I'm not in favor of getting into a pointless war (which is really what this is looking like), but why can't I support our country expanding its influence? Is that wrong? What would the Roman senate have thought? What about the various factions in the Middle Ages? Why the change now?

As I write this, I wonder if it has to do with the fact that recent generations have become more and more secular in their thinking. Almost all previous large-scale conflicts have been a result of religious differences. Have we as a society as a whole woken from our opiate dream?

Wow, look at me, waxing all philosophical. =P
loophole
well...

I can see your point, I suppose. The problem in my mind is that there is a cost to conquest. In the past, perhaps, people were willing to pay those prices. Now, in the atomic age (pardon the anachronism), the stakes are potentially much higher. Is it worth the risk that destabilizing the Middle East might result in Israel using nuclear weapons? (It's a stretch, I know.)

I would like to think, also, that we have evolved past the need for expansionism and conquest. I guess that's naive, though.

I'll have to think about it some more.