"Anywhere soccer is played is a dump and should be torn down; soccer isn't a sport."
I would like a minute to discuss this quote.
I understand America will never embrace "soccer", and frankly, that fact doesn't bother anyone in football except FIFA, who want more money.
But why the hell is it such a big deal to Americans to diss sports that they don't like?
American football is mostly hated in this country, yet people would never say it "isn't a sport"; it's just one that isn't liked here. Basketball is another that is a minor thing over here, and baseball barely exists. Yet they are still always classified as sports.
I don't want to know the reasons why America won't accept these sports; I just want to know why some feel the need to openly vent their anger at them. (I know this isn't everyone, mind.)
All this does is make America look like a joke in the European sporting world. They call their sports finals the "World Series" or "world championships", then pretend the sports they are no good at don't exist.
The US is a superpower at athletics, but that isn't really a high-profile sport in Europe. In every high-profile sport here, Americans are seen as "easy" opposition. Yet America is always promoted as a sporting powerhouse, but only in sports that no one else competes in.
So I would love to know what Americans' views are on the issue. Why do you feel the need to bash sports that aren't your favourite? Why are American sports called world championships when only Americans take part? And why aren't American sports big in other countries?