Like 99% of Americans, the only time I give a shit about soccer is every four years when we play in the World Cup. You've read and seen the script before. We win a couple of games we weren't supposed to, the country gets excited with some kind of national pride, and the talking heads all assure and reassure us that soccer has finally arrived and will be a major player on the sports landscape. Except it never happens.
Fast forward to the last couple of days and for some reason, the United States feels the need to interject itself into a sport we have no business interjecting ourselves in. Go to ESPN, FOX, or CBS and it's the lead story on their websites. It would be like Italy indicting Roger Goodell and the NFL. I realize soccer is a big deal in most of the world, but does anyone share my opinion that we should leave all of the indictments and subpoenas to the countries that actually care about this?