There is a saying or quote about what football means to each part of the country: "Football in the North is a ......, in the West it's a ....., in the East it's a ......, but in the South it's a religion!" Does anyone know the rest of the quote??? I originally heard it on an old episode of Designing Women. Thanks!
Southern Football
In the East, it's a cultural exchange
On the West Coast, it's a tourist attraction
In Texas, it's the big stakes
In the Midwest, it's a slugfest
But in the South, it's a religion
Roll Tide Roll
...this might not be what you're looking for, but I came across it on the web.