Am I being incredibly naive here? I don't understand how giving healthcare to people is seen as political. I understand that it costs money, but I also understand that a number of pharmaceutical companies have a vested interest in not changing things.
To me this is a basic human right. Or should people suffer and die so that already rich people can get richer?
When you read the stats (http://en.wikipedia.o..._in_the_United_States), you realise that all the stuff the "right wingers" spout, like having the best health care in the world, isn't true.
So socialised medicine is socialism? The land of the free, my a**e.