My own recent hospital stay makes me wonder, yet again, how people survive in the United States without health insurance.
As I said: every American who is not living in extreme right-wing denial knows that money matters more to insurance companies than the actual health of individual human beings.
No wonder the rest of the world is appalled by how America "cares" for her people.
Monday, April 26, 2010