Where are our conservative political analysts on the issue of health insurance?
The “Health Care Reform” does not have anything at all to do with health care; it has to do with insurance. By a simple twist of semantics, saying “health care” while actually meaning “health insurance,” we are all up in arms about the government taking control of the “health care industry.”
It is a fallacy; this is about insurance. Presently, with the exception of the Federal government’s prohibition on insurance companies selling products across state lines, the insurance industry is under the jurisdiction of the States. I will concede that most states have mismanaged that responsibility, succumbing to insurance lobbyists and to idiotic demands for everything from physicals to Viagra to be covered.
However, it is still legally the States’ responsibility, and it is this State responsibility that the federal government is usurping.
And not a peep from a single conservative pundit or analyst.
Can anyone explain why?