Doctors Say Florida Health Insurance Mandate Is Necessary
Florida is one of the states that has filed a lawsuit against the Affordable Care Act, challenging the constitutionality of the mandate to obtain minimal health coverage. While some politicians seem to think they are experts on how to fix health care, what do physicians have to say about this?

In an article published in the Journal of the American Medical Association, physicians argue that the FL health insurance mandate, along with a similar mandate for every state, is the only way to expand health care access to the public and keep health care costs from rising. According to Dr. Edward Miller, dean and CEO of Johns Hopkins University School of Medicine, policymakers as well as the judiciary should never lose sight of the patients. The overall health and wellness of patients depends greatly on access to health care. Dr. Miller also says it is essential to straighten out the system, and the Florida health care mandate is one key to doing that.

Miller views the mandate as being critical t...