The right to health was first formally recognized in the Constitution of the World Health Organization (WHO), adopted in 1946. According to the WHO, the enjoyment of the highest attainable standard of health is a fundamental right of every human being without distinction of race, political belief, or religion. The debate over whether health care is a right or a privilege has been going on for decades. Here is a look at the definition of health care and health care providers, and at whether health care is a right or a privilege.
Health Care and Health Care Providers
Health care is the provision of medical care to individuals and communities. Health care careers are not limited to doctors and nurses; they also include chiropractors, administrators, therapists, and technology professionals.
According to federal regulations, a health care provider is a doctor of medicine, dentist, podiatrist, clinical psychologist, nurse, or other medical professional authorized to practice by the state and working according to the standards laid out by state law. A health care provider is, therefore, anyone who is legally permitted to administer health care to patients.
Is Health Care a Right or a Privilege?
The right to health care is internationally recognized, but that does not mean it is enforced worldwide. In the U.S., health care was included in the Second Bill of Rights proposed by President Franklin Delano Roosevelt in 1944. His wife, Eleanor Roosevelt, later championed this work at the United Nations, which led to the recognition of medical care as part of a human right in the Universal Declaration of Human Rights (UDHR) of 1948.
After the adoption of the UDHR, most industrialized countries implemented universal health care programs to make sure their citizens enjoyed the right to health. The 2015 U.S. report to the UN, however, did not acknowledge health care as a human right. The U.S. does not have a universal health care system, only a patchwork of health insurance programs such as Medicare and coverage under the Affordable Care Act. It is, therefore, reasonable to claim that in the U.S. health care is more of a privilege than a right.
The Houston Healthcare Initiative is a group of health-conscious patients and physicians who have come together to maintain and improve the health of each member of their group. The goal of HHI is to change the way Americans pay for and receive medical care. If you want to learn about the state of health care in the country, or to help change the health care system in the U.S., the Houston Healthcare Initiative has got your back.