People have cared for the sick and injured and sought cures for illness since the earliest days of humanity. At that time most care was provided by family, members of an individual’s tribe, or shamans and priests. Although healers could sometimes devise effective remedies, few diseases were understood, and sickness was often attributed to supernatural causes. As civilization progressed, the practice of medical care became ritualized. In ancient Greece, for example, people visited temples dedicated to the healer-god Asclepius, where they obtained medical advice and were even treated surgically for conditions such as abdominal abscesses. The first medical school in Greece opened around 700 B.C. Facilities to care for the sick and injured were also founded in ancient India and other regions. Physicians, as well as workers similar to modern-day nurses and nurse assistants, cared for the ill. Managers probably oversaw the work of these caregivers and the overall operations of the facilities.
During the Middle Ages, more schools pursued the study of medicine, and by the 13th century, numerous hospitals had been founded by religious communities. They cared for the sick and injured, but they sometimes also housed poor people and religious pilgrims. The first hospital in the New World was the Hospital of St. Nicholas of Bari, founded in 1494 in what is now the Dominican Republic. Records suggest that 40 injured Spaniards were treated there after the Indian uprising of that same year. The Pennsylvania Hospital was the first hospital in the United States; it was established in Philadelphia in 1751 by Benjamin Franklin and Dr. Thomas Bond. As the practice of medicine became more sophisticated and better organized, a need emerged for dedicated management to administer medical institutions.
The development of modern health care management has largely mirrored the advancement of hospitals and the overall field of health care. Until the late 1800s, most people who were admitted to hospitals were poor and didn’t have family to care for them. Wealthy people typically paid for care provided in their homes. In many instances, hospitals could do little for patients. That changed in the late 1800s and early 1900s. Medical breakthroughs such as anesthesia, antisepsis, antibiotics, improved surgical techniques, and new imaging methods allowed hospitals to more effectively diagnose and treat patients.
The number of hospitals grew quickly. Careers in Healthcare Management (Health Administration Press, 2002) reports that “between 1875 and 1925, the number of hospitals in the United States grew from just over 170 to about 7,000, and hospital beds increased from 35,000 to 860,000.” This rapid growth in the number of hospitals created demand for hospital administrators, who were also referred to as hospital superintendents. Physicians, nurses, workers in other fields, and Roman Catholic sisters were appointed to the position of hospital administrator with little or no special training.
The earliest recognition of hospital administration as a separate profession came in 1898 when the Association of Hospital Superintendents was organized. This group, whose membership today includes nearly all the hospitals in the United States, is now called the American Hospital Association. In 1933, the American College of Hospital Administrators (now the American College of Healthcare Executives) was founded to increase the standards of practice and education in the field.
Educational programs were developed to help professionalize the field. The first formal training program in health economics for nurses who worked as superintendents was established at Columbia Teachers College in New York in 1900. The first graduate program in hospital administration was founded at the University of Chicago in 1934. Since then, bachelor’s and master’s degrees in health care administration and related fields have become common offerings at many universities and other schools.
Many laws have also been passed that promoted the expansion of the health care industry in the United States. Most recently, the Patient Protection and Affordable Care Act of 2010 (often referred to as the Affordable Care Act, or ACA) expanded health insurance coverage to more than 20 million Americans and increased demand for health care professionals, including health care managers.
Technology is changing the health care industry by improving diagnostics and therapies and by allowing better communication between health care professionals and patients. One major technologically driven trend is the transition of health care records from paper to digital format, or electronic medical records. This process, along with the emergence of medical informatics (the mathematical analysis of patient information to improve health outcomes), is creating demand for qualified managers. Data analytics software is increasingly being used to assess worker performance, evaluate patient treatment outcomes, and serve many other purposes. Telemedicine is allowing medical professionals to consult with and treat patients in their homes and other remote locations. The COVID-19 pandemic, which began in late 2019, increased demand for telemedicine services. Forty-eight percent of physicians surveyed by Merritt Hawkins and The Physicians Foundation in April 2020 reported using telehealth to treat patients—up from only 18 percent in 2018.
- Adult Day Care Coordinators
- Business Managers
- Cancer Registrars
- Clinic Managers
- Clinical Data Managers
- Clinical Research Coordinators
- Contact Tracers
- Geriatric Care Managers
- Geriatric Social Workers
- Health Advocates
- Health Care Consultants
- Health Care Insurance Navigators
- Health Care Managers
- HIV/AIDS Counselors and Case Managers
- Informatics Nurse Specialists
- Medical Ethicists
- Medical Record Technicians
- Medical Secretaries
- Medical Transcriptionists
- Nurse Managers
- Nursing Home Administrators
- Rehabilitation Counselors
- Social Workers
- Transplant Coordinators