The history of medicine in the United States encompasses a variety of approaches to health care, spanning from colonial days to the present. These interpretations of medicine range from early folk remedies rooted in various medical systems to the increasingly standardized and professionally managed care of modern biomedicine.
At the time settlers first came to the United States, the predominant medical system was humoral theory, the idea that diseases are caused by an imbalance of bodily fluids.[1] Settlers initially believed that they should use only medicines that fit within this system and were made out of "such things only as grown in England, they being most fit for English Bodies," as The English Physitian Enlarged, a medical handbook commonly owned by early settlers, put it.[2] However, as settlers faced new diseases and a scarcity of the plants and herbs typically used to make therapies in England, they increasingly turned to local flora and Native American remedies as alternatives to European medicine. The Native American medical system typically tied the administration of herbal treatments to rituals and prayers.[3] This inclusion of a different spiritual system was denounced by Europeans, particularly in the Spanish colonies, as part of the religious fervor associated with the Inquisition. Any Native American medical information that did not agree with humoral theory was deemed heretical by Spanish authorities, and tribal healers were condemned as witches.[4] In the English colonies, on the other hand, it was more common for settlers to seek medical help from Native American healers.[3]
Mortality was very high for new arrivals and for children in the colonial era.[5][6] The disease environment was hostile to European settlers, especially in the Southern colonies, where malaria was endemic and killed many newcomers. Children born in the New World gained some immunity—they experienced mild recurrent forms of malaria but survived. Among newly arrived able-bodied young men, over one-fourth of the Anglican missionaries died within five years of their arrival in the Carolinas.[7] Mortality was high for infants and small children, especially from diphtheria, yellow fever, and malaria. Most sick people turned to local healers and used folk remedies. Others relied upon minister-physicians, barber-surgeons, apothecaries, midwives, and ministers; a few used colonial physicians trained either in Britain or through an apprenticeship in the colonies. There was little government control, regulation of medical care, or attention to public health. By the 18th century, colonial physicians, following the models in England and Scotland, introduced modern medicine to the cities, which allowed some advances in vaccination, pathology, anatomy and pharmacology.[8]
There was a fundamental difference between the human infectious diseases present in the indigenous peoples and those carried by sailors and explorers from Europe and Africa. Some viruses, such as smallpox, have only human hosts and apparently had never occurred on the North American continent before 1492. The indigenous people lacked genetic resistance to such new infections and suffered overwhelming mortality when exposed to smallpox, measles, malaria, tuberculosis and other diseases. Much of this depopulation occurred years before European settlers arrived in the vicinity, resulting from contact with trappers.[9][10]
The French colonial city of New Orleans, Louisiana, opened two hospitals in the early 1700s. The first was the Royal Hospital, which opened in 1722 as a small military infirmary but grew in importance when the Ursuline Sisters took over its management in 1727 and made it a major hospital for the public, with a new and larger building constructed in 1734. The other was the Charity Hospital, staffed by many of the same people but established in 1736 as a supplement to the Royal Hospital, so that the poorer classes (who usually could not afford treatment at the Royal Hospital) had somewhere to go.[11]
In the British colonies, medicine was rudimentary for the first few generations, as few upper-class British physicians emigrated to the colonies. The first medical society was organized in Boston in 1735. In the 18th century, 117 Americans from wealthy families had graduated in medicine in Edinburgh, Scotland, but most physicians learned as apprentices in the colonies.[12] In Philadelphia, the Medical College of Philadelphia was founded in 1765, and became affiliated with the university in 1791. In New York, the medical department of King's College was established in 1767, and in 1770, awarded the first American M.D. degree.[13]
Smallpox inoculation was introduced between 1716 and 1766, well before it was accepted in Europe. The first medical schools were established in Philadelphia in 1765 and New York in 1768. The first American medical textbook appeared in 1775, though physicians had easy access to British textbooks. The first pharmacopoeia appeared in 1778.[14][15] The European populations had a historic exposure and partial immunity to smallpox, but the Native American populations did not, and their death rates were high enough for one epidemic to virtually destroy a small tribe.[16]
Physicians in port cities realized the need to quarantine sick sailors and passengers as soon as they arrived. Pest houses for them were established in Boston (1717), Philadelphia (1742), Charleston (1752) and New York (1757). The first general hospital was established in Philadelphia in 1752.[17][18]
In the 19th century the nation was flooded with medical sects promoting a wide range of alternative treatments for all ailments. The medical societies tried to impose licensing requirements in state law, but the eclectics undid their efforts. The most famous of the eclectics was Samuel Thomson (1769–1843), a self-educated New England farm boy who developed a wildly popular herbal medical system.[19] He founded the Friendly Botanic Societies in 1813 and wrote a manual detailing his new methods.[20] He promised that even the worst ailments could be cured without harsh treatments: no surgery, no deliberate bleeding, no powerful drugs. Disease, in his view, was a matter of maladjustment in the body's internal heat, and could be cured by applying certain herbs and medicinal plants, coupled with vomiting, enemas, and steam baths. Thomson's approach resonated with workers and farmers who distrusted the bloody hands of traditional physicians.[21] President Andrew Jackson endorsed the new fad, and Brigham Young promoted it to the new Mormon movement.[22]
Thomson was a master promoter. He patented his system and sold licenses to hundreds of field agents who gained patients during the cholera outbreaks of 1832 and 1834. By 1839, the movement claimed three million followers and was strongest in New England.[23]
However, in the 1840s it all fell apart. Thomson died in 1843, many patients grew worse after the treatment, and a bitter schism emerged among the Thomsonian agents. The result was the sect's sharp decline by 1860. Nevertheless, Thomson's influence can still be seen among people suspicious of modern medicine. Many herbs he popularized, such as cayenne pepper, lobelia, and goldenseal, remain widely used to this day in herbal healing routines. The Thomsonians had been briefly successful in blocking state laws that limited medical practice to licensed physicians. After the collapse, the MDs made a comeback and reimposed strict licensing laws on the practice of medicine.[24]
In the American Civil War (1861–65), as was typical of the 19th century, far more soldiers died of disease than in battle, and even larger numbers were temporarily incapacitated by wounds, disease and accidents.[25][27] Conditions were far worse in the Confederacy, where doctors, hospitals and medical supplies were in short supply.[26][28][29][30] The war had a dramatic long-term impact on American medicine, from new surgical techniques to the creation of many hospitals, to expanded nursing and to modern research facilities.
Medicine in the 1860s knew nothing of germs, and bad hygiene was tolerated. The risk was highest at the beginning of the war, when men who had seldom been far from home were brought together for training alongside thousands of strangers carrying unfamiliar germs. Men from rural areas were twice as likely to die from infectious diseases as soldiers from urban areas.[31] Recruits first encountered epidemics of the childhood diseases of chicken pox, mumps, whooping cough, and, especially, measles. Later the fatal disease environment included diarrhea, dysentery, typhoid fever, and malaria. Disease vectors were often unknown. Bullet wounds often led to gangrene, usually necessitating amputation before it became fatal. Surgeons used chloroform if available, whiskey otherwise.[32] Harsh weather, bad water, inadequate shelter in winter quarters, poor sanitation within the camps, and dirty camp hospitals took their toll.[33] This was a common scenario in wars from time immemorial, and conditions faced by the Confederate army were even worse, since the Union blockade sharply reduced medical supplies; adequate food, shoes and warm clothing were in very short supply.
The Union had money and responded by building 204 army hospitals with 137,000 beds, with doctors, nurses and staff as needed, as well as hospital ships and trains located close to the battlefields. Mortality in these hospitals was only 8 percent.[34] What was critical for the Union was the emergence of skilled, well-funded medical organizers who took proactive action, especially in the much enlarged United States Army Medical Department[35] and the United States Sanitary Commission, a new private agency.[36] Numerous other new agencies also targeted the medical and morale needs of soldiers, including the United States Christian Commission as well as smaller private agencies such as the Women's Central Association of Relief for Sick and Wounded in the Army (WCAR), founded in 1861 by Henry Whitney Bellows, and Dorothea Dix. Systematic funding appeals raised public consciousness, as well as millions of dollars. Many thousands of volunteers worked in the hospitals and rest homes, most famously the poet Walt Whitman. Frederick Law Olmsted, a famous landscape architect, was the highly efficient executive director of the Sanitary Commission.[37]
States could use their own tax money to support their troops, as Ohio did. Following the unexpected carnage at the battle of Shiloh in April 1862, the Ohio state government sent three steamboats to the scene as floating hospitals with doctors, nurses and medical supplies. The state fleet expanded to eleven hospital ships. The state also set up 12 local offices in main transportation nodes to help Ohio soldiers moving back and forth.[38] The U.S. Army learned many lessons, and in 1886 it established the Hospital Corps. The Sanitary Commission collected enormous amounts of statistical data, and opened up the problems of storing information for fast access and mechanically searching for data patterns. The pioneer was John Shaw Billings (1838–1913). A senior surgeon in the war, Billings built the Library of the Surgeon General's Office (now the National Library of Medicine), the centerpiece of modern medical information systems. Billings figured out how to mechanically analyze medical and demographic data by turning the information into numbers and punching it onto cardboard cards, a method developed by his assistant Herman Hollerith. This was the origin of the computer punch card system that dominated computing and statistical data manipulation until the 1970s.[39]
After 1870 the Nightingale model of professional training of nurses was widely copied. Linda Richards (1841–1930) studied in London and became the first professionally trained American nurse. She established nursing training programs in the United States and Japan, and created the first system for keeping individual medical records for hospitalized patients.[40]
After the American Revolution, the United States was slow to adopt advances in European medicine, but it adopted germ theory and science-based practices in the late 1800s as the medical education system changed.[41] Historian Elaine G. Breslaw describes the earlier post-colonial American medical schools as "diploma mills", and credits the large 1889 endowment of Johns Hopkins Hospital for giving it the ability to lead the transition to science-based medicine.[42] Johns Hopkins originated several modern organizational practices, including residency and rounds. In 1910 the Flexner Report was published, standardizing many aspects of medical education. The report was a book-length study of medical education that called for stricter standards based on the scientific approach used at universities such as Johns Hopkins.[43]
As Campbell (1984) shows, the nursing profession was transformed by World War II. Army and Navy nursing was highly attractive, and a larger proportion of nurses volunteered for service than of any other occupation in American society.[44][45]
The public image of the nurses was highly favorable during the war, as exemplified by such Hollywood films as Cry "Havoc", which made the selfless nurses heroes under enemy fire. Some nurses were captured by the Japanese,[46] but in practice they were kept out of harm's way, with the great majority stationed on the home front. The medical services were large operations, with over 600,000 soldiers, and ten enlisted men for every nurse. Nearly all the doctors were men, with women doctors allowed only to examine patients from the Women's Army Corps.[44]
In the colonial era, women played a major role in healthcare, especially as midwives and in childbirth. Local healers used herbal and folk remedies to treat friends and neighbors. Published housekeeping guides included instructions on medical care and the preparation of common remedies. Nursing was considered a female role.[47] Babies were delivered at home without the services of a physician well into the 20th century, making the midwife a central figure in healthcare.[48][49]
The professionalization of medicine, starting slowly in the early 19th century, included systematic efforts to minimize the role of untrained, uncertified women and to keep them out of new institutions such as hospitals and medical schools.[50]
In 1849 Elizabeth Blackwell (1821–1910), an immigrant from England, graduated from Geneva Medical College in New York at the head of her class and thus became the first female doctor in America. In 1857, she and her sister Emily, along with their colleague Marie Zakrzewska, founded the New York Infirmary for Women and Children, the first American hospital run by women and the first dedicated to serving women and children.[51] Blackwell viewed medicine as a means for social and moral reform, while a younger pioneer, Mary Putnam Jacobi (1842–1906), focused on curing disease. At a deeper level of disagreement, Blackwell felt that women would succeed in medicine because of their humane female values, whereas Jacobi believed that women should participate as the equals of men in all medical specialties.[52] In 1982, nephrologist Leah Lowenstein became the first woman dean of a co-educational medical school upon her appointment at Jefferson Medical College.[53]
Nursing became professionalized in the late 19th century, opening a new middle-class career for talented young women of all social backgrounds. The School of Nursing at Detroit's Harper Hospital, begun in 1884, was a national leader. Its graduates worked at the hospital and also in institutions, public health services, as private duty nurses, and volunteered for duty at military hospitals during the Spanish–American War and the two world wars.[54]
The major religious denominations were active in establishing hospitals in many cities. Several Catholic orders of nuns specialized in nursing roles. While most lay nurses married and left the profession, or became private-duty nurses in the homes and private hospital rooms of the wealthy, the Catholic sisters had lifetime careers in the hospitals. This stability sustained hospitals like St. Vincent's Hospital in New York, where nurses from the Sisters of Charity began their work in 1849; patients of all backgrounds were welcome, but most came from the low-income Catholic population.[55]