The Care of Strangers: The Economic Story of Hospitals & Hospitality ~ Chapter Two

by faithgibson on September 3, 2015

Chapter 2: The Great Divide – Pre and Post Germ Theory of Infectious Disease

The previous chapter identified the historical role of hospitals as social institutions that provided charity care and the contemporary role of hospitals as temples of high-tech therapy. The model of the first was pretty simple – the care was free, but of virtually no therapeutic value, except for setting broken bones, stitching wounds, and providing palliative care, including opium-derived pain relief.

Unlike the inexpensive but ineffective care available before the mid-20th century, hospitalization in the 21st century is the single most expensive aspect of medical care; but the good news is that its therapeutic success rate is very high.

Unfortunately, this only applies to people with adequate economic resources — comprehensive health insurance, personal wealth, or incomes low enough to qualify for Medicare-Medicaid. Public health officials have calculated that about 40,000 uninsured Americans fall through this wide chasm each year with fatal consequences. Many more suffer needlessly with painful or chronic conditions that create personal hardships, reduce productivity and are a burden on other government services.

How scientific discoveries transformed custodial care into 'modern' medicine, and replaced revenue-neutral charity with a trillion-dollar economic system in the US, is both interesting and instructive. In the last 130 years, two additional elements were incorporated into the classic function of hospitals. The hospitable care of strangers was expanded to include the ability to 'heal' – i.e., modern therapies made possible by new technology and professionally-trained staff. No longer were prayers or incantations the only hope; nor did patients have to choose options described as "the cure was worse than the disease".

The third new element was the 'fee-for-service' business model adopted by hospitals in America. For better or for worse, this turned out to be a shotgun wedding between medicine and the law, as reflected in the litigious aspect of modern hospital care. Traditionally, medical liability did not apply to the non-therapeutic form of 'hospitality' provided without charge to patients in the custody of church or government institutions. The ability of medical science to offer effective therapy and its promise of therapeutic excellence (safety and success) are two sides of the same coin. The legal responsibility of physicians and institutions is to provide the 'standard' treatment, including diagnostic procedures, surgery, prescription drugs, medical devices and other forms of medical advice. This is an implied contract between the physician or institution and the patient, involving a prior promise of benefit and freedom from harm in exchange for the agreed-upon professional fee. It carries an implied guarantee of a 'good outcome' and is accompanied by the right to litigate if such an outcome is not forthcoming.

 Hospitals and the Care of Strangers

Modern hospital care as we know it today is the product of complex social, scientific, legal and economic changes that started in the late 19th century – a historical period I refer to as 'The Great Divide'. This refers to the era just before and after the discovery of micro-organisms, in particular bacteria and other pathogens. While this era of history is unfamiliar to most people, it is the single most important advance in the history of biological science, and it functionally splits the story of human health into two very uneven halves demarcated by the discovery of the germ theory of infectious and contagious diseases. In combination with the social and political influences of that era, these great scientific discoveries produced the unique economic system of 20th-century American hospitals. To understand how this came about, we have to go back even further and look at the history of health and healing as it developed through the centuries.

Before we can appreciate the personal, practical, day-by-day experience of hospital patients and their doctors, and the economic factors that underlie these transactions, we have to return to the original configuration of hospitals, which were mainly charity institutions run by religious orders or the government. True to their name, charity hospitals dispensed hospitality to the poorest of the poor who also had the misfortune to be sick or dying, as well as providing safe haven to homeless pregnant women and foundling infants. But the pitiful circumstances of these unlucky souls did not stop with the simple misfortune of being ill and indigent. Aside from setting broken bones and stitching up a bad cut, hospitals during this pre-scientific phase were unable to offer any other effective medical treatments. When it came to actually treating health problems, repairing damage caused by accidents and curing human disease, nothing about this preliminary "medical" system was up to the job. The ability to entirely eliminate a disease was associated with miracles and answered prayers, not anything that doctors did.


How Science Finally Triumphed Over Superstition

The strange idea that tiny invisible creatures were the cause of illness was first mentioned in far-distant antiquity (400 BCE). However, this was more of a philosophical claim than a scientific one, as it was not accompanied by any scientific method of investigation that could have established the hypothesis to be either true or false. Nonetheless, the mere idea was enough to elicit instant opposition, as physicians and other people found it impossible to believe that tiny invisible organisms would be able to kill larger organisms such as humans and animals. It would take more than two millennia to relieve humanity of this prejudice.

The key elements of what we think of as "modern medical science" were the result of watershed discoveries that occurred between 1840 and 1940. During this time, researchers in biological science laid to rest the 2,000-year-old theory of "spontaneous generation", the mistaken idea that mysterious living organisms capable of causing sickness and death could magically arise from within a person or animal. It was not until 1881 that microscopic organisms were identified as responsible for contagious and infectious diseases.

Understanding that bacteria and other pathogens were the true origins of infection dramatically divided thinking about disease and human misfortune into two contrasting styles. This great leap in biological science was the first of many revolutionary changes that ushered in what we now think of as 'modern medicine'. Replacing old ways of thinking with the new discoveries of biological science gave us our current form of bio-medicine and its many 'miracles'. This eventually included the discovery of anti-microbial drugs such as the sulfa drugs and penicillin, as well as state-of-the-art surgeries such as organ and face transplants and the reattachment of severed limbs. The germ theory of disease also forms the basis for modern ideas of public health — food safety, sanitation and hygiene, and preventative medicine. It is hard to remember that what we think of today as so "obvious" was (and is) actually invisible, and yet the knowledge base that this discovery generated is more important to our modern way of life than airplanes, computers or the Internet.


This great divide in the human story runs through time like a huge seismic fault line, cleaving history into two irreconcilable and uneven halves. After two thousand years of pre-scientific healing arts, the miracle finally happened a mere 130 years ago, when medicine became a true science. The eons of pre-germ history were an era of therapeutic impotence and of many iatrogenic practices that gave rise to the expression: "the cure is worse than the disease". Our recent and dramatic leap forward in human knowledge (and the many blessings it bestows) was accompanied by a dramatic shift from the classic era of hospitals as custodial institutions — in the business of providing bed and board — to the therapeutic-curative model of bio-medicine we appreciate today.

During the millennia of western medicine's pre-germ phase, ideas about what caused human disease were as different from today's as one can possibly imagine. People had superstitious or supernatural explanations for disease and accidental injury, commonly believing that disease and misfortune were caused by "the malice of a demon, the justice of an avenging god, the ill-will of an enemy, or the anger of the dead." [Walter Libby, M.A., PhD (U. of Pittsburgh), The History of Medicine in its Salient Features, The Riverside Press, Cambridge, 1922, p. 4] In response to these unfortunate but supposedly supernatural events, physicians did their best to reduce the suffering or treat the symptoms of the illness or injury, but their supportive care could do nothing to cure the underlying disease. Mostly they stood by and waited patiently until the hapless 'patient' either got well or died of his condition. The word for the persons who depended on such an uncertain system of medical care – i.e., patient – was well-chosen indeed, as one waited patiently or impatiently while one's fortunes, and even one's life, depended almost solely on Mother Nature, the passage of time and mysterious things that could not be controlled or even understood.

However, there was an aggressive minority of doctors who rejected the idea that illness was caused by demons or the disfavor of gods, and also rejected the idea that there was nothing a doctor could do to change the patient's fate. These doctors believed themselves to be on the cutting edge of medical theory by employing treatments originated by the "father of western medicine", the ancient Greek physician Hippocrates. The old Hippocratic system was based on the four humors (blood, phlegm, yellow bile, and black bile) and was a rather complicated theory that saw an excess of one or more of these mysterious elements as the cause of all disease. This ancient humoral system gave us language for body type and temperament that is still familiar to most of us. The 'sanguine' or robust type equated to the humoral type in which blood predominated; 'phlegmatic' referred to a preponderance of phlegm, and black bile produced the characteristic known as melancholia (in Greek 'melan' refers to the color black and 'cholia' to bile). Medical treatment of an illness consisted of determining the humoral type and then ridding the patient's body of its excess of the offending humor. Doctors believed it was possible to cure patients of their illness by restoring the balance of these theoretical body fluids.

Hippocrates' original methods for balancing the four humors included an improved diet, rest, fresh air, moderate exercise, and other forms of supportive care remarkably similar to the philosophy, principles and practices of naturopathy. Only in cases that did not improve with supportive care did Hippocrates believe that blood-letting should be used. As the centuries passed, doctors using the Hippocratic ideal of balancing the humors moved away from its more holistic methods; good food and fresh air were replaced by increasingly aggressive practices — emetics to induce vomiting, laxatives and harsh purgatives, and the frequent and extensive use of blood-letting.

For nearly 2,000 years, western medical doctors used therapeutic bleeding, opening a major vein or artery to drain off blood. At the same time, multiple leeches were often applied to the patient's skin to suck out additional amounts of blood. These early doctors assumed they had taken out the perfect amount of blood when the patient lost consciousness. To make this determination more efficient, patients were bled while sitting or standing and the treatment continued until they fell over in a swoon. The usual amount removed each time was from 10 to 24 ounces of blood, repeated at regular intervals for as long as the patient remained ill. This often went on for a period of months.

One historical record describes the use of these methods to treat a soldier stabbed in the chest, who was unconscious due to blood loss when first seen by the doctors. His emergency treatment consisted of being immediately bled of 20 ounces to 'prevent inflammation'. After watching him overnight, the doctors bled him six more times during the following 24 hours, taking 10 to 24 ounces each time, and leeches were also applied. All together, his doctors and the medicinal leeches removed a total of 13 pints of his blood during the first 30 days of their care. For his physicians, the fact that the patient lived confirmed the wisdom of Hippocrates' humoral system. [recounted in a 1993 history of Semmelweis by the modern surgeon Sheldon Lund]

Shipping records for the 1830s noted that France was importing 40 million leeches for medical purposes each year. Before doctors understood the role of bacteria and the idea of ‘contamination’, medical practitioners did not know to disinfect their hands or sterilize instruments between patients. Bacterial contamination resulted in wound infections, with some percentage of patients dying of septicemia (systemic infection of the blood). In addition, some patients were bled to the point of death — the fate of our own first president, George Washington, who was being treated by his doctors for a sore throat.

However, the story of 'heroic' medicine had not yet reached its pinnacle. The most far-reaching upgrade of Hippocrates' four humors took place in Europe during the Renaissance and produced what was known as the 'chemical concept'. A Swiss physician known as Paracelsus theorized that the human body was essentially a chemical system composed of three principles – mercury, sulfur, and salt – and that illness was a lack of balance among these elements.

Paracelsus' actual name was Philipp von Hohenheim, and he was also a practicing astrologer, as were most university-trained physicians in Europe. Astrology was central to Paracelsus' theory of medical treatment and led him to develop a different talisman for each sign of the Zodiac, along with several astrological talismans for curing disease and other maladies. On top of the ancient Greek concept of the four humors, Paracelsus superimposed his own unifying theory – a belief that the entire cosmos was fashioned from three spiritual substances: the tria prima of Mercury, Sulfur and Salt.

These substances were not the simple chemicals we recognize today, but were seen as philosophical principles that gave objects an inner essence as well as an external form. Mercury represented the transformative agent (fusibility and volatility); Sulfur represented the binding agent between substance and transformation (flammability); and Salt represented the solidifying and substantiating agent (fixity and incombustibility). The idea was "as above, so below". For instance, when a piece of wood was burned, the products of combustion were believed to reflect its own unique tria prima: smoke was seen as a reflection of Mercury, flame reflected Sulfur, and ash indicated Salt.

The tria prima also defined human identity. Sulfur embodied the soul (the emotions and desires); Salt represented the physical body; and Mercury epitomized the spirit (imagination, moral judgment, and the higher mental faculties). By understanding the chemical nature of the tria prima, physicians believed they had discovered the means to cure all human disease. Paracelsus himself created what he called a "Life Elixir" in 1538, a proprietary herbal formula said to support the function of all internal organs and the origin of the modern product known as 'Swedish Bitters'.

Unfortunately, the 'therapeutic' side of the tria prima as practiced by Paracelsus and his fellow physicians was not so benign. It began by determining the astrological onset of the illness and the chemical associated with that particular planetary influence. If the patient noticed the first symptoms while Mercury was rising, the divinely-appointed treatment was (obviously!) doses of the heavy metal mercury.

Depending on the specific astrological signs, a host of highly toxic, even poisonous chemicals would be administered over a period of many days, a treatment regime that also included the bleeding, purging and blood-letting prescribed by the humoral theory of Hippocrates. All this occurred under the rubric that strong medicine was necessary to balance out the even stronger cosmic influence of the planets. This chemical version of 'desperate times call for desperate measures' gave us the concept of 'heroic medicine'.

For many a hapless patient, the idea of 'better living through chemistry' turned into dying – the result of officially orchestrated chemical poisoning with arsenic, mercury, other heavy metals and toxic substances such as sulfur. These 'heroic' treatment regimes were harmful, therapeutically ineffective, or both. What was not to like?


The Story of Germs: from 17th-century "wee beasties" to 20th-century "wonder drugs"

Western medicine still considers the ancient Greek physician Hippocrates to be the Father of Modern Medicine, with 'modern' equating to our concept of scientific. By applying systematic logic to physical medicine, Hippocrates was the first to separate medicine from magic and superstition, thus earning his well-respected place in history. However, this brand of scientific thinking — the pearl of great price attributed to Hippocrates – had to fight a bitter uphill battle against ignorance, superstition, astrology, magical thinking, vested interests, organized religion and just plain wrong conclusions (including his own humoral theory) for the next two thousand years. It wasn't until the Age of Enlightenment in the late 17th century that this way of thinking and seeing the world morphed into the "scientific method", which eventually provided the foundation for modern medicine.

Aristotelian logic was (and still is) a process for testing individual ideas for validity. Logic begins as a mental discipline that uses a rigorous form of critical thinking; it distinguishes itself from pure philosophy by extending that inner world of cogitation to real-time observations in the natural world. That extension of logic from the inner to the external world is the origin of scientific experimentation, which provides the crucial ingredient – the opportunity for two or more people to 'verify' the results of scientific inquiry and thus establish what we know as 'facts'.

The extraordinary impact of logic is that it teaches us HOW to think – i.e., an on-going process — but not WHAT to think – i.e., a product or ‘dogmatic’ predetermined conclusion. Although this seems simple and self-evident to us today, this gemstone introduced the human species to the potential of its own internal mental capacity. Aristotle’s ideas changed the process by which people conducted themselves in the public and private arena.

To the great relief and benefit of humankind, it was this way of thinking that led to the discovery of disease-producing germs and broke the spell of thousands of years of superstition about the origin of disease and all the ineffective or harmful therapies imposed by a pre-scientific medical profession. Discovering the power of the mental world led men of 'ordinary' intelligence to invent the microscope, which in 1676 allowed a Dutch drapery merchant and part-time inventor to discover little "wee beasties" (bacteria) in a drop of rainwater, and, two and a half centuries later, led to the development of the anti-bacterial drug 'sulfa' by the German pathologist Gerhard Domagk, the first of many anti-microbial drugs that permit us to kill off the 'wee beasties' that make us sick. The story of how this came about is both interesting and instructive, since it is the pivotal point in our modern health and hospital care system and its current economic implications.

How an Accidental Discovery Met A Prepared Mind and Changed the World Forever

Advancements in human history are often described as "an idea whose time had come". Anyone studying the subject will be stunned to find out just how many of these 'breakthroughs' depended on the happenstance of lucky accidents – scientists looking for one thing and accidentally finding another, much more important discovery, or pursuing the answer to a minor problem when some little unexpected twist delivered the mother lode instead. However, human advancement is never simple, as our history is littered with examples of good ideas that were disregarded or actively repulsed, only to be suddenly embraced at a later time. The exact moment when a new idea or discovery is serendipitously incorporated into a pre-existing body of knowledge seems inexplicable – there is no rhyme or reason for why new information was disregarded in the first place, or for what happens to change resistance into acceptance.

But historians have also observed that human cultures are moved forward when an accidental discovery meets with a prepared mind. The vital ingredient is an individual who realizes the importance of seemingly random events or information that triggers an order of magnitude shift in thinking or in a capacity — things that would have been so easy to miss or dismiss. The discovery of germs is the quintessential story of accidental discoveries meeting prepared minds.

The modern science of biology began in the 17th century with the invention of the microscope. This led to the discovery of a world of biological structure and function on a microscopic level, which is where all the action is in living systems. The two people most central to this story were a formally-educated English biologist by the name of Robert Hooke and a contemporary of his, an unschooled Dutch merchant, Anton van Leeuwenhoek (pronounced 'Lay-when-hook'). These two men lived in different countries, did not speak the same language and never met in person. But happenstance brought the unique interests and talents of each man together and provided the key to crucial discoveries two centuries later.

Robert Hooke was among the first people to build and use a simple, single-lens microscope – an instrument that looked more like a magnifying glass than the two-lens microscopes now used in modern labs. Hooke's magnifying glass allowed him to examine the structure of plants and insects and make very detailed and accurate drawings of what he observed. By our standards, his equipment was painfully rudimentary – only able to magnify objects by 10 to 20 times. But he could see previously invisible details in materials such as fabric, cheese and the common flea, including the tiny walled chambers that made up a piece of cork. Because these chambers reminded him of the individual cells of monks in a monastery, he called these organic divisions "cells", a descriptive term we have used ever since.

Robert Hooke published his findings in 1665 in an illustrated book called 'Micrographia', which eventually made its way across the English Channel to Holland and was read by van Leeuwenhoek. Unlike the upper-class academic background of the English biologist, van Leeuwenhoek's was that of a tradesman: his father was a basket maker and his mother's family brewed beer. Van Leeuwenhoek, who was born in Delft, Holland, in 1632, worked in a number of manual jobs before becoming a drapery merchant. In order to better see the warp and weave and other fine details of the fabrics he was bidding on, van Leeuwenhoek ground an optical lens to make his own magnifying glass. After reading 'Micrographia', he was so fascinated by its detailed drawings that he fashioned a functional one-lens microscope so he "could see these things for myself". This was truly one of humanity's luckiest days.

In 1673, at the age of 41, van Leeuwenhoek used his microscope to peer into a drop of rainwater and became the very first person ever to see the world of micro-organisms. His skill at grinding lenses produced the best microscopes of his time, with magnifications of up to 200 power. This allowed him to observe a menagerie of tiny "animalcules" under his microscope. On April 26th, 1676, van Leeuwenhoek used an even stronger lens and for the first time saw bacteria, which appeared as wiggling threads, strings of beads, undulating rods and twirling spirals. Van Leeuwenhoek referred to these microscopic creatures as "wee beasties".

Even though van Leeuwenhoek was not trained as an academic researcher, he took meticulous notes and later described his findings in letters to the most renowned scientists of the day. What started out as a curiosity and part-time hobby turned him into one of the first people to use what is now called the “scientific method” — a clear ability to construct experimental procedures that are both rational and repeatable. Van Leeuwenhoek’s talent at analyzing problems became the foundation of scientific investigation and many of the ground rules he formulated in the 1670s are still used for scientific experimentation as it is done today.

Standing on the 17th-century shoulders of Hooke and van Leeuwenhoek, three other key players came together in the same interrelated and serendipitous manner to prove the germ theory of infectious disease and propel us into 20th-century 'modern medicine'. This was the direct result of scientific investigation by the French chemist Louis Pasteur, the English surgeon Sir Joseph Lister and the German physician-scientist Robert Koch. Their efforts spanned several decades, coming to fruition in a step-wise fashion between 1857 and 1881.

These three scientists were able to prove that contagious diseases and local and systemic infections, including puerperal sepsis (the frequently fatal septicemia following childbirth) and post-operative fevers, were all caused by microscopic pathogens. The old and incorrect theory of spontaneous generation was formally and forever replaced by the correct theory of pathogenic microbes as the origin of infectious disease. The development of modern biology and of western medicine as a science-based discipline affected far more than just medical practice; among the most important areas it transformed were public health and food safety.

Louis Pasteur – French chemist, inspired scientist, father of the Germ Theory

Those on the pathway to this great discovery were helped along by many unsung or unknown heroes of scientific investigation whose contributions are a mere footnote or lost to history altogether. However, it fell to Pasteur to be the right person at the right time – the prepared mind recognizing the accidental discovery – and thus to bring all these big and little contributions together into an extraordinary whole that was surely bigger than any one of its parts.

As a chemist, Pasteur was called on by the wine industry to provide a practical solution to the souring of wine. During his preliminary research, he discovered that microbes were responsible for the fermentation process that turned grape juice into wine, via the wonders of organic chemistry occurring at a cellular level. The 'wee beasties' in the yeast consumed the natural sugars (glucose and fructose) and, as a by-product of their metabolism, converted them into alcohol and carbon dioxide; the carbon dioxide escapes into the air, leaving the alcohol behind in the wine. However, the finished wine sometimes contains microbes that grow aggressively and spoil the taste of the bottled wine.

In April 1862, Pasteur figured out that heating wine to a temperature slightly below the boiling point could prevent the over-growth of unfriendly 'wee beasties' without changing the wine's taste. From this he deduced that heat was an effective way to kill microbes, and from this observation he perfected the method we still call "pasteurization". Harmful bacteria and mold spores in organic liquids — milk, fruit juices, honey, etc. — can be killed by slowly raising their temperature to about 180 degrees and maintaining it for a predetermined length of time. This is one of the most important public health measures ever discovered. Like the idea of hand-washing that was to follow, the principles of pasteurization can be easily understood by ordinary people and carried out with simple, inexpensive equipment that any householder had access to.

But the single most pivotal year in the fledgling history of 'modern' medicine came 19 years later, in 1881. That was the year that Louis Pasteur's 25 years of research in bacteriology culminated in one of those rare and dramatic moments in human history known as a paradigm shift, when the Germ Theory of Disease as we know it came into being. This great leap in biological science was the first of many revolutionary changes that ushered in bio-medicine and modern public health policy.

For the two preceding decades, several members of the medical profession had been increasingly interested in microscopic organisms and the idea that some of them might be pathogenic. During this time, Pasteur had also proved that all putrefaction – the rancid spoiling of meat and other biological tissue – was caused by microbes. But he could not yet scientifically establish that specific microbes were to blame for specific diseases. Except for Pasteur's process of pasteurization for wine and other liquids in 1862 and Lister's principles of antisepsis and sterile surgical technique introduced in 1867, there was still no widespread agreement about the role of microbes as pathogens, no uniform application of the new principles of bacteriology, and no universal agreement on techniques or "standards of practice" to prevent contagion between patients and contamination between the doctor and his patients.

Without irrefutable proof that each specific disease or infection was caused by a specific bacterium or other microbe (virus, fungus or protozoa), the medical profession was not willing to accept the germ theory, or, more to the point, was not willing to act on the idea. To do so would mean a dramatic change in the clinical practice of medicine – ideas like mandatory hand-washing and aseptic technique to prevent contagion and post-operative or postpartum infections. Doctors weren't ready to embark on a huge reformulation of the entire practice of medicine based on an unproven hypothesis.

Then came the day when simple words and a cartoon-like drawing on a chalkboard changed the trajectory of human history forever. At a prestigious meeting of the French medical society in 1881, Louis Pasteur presented his paper on the relationship between pathogenic micro-organisms and childbed fever. Pasteur was able to scientifically establish that a particular bacterium — hemolytic Streptococcus pyogenes — was the source of childbed fever, a potentially fatal disease that killed many new mothers and their babies. He also identified the pus-producing (pyogenes) hemolytic strep as the cause of "hospital fever", the post-operative infection of a surgical incision.

At this prestigious meeting in Paris, our wine chemist-turned-microbiologist sketched on a chalkboard a picture of what streptococcal bacteria look like under a microscope. After drawing a line of tiny organisms that resembled a minuscule string of tanker cars on a train track (a configuration first observed by van Leeuwenhoek in 1676), Pasteur announced to his prestigious audience: "Gentlemen, this is the cause of childbed fever". This pathogen — Streptococcus pyogenes — is also responsible for a long list of diseases old and new, such as erysipelas, necrotizing fasciitis, toxic shock syndrome, scarlet fever, otitis media, meningitis, endocarditis and pneumonia. The human family continues to be indebted to this tanner's son turned scientist extraordinaire for providing us with the process of pasteurization and establishing the germ theory of infection.

The invisible but nonetheless lethal power of germs had been unmasked for what it was – a killer – and beaten back. Understanding the connection between infection and the microscopic world of bacteria and other pathogens made a huge difference in public health policy, sanitation, medical practice and the role that hospitals played in society. During the decades between 1880 and 1910, the post-germ system of modern medicine rather quickly replaced the 'bad old days', as the connection between infection and the microscopic world of bacteria gave rise to the preventative principles of asepsis and surgical sterility. The use of germ-killing disinfectants meant that hospitals were no longer contagion factories constantly plagued by epidemics, and doctors used sterilized surgical instruments and sterile technique to prevent surgical patients from getting fatal post-operative infections.

These principles drastically reduced post-operative infections. Before the development of surgical sterility, the death rate of surgical patients was so high that it gave rise to the expression: "The operation was a success but the patient died". No matter how brilliant the surgeon's technical skill, it was all for naught if invisible bacteria were left behind to fester and invade the dark interior of the patient's body.

For example, the first sterile operating room was built at Johns Hopkins and opened for use by 1889. With post-operative infections drastically reduced, surgery was no longer a likely death sentence, and for the first time procedures could be performed 'electively': the odds of living through surgery outweighed the likelihood of dying – all based on new ideas in the biological sciences and on machines and technologies that hadn't existed even a decade earlier. This allowed hospitals to take on their modern role as the core of a 'healthcare' system, able simultaneously to provide the traditional supportive hospitality – feeding, bathing and caring for a patient's physical and psychological needs – and to deliver effective therapy that reduced the length and damage of disease. The marriage of classic hospitality, new technologies and ever-advancing therapeutic success created a new age and a new appetite for hospital services and 'the care of strangers'.

Slightly more than a decade after Pasteur's ground-breaking discoveries, the German professor Wilhelm Roentgen discovered X-rays (1895), a form of radiation that quickly produced a new medical technology. The combination of these two scientific discoveries kicked medical care into a whole new realm of effectiveness. New laboratory tests could identify a specific strain of bacteria, while x-ray examinations could pinpoint a diagnosis by providing a picture from inside the patient's body.

In its own quiet way, this was medical science's equivalent of a moon landing, a core understanding about human biology that sparked a sudden shift in thinking and practice. Almost overnight, humanity was taken from many millennia of ignorance – a dark and ignoble epoch of superstition among ordinary people and mistaken ideas among medical practitioners – to knowledge of the microbial world as the common property of humanity. This ushered in a newly enlightened world defined by the new scientific disciplines of microbiology and bacteriology. These biological sciences continued to advance and refine the use of antiseptic practices, disinfectants, aseptic principles, sterile techniques and new diagnostic abilities.


In contrast to previous centuries, the scrupulous washing of one's hands finally became the hallmark of a scientifically-trained professional. From a public health standpoint, hand-washing was a gift to humankind like pasteurization — a hugely important scientific advancement at the most fundamental level. As a principle (instead of a patented technology), it could be used freely by all without the need to purchase or maintain expensive equipment. Hand-washing was something anyone could understand and use for the betterment of the human condition. Yet until Pasteur was able to prove the connection between a specific strain of bacteria and childbed fever in 1881, many in the medical profession had refused to take simple precautions like hand-washing and aseptic technique seriously.

Without the germ theory and the body of knowledge it generated, most of the extraordinary advances in the world of scientific medicine would have been impossible or greatly hampered. Standing on the firm foundation of microbiology and bacteriology, other scientific disciplines — anatomy, biology, chemistry, immunology, physics, and physiology – were able to make their unique contributions to the practice of medicine, including the rise of modern obstetrics as a surgical discipline. Practical applications of these wonderful advances were translated into diagnostic tests, medical treatments and surgical procedures, which could never have been safely offered to patients were it not for the use of aseptic principles and sterile technique. Over the course of the 20th century, cures for many previously chronic, disabling or deadly diseases were discovered.

The New Age of Institutional Care

These wonderful new abilities, soon joined by x-ray machines that could be used to actually see into the human body, helped redefine the potential of 'modern' medicine to treat and often cure disease. Yes, Houston, the Eagle has landed, bringing us a bright new future for humanity made possible by the science-based practice of medicine. Hospitals of the early 20th century were eager to take advantage of all these new ideas by purchasing and using the scientific equipment that made it all possible – microscopes, sterilizers, autoclaves and x-ray machines.

By 1910, every hospital was eager to have at least one of each of these new but vital pieces of medical equipment and to take its place in this brave new world of safe and therapeutically effective medical care.

Patients and professionals alike were happy to turn the corner on the bad old days and bad old ways and march forward into this brave new world. The future of medical care in the US could not have looked more promising.

Section three – The Great Mismatch between what a hospital could offer and what a patient could afford to pay for, new customers, and the birth of 'elective' hospitalization