The marble and granite statue in Boston's Public Garden depicts a physician in medieval clothing holding a cloth to the face of a man who seems to have passed out. An inscription on the base of the statue reads “To commemorate that the inhaling of ether causes insensibility to pain, first proved to the world at the Mass. General Hospital in Boston, October A.D. 1846.” No names are mentioned.
It was on Oct. 16, 1846, that dentist William Morton ushered in the era of surgical anesthesia by putting printer Gilbert Abbott to sleep with fumes of ether from an inhaler he had devised. Surgeon John Collins Warren then proceeded to remove a tumour from the patient’s neck without any of the usual screaming or thrashing about.
Warren looked up at the doctors who had witnessed the event in the surgical theatre that would become known as the “ether dome” and proclaimed, “Gentlemen, this is no humbug.”
That was in reference to a failed attempt by another dentist, Horace Wells, to demonstrate anesthesia with nitrous oxide, or laughing gas, at the same hospital. In that case, Wells hadn’t waited long enough for the nitrous oxide to take effect and the patient howled in pain as Wells attempted to extract a tooth. He exited in disgrace to the cries of “humbug.”
Although Morton gets credit for the first organized demonstration of ether anesthesia, he certainly was not the first to experiment with the chemical. The sleep-inducing effect of ether was first recorded 300 years earlier, when famed Swiss alchemist, philosopher and physician Paracelsus noted that its vapours would induce a state of unresponsiveness in chickens. Ether does not occur in nature, so where did Paracelsus get it?
In 1540, German physician and botanist Valerius Cordus discovered that heating alcohol with sulphuric acid, then known as oil of vitriol, yielded a new highly flammable substance with a characteristic smell. Vitriol was the archaic name for compounds that today are termed “sulphates.”
Cordus discovered that heating a solution of green vitriol, or iron(II) sulphate, a naturally occurring mineral, yielded “oil of vitriol.” Then in the 17th century, German-Dutch chemist Johann Glauber found that burning sulphur with saltpetre (potassium nitrate) produced sulphuric acid.
Potassium nitrate decomposes to yield the oxygen needed to convert sulphur to sulphur trioxide, which dissolves in water to produce sulphuric acid. In the 19th century, potassium nitrate was replaced by vanadium pentoxide, which acted as a catalyst allowing for easier production of sulphur trioxide. This was the method used to produce the sulphuric acid needed for the synthesis of ether in the 1800s.
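For readers who like to see the chemistry spelled out, the reactions described above can be summarized as follows. This is a textbook-level sketch; the precise temperatures and industrial conditions are not from the account above and varied over the centuries.

```latex
% Sulphuric acid by the vanadium pentoxide (contact-process) route:
% burn sulphur, oxidize the dioxide over the catalyst, dissolve in water.
\begin{align*}
\mathrm{S} + \mathrm{O_2} &\longrightarrow \mathrm{SO_2}\\
2\,\mathrm{SO_2} + \mathrm{O_2} &\xrightarrow{\;\mathrm{V_2O_5}\;} 2\,\mathrm{SO_3}\\
\mathrm{SO_3} + \mathrm{H_2O} &\longrightarrow \mathrm{H_2SO_4}\\[6pt]
% Ether from alcohol, the acid-catalyzed condensation Cordus stumbled upon:
2\,\mathrm{C_2H_5OH} &\xrightarrow{\;\mathrm{H_2SO_4},\ \text{heat}\;} \mathrm{C_2H_5OC_2H_5} + \mathrm{H_2O}
\end{align*}
```

Note that the sulphuric acid in the last step acts as a dehydrating agent and catalyst: it is regenerated, which is why a single batch of “oil of vitriol” could convert a good deal of alcohol into ether.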
Before ether’s triumphant performance in 1846 at Massachusetts General, it had developed a reputation as a recreational substance. Middle-class partygoers and medical students both in Europe and America frolicked under the influence of ether. More curiously, drinking ether was common in Europe and was particularly popular in Ireland, where the Catholic Church promoted abstinence from alcohol and asked people to pledge not to drink alcohol. Drinking ether was a way to get around the pledge. Ether was sold in pubs and shops until the 1890s, when it was classified as a poison.
Dr. Crawford Long had taken part in ether frolics as a medical student at the University of Pennsylvania, and when he took over a rural medical practice in Georgia in 1841, he recalled that ether frolickers sometimes developed bumps and bruises to which they seemed oblivious.
He now wondered: could ether be used to relieve pain? The answer came in 1842, when he used an ether-soaked towel to put James Venable to sleep before proceeding to excise two tumours from his neck. Long went on to perform a painless dental extraction, and later delivered his wife’s second baby with the aid of ether anesthesia. But Long was no academic; he had no interest in publishing, nor did he crave fame or fortune.
It was two years after William Morton’s celebrated demonstration that Long documented his efforts in the Southern Medical and Surgical Journal in a paper titled “An account of the first use of Sulphuric Ether by Inhalation as an Anaesthetic in Surgical Operations.”
He described a number of cases, including that of a boy who had two fingers amputated, one while etherized and one without. Long reported that the patient suffered terribly without ether but was insensible with it. The reason he had waited to publish, he said, was the need to overcome criticism from local colleagues, who had suggested that the ether effect was just an example of mesmerism, which at the time was promoted as a pain-reduction method.
With his publication, Long added his name to the list of people claiming to have been the inventors of ether anesthesia. There was William Morton, of course, and Charles Jackson, a physician who had given up medicine to establish a private laboratory for analytical chemistry, where he also taught students, including Morton, who had come to expand his scientific knowledge.
Jackson claimed that he had introduced Morton to ether anesthesia, and the two got involved in a rancorous battle for years. There was also a Berkshire Medical College student, William E. Clarke, who claimed he had first used ether to put patients to sleep.
It was because of the controversy that the Boston monument does not bear the name of any of the claimants. But it does bear a biblical quote from Isaiah: “This also cometh forth from the Lord of Hosts which is wonderful and excellent in working,” addressing the worry people had that relief of pain was somehow interfering with God’s will.
The quote suggests that medical intervention is itself a gift from God and is backed up by a relief on the statue depicting a woman who represents Science Triumphant sitting atop a throne of test tubes, burners and distillers, with a Madonna and Child looking on with approval. There is also a Civil War scene on the side of the monument with a Union field surgeon standing ready to amputate a wounded soldier’s leg. The soldier sleeps peacefully. Thanks to ether, he would feel no pain.
Dr. Joe Schwarcz
Recently, there’s been a surge of media attention on guts. More specifically, on the microbes that live in your gut. Extensive research is being done on these little guys, as they seem to be having a real impact on our health. These gut microbes may be minuscule, but their function is major. And I learned all about them at “The Secret World Inside You” exhibit now on display at the American Museum of Natural History in New York.
Before I begin walking you through the exhibit, first a brief explanation of what microbes even are. Microbes are living organisms so small they can be seen only with the help of a microscope. And they are everywhere – in every fold and lining of our bodies, inside and out. They govern the world within us and are responsible for much of how we function.
Our skin is the first point of contact for microbes, which is most probably why it’s the first section you get to in this exhibit. No two individuals have the same microbiome. However, what came as a real shocker was the fact that people living together – families, roommates, and yup, pets too – share a similar microbial make-up. So much so, that when one person leaves the nest for a few days, the microbiome of the house shifts until they return home again. Pretty sweet, no? Everyone sharing the same types of microbes… (It could also be slightly gross if you think about it too much, so just don’t.) It was also pointed out how certain microbes, as distant as they may seem, are actually closely linked. Let’s take cheese, for example. The holes in Swiss cheese are made by a bacterium similar to one found on our skin, and a skin bacterium also explains why (some) feet take on a cheese-like smell. On feet, Brevibacterium linens converts amino acids in sweat into smelly compounds, but in the world of dairy it serves to ripen Limburger cheese. Delicious? Depends.
Now perhaps it’s my age and the fact that my ovaries now twitch on a regular basis thanks to all the babies on my FB feed, but the next section of the exhibit was hands down my favourite: “Before Birth”, the world of the baby and the microbiome of the mama. One would think that the two are inextricably linked, since the fetus is totally reliant upon the mother; however, to my surprise, the mother’s microbiome does not mix with the fetus at all. In fact, if it did interact with the fetus, it could be very risky. And it’s thanks to the placenta, the gatekeeper in this whole process, that the two don’t mix. After visiting this exhibit, I developed a whole newfound respect for the placenta, which serves a pivotal function: letting nutrients and oxygen through to the baby while keeping just about everything else out.
Now once a woman’s water breaks, all bets are off. The baby is now cooked enough not only to mingle with its mother’s microbes but to start developing a microbiome of its own. And the birth canal is where this all happens. As the baby travels through the canal, the mother’s microbes get pressed into its skin, nose and eyes, and are even swallowed, eventually reaching the baby’s gut, where they seed its own gut microbiome. This process is crucial to the development of a baby’s healthy immune and digestive system. (How awesome!) But you may be wondering (as was I) about C-section deliveries, since those babies do not pass through the birth canal picking up the mother’s microbes along the way. Instead, these babies pick up microbes from the doctor’s hands and the environment. These end up lining the baby’s digestive tract and in turn affect the immune system, leaving C-section babies at higher risk of a variety of conditions, such as asthma and allergies. To test this, studies are now being done in which the baby, immediately after a C-section delivery, is swabbed with a gauze pad that had soaked up microbes from the mother’s birth canal just before birth. Time will tell whether this benefits the baby, but most signs point to yes, which is good news since about one mother in three in the United States now gives birth by C-section.
As life goes on, microbes live, grow and multiply based on what we feed them. In other words, the food we eat and the choices we make influence our gut bacteria. This has spawned a huge new area of research on individual variation in weight gain and loss, another section of the exhibit I found fascinating since, like most people on the planet, I have a few pounds that just won’t relent.
Different people react to different foods in different ways. This is not a novel idea; just look at allergies and adverse food reactions. Some people have them, some don’t. But what if this can be attributed to the type of microbes living in your gut? Let’s take a “healthy” food like a tomato, for example. Could you imagine someone’s blood sugar spiking after eating tomatoes the same way it would after eating a donut? Research has shown that for some people this is exactly the case, while in others tomatoes have no spike effect at all. This whole new line of research could be a breakthrough in terms of weight control. Costly, but important. I know I’d be among the first to sign up to find out just what type of bacteria I have going on in my gut. Of course, as the exhibit points out, we cannot yet know whether obese people are obese because of their microbiome or whether external factors shaped their microbiome in the first place. It’s a chicken-or-egg debate, and we shall leave it to science to continue the research.
After leaving the exhibit, I realized that the microbiome is truly a hotbed of scientific research. We know so much but at the same time there are so many question marks about how we can use, manipulate, and alter our microbiome to enhance our health. And I am confident that science will, at one point or another, provide us with these answers; but until then, I’m just going to hope that my gut bacteria interact favourably with tomatoes.
You can visit “The Secret World Inside You” at the American Museum of Natural History in New York, where it will be on display until August 2016.
Last week, the news carried a shocking admission from executives at the National Football League in the United States: they finally acknowledged a link between football-related head trauma and chronic traumatic encephalopathy, or CTE, a degenerative brain disease well known to be caused by repeated blows to the head, such as concussions. What I find shocking about that headline is not the admission of the link, but that the NFL was able to deny the link’s existence until now simply by choosing not to believe it was there.
A little closer to home, the news over the past few days has contained stories of email conversations among the top brass at the NHL that show them debating the merits of fighting in hockey by exchanging their beliefs over whether repeated concussions may lead to mental illness, brain injury or addictions later in life. As if these corporate jocks have the medical knowledge required to understand the issue well enough to be qualified to decide it. When medical experts have made pronouncements on this issue, they say unambiguously that getting punched in the head repeatedly over one’s career is very likely to cause brain damage of one kind or another.
How is it that scientific specialists, like the medical researchers in these examples, are so easily dismissed as irrelevant in the face of someone else’s belief in something else? This ability to whimsically discount science in favour of a more convenient belief is not restricted to sporting organizations. We know it can be observed at a larger scale, both geographically and destructively speaking, in the anti-scientific dismissal of climate change. How many times have you heard an online poll or a talk-show pundit ask whether or not we should believe in climate change? Unfortunately for those pollsters, brain injuries and climate change are real whether you believe in them or not.
The thing is that our medical knowledge of brain injuries, like our vast knowledge of the earth and its climate, is not based on beliefs at all, but on thousands upon thousands of accumulated and mutually supportive facts that have withstood attempts at falsification. This is the basis of the scientific method, which is simply a process of observing and describing the universe and explaining its properties and behaviour. Incidentally, it is also the most reliable and robust tool we have found to date for sorting fact from fiction in the natural world.
One of the most pervasive problems in society today is the mistaken equivalence of a specialist’s knowledge and understanding of facts with a non-specialist’s beliefs in something else. Belief and knowledge are incompatible with one another, and only one of the two is a reliable way of arriving at the truth. ‘So what?’, we may ask. Perhaps I am just an academic fuddy-duddy arguing about semantics, and we should let people live and let live, each with their own thoughts and beliefs. Well no, I say. This is a problem with real consequences for people, such as the ex-hockey enforcers dying from brain damage or the millions of people already being affected by runaway global warming.
On a very basic level, this important issue often hinges on the flawed equivalence of belief and knowledge in society. The two terms are so dissimilar that they are more like opposites than synonyms. Acknowledging this alone would go a long way toward improving the health, safety and prosperity of society. As a starting point, it may be helpful to examine the meaning and usefulness of each of these two words as concepts.
Belief is the acceptance that something is true through trust, faith or confidence. There is no requirement for evidence in order to believe in something, making belief a useful option for simplifying difficult concepts to the point of absurdity or for dismissing inconvenient truths as irrelevant. Knowledge, on the other hand, is the theoretical or practical understanding of a subject. The key word in that sentence is ‘understanding’, because to truly know something, one must be able to describe what, how and why it is what it is… and this requires facts as evidence.
Obviously, the only reliable path to understanding something is to know it, and knowing it in turn requires understanding it: the two support one another in a kind of conceptual symbiosis. There is no place or need for belief in this process. In fact, beliefs are useless for generating knowledge because they have no power to explain anything.
Historically, belief was used as a means of explaining the unknown, producing such baseless claims as that the earth is flat, or that humans did not evolve but instead appeared magically through divine intervention. In many ways, belief continues to have significant influence today. Over time, however, scientific advances have allowed us to replace most beliefs about things with knowledge of things, making faith and belief increasingly unnecessary.
Furthermore, when knowledge is not available because of a gap in our understanding, there is no shame in admitting that we don’t know. It is certainly a more noble approach than inventing a belief-based explanation for which we have no supporting evidence. In fact, the ability to acknowledge what we do not know is arguably the most important component of having knowledge. Socrates famously pondered the nature of knowledge by suggesting that wisdom begins with admitting that one does not know what one does not know, or something along those lines.
Perhaps it was paraphrased more effectively by then-U.S. Secretary of Defense Donald Rumsfeld, justifying the attack on a country despite the lack of evidence that it actually posed a threat, when he said that “there are known knowns, there are things we know we know; there are known unknowns, that is to say there are things we know that we do not know; but there are also unknown unknowns, the ones that we don’t know that we don’t know”.
I couldn’t agree more with Rummy! However, the solution to the conundrum of both the known unknowns and the unknown unknowns is not to invent a friendly fact that may likely be untrue (and often is). The right thing to do is to say that we don’t know but will try to find out. This is the only honest path to the truth and one that is built into the scientific method of knowledge building.
Whenever someone asks me what I may believe about something or other, I always answer that I don’t believe anything at all; I either know something or I don’t. That fact also applies to everyone else as well, believe it or not! Personally, I like to think that Yoda would have said it best: “Know or do not know, there is no belief”.
Adam Oliver Brown, Ph.D.