Wednesday, March 26, 2014

How To Act When We Don't Know Something!

After three years of grueling, hands-on training in Internal Medicine at one of our nation’s greatest hospitals, I’d like to think I’ve seen it all.  But the truth of the matter is that I have just skimmed the surface of the ocean of disease that is out there.  I found that even my senior attendings, after decades of caring for patients, would get stumped at times.  This got me thinking.  What guides us in treating patients whose diagnoses remain a mystery, or whose illnesses are not the “bread and butter” cases we are taught in medical school and residency?

In November of 2010, I was called to the Emergency Department to evaluate a woman who had been sent in by her rehabilitation facility because she was not acting like herself.  Ms. Jackson (as I will refer to her here in order to keep her identity confidential) was a 51-year-old woman who had recently been diagnosed with Human Immunodeficiency Virus (HIV) infection.  She had been sent to the rehabilitation facility after presenting to a local hospital with weakness, where she was found to be anemic.  Further testing revealed some inflammation of her stomach lining, so this was treated and she seemed to be improving.  But over the ensuing two weeks, her ability to perform higher-level cognitive tasks began to decline.  Her daughter noticed it first, and then medical personnel began to pick up on overt signs of “encephalopathy” – a generalized disturbance of brain function.  So they brought her to the hospital, and this is where I met her for the first time.

Ms. Jackson was not the typical case of encephalopathy that came through our emergency room doors.  In the barrage of blood tests and imaging that most patients with this symptomatology are subjected to by the first physician who treats them, a hint is usually uncovered that leads to the answer.  Then I start the appropriate treatment as they continue to be cared for on the medical ward, and sometimes the patient gets better.  Things were different for Ms. Jackson.  Her blood tests and imaging were of no help.  I even drew fluid from around her brain and spine – the infamous “spinal tap”.  This, too, was unhelpful.

So I reset my thoughts and started over with her.  There were a few aspects of her physical exam that were a bit unusual.  For one, she was extremely cachectic – severely wasted.  This pointed to the chronicity of her disease and was most likely due to the HIV infection itself.  She was also “catatonic” – her facial expressions were largely frozen in place, and it seemed as if she wanted to speak but did not have the energy to even move her mouth.  The most curious part of her exam was the apparent loss of the usual spinal reflexes in her arms and legs.  We asked specialists in neurology to assist with the diagnosis, and their recommendation was to send her blood to be tested for the presence of certain antibodies that can attack nerve cells – specifically, one known as the GQ1b antibody.  The problem with this test is that the result can take up to two weeks to return.  In the meantime, we tried every antibiotic and steroid available to us to treat Ms. Jackson, hoping that one of them would spark her recovery.  But her illness prevailed, and eventually she began to have seizures and difficulty breathing.  After many lengthy discussions with the family, we decided it would be best to pursue comfort care for Ms. Jackson so that she would no longer suffer.  After two weeks of merciless testing, imaging, and experimenting, Ms. Jackson left our ward to die at home in peace.

The next day, her GQ1b antibody test came back – much to our surprise, she was found to have a small amount of this rare antibody circulating in her blood!  Now what?  Over the last week of her hospitalization, her family had begun the coping process that ultimately culminated in their accepting her imminent death.  Was it ethical for me to call her family and tell them to bring her back to the hospital?  What does the presence of this antibody even mean?

I went to the medical literature for guidance.  It turns out that the GQ1b antibody is not very well understood.  What we know is that the membranes of the nerve cells in our brain and spinal cord contain substances called “gangliosides,” which can sometimes become the target of antibodies that our own body spontaneously produces as a result of an intestinal infection or an abnormality of the bone marrow.  This is rarely observed, and because of this, much of what we know about the diseases caused by this antibody comes from individual case reports.  The antibody has been connected to the Miller Fisher syndrome and Bickerstaff brainstem encephalitis, both very rare conditions – and both sharing features that were present in Ms. Jackson.  Although the quantity of antibody present in Ms. Jackson’s blood was lower than in the usual patient with these syndromes, the laboratory result in front of us raised the question of the next step.  In a patient still hospitalized, there would be a discussion of whether a procedure called “plasma exchange” would be beneficial.  It involves placing a very large catheter into one of the patient’s central veins, removing blood from the patient’s body, filtering it through a dialysis-like machine, and then returning the cleaned blood to the patient, free of the harmful antibodies.  But laboratories sometimes make errors, and what if the low level of antibody to GQ1b in Ms. Jackson’s blood was a mistake and didn’t mean anything at all?

With Ms. Jackson, there remained the bigger question of whether interrupting a coping family’s mourning of a soon-to-be-departed family member was justified on the basis of a potential neurologic diagnosis whose outcomes after treatment are by no means predictable or guaranteed.  And what about a single laboratory test whose significance is not fully understood and, frankly, may not mean anything at all?  As a comparison, some patients have detectable levels of antibodies associated with rheumatoid arthritis circulating in their blood without ever developing symptoms of arthritis.  Given these uncertainties, we decided not to interfere with the dying process.  This was not a blocked coronary artery that would lead to certain death if left untreated.  With Ms. Jackson, there were too many “ifs”.  So she passed peacefully with family at her side, and we were left to wonder whether pursuing aggressive therapies in this dying young woman would have changed anything.

Our current practice of medicine is based simply on what we know, and what we know is heavily dependent on our past experiences of human disease.  Yet we know that there is a wealth of knowledge still to be discovered, and hundreds of diseases still to be named.  So why should anything ever be deemed impossible?  One hundred years ago, we thought that heart attacks could only be treated with bedrest – now we have stents and surgery that have allowed people to take back their lives.  Thirty years ago, we believed all stomach ulcers were caused by stress – now we know a large portion of them are caused by bacteria that can be successfully treated with antibiotics.  In fifty years, I may look back at Ms. Jackson’s case and see a clear diagnostic or therapeutic option that we had never tried.  But such are the shortcomings and beauties of medicine.  Practicing medicine in the 21st century, where our imaging and laboratory testing have sometimes accelerated past our knowledge of human disease, it has never been more important to uphold the ethical principle of non-maleficence (“first, do no harm”) to help guide us in making these tough decisions for our patients and their families.

Doc Veritas

Tuesday, March 18, 2014

Doctors versus Nurses


Doctors and nurses have shared the common goal of patient care for centuries. They selflessly devote their lives to bettering the health of the rest of us. So why would there ever be conflict in the work environment when these two professionals attempt to share responsibility for the sick and dying?

Every doctor has been there.  It’s 3:00 AM, and you’re just entering REM sleep after somehow finding a comfortable position in the call room.  Your body is in a heavenly state of paralysis, and the only muscles doing any work are those in your eyes, busy making them dart back and forth in synchrony.  Then comes the feared jolt of your pager’s vibration, followed by a screeching alarm that signifies that someone, somewhere, needs you.  You clumsily punch in the numbers on your cell phone and await a human voice.  A young nurse picks up.  She is taking care of 78-year-old Ms. Jackson, whom you admitted from the emergency department earlier that evening for a urinary tract infection, and wants to know if she can give the patient a stool softener, since her thorough review of systems picked up the fact that Ms. Jackson had not had a bowel movement in four days.  It’s 3:10 AM by now.  There is no chance of enjoying another round of REM sleep before your morning rounds start at 6:00 AM.

Every nurse has been there.  You’re assigned to Mr. Stephens, a 55-year-old man admitted to the hospital last night for a bout of chest pain.  While you’re getting settled for the beginning of your shift, he calls for his nurse and asks why he hasn’t had any meal trays delivered today.  You review his orders and see the infamous “NPO” order (nil per os, Latin for “nothing by mouth”).  You tell him that you’ll check with his physician, as there may be a good reason for this.  You page the physician and await the callback.  Meanwhile, Mr. Stephens grows more and more restless and threatens to leave the floor to buy his own food from the cafeteria.  You try to talk him down and simultaneously send a second page to the physician.  Finally, there is nothing more you can do to stop him, and he goes off the floor to satisfy his craving for the cafeteria cheeseburger.  How are you expected to know the plan of care if no one has made any effort to keep you abreast of the details?  The physician finally calls back, initially with profuse apologies for his delay, which quickly degenerate into scathing remarks directed at you for allowing his patient to eat just as he was about to take him for his cardiac catheterization.

Before getting into the differences between doctoring and nursing, it may help to look back at the beginnings of both professions.  Healthcare providers date back to at least the 27th century BC, when Imhotep, advisor to the king of Egypt’s 3rd dynasty, made detailed accounts of anatomical observations, various diseases, and their observed cures.  Imhotep was also a polymath of antiquity, serving as advisor to the king, inventor, architect, engineer, philosopher, and priest.  Physicians arose independently in other civilizations such as India and Mesopotamia around 1000 BC.  Around 700 BC, the Greeks formed the first known medical school, which soon led to the time of Hippocrates, considered the “father of modern medicine.”  He wrote the Hippocratic Oath, which continues to guide the practice of medicine and is still recited at medical school graduations to this day.  The original oath includes such promises as “to do no harm”, “to not give a pessary to cause an abortion”, and “to keep confidential all matters of the doctor-patient relationship”.  The 19th century saw major paradigm shifts in our understanding of disease, with new ideas and discoveries from Charles Darwin (evolution and natural selection), Gregor Mendel (genetics), Ignaz Semmelweis (antiseptic technique), and Louis Pasteur (germ theory).  These legends of medicine revolutionized the field and opened the door to new discoveries that advanced medical therapies and extended life expectancies to what they are now.

During the Crimean War of the 1850s, fought between the Russian Empire and an alliance that included the Ottoman Empire, France, and Britain, Florence Nightingale was an Englishwoman who rose to fame as she tended to wounded soldiers.  She had already displayed a very independent nature when she went against the grain of a woman’s typical housebound role in her day and entered nursing in 1844.  At the military hospital, she gained the nickname “The Lady with the Lamp” as she made her rounds late at night with a lamp among the barracks’ many injured.[1]  Nightingale was best known for bringing to attention the poor living conditions of the soldiers at war.  Ninety percent of the casualties during her first winter at war were from illnesses such as typhus, typhoid, cholera, and dysentery rather than from battle wounds.[2]  After the war, in 1860, she founded the first secular nursing school in the world at St. Thomas’ Hospital in London.  Akin to the Hippocratic Oath, the Nightingale Pledge was created in 1893 by Lystra Gretter, an instructor of nursing in Detroit, Michigan, and is recited by new nurses upon entering the profession.  The pledge includes promises such as “to never administer harmful drugs,” “to maintain confidentiality,” and “to aid the physician in his work”.  Since then, nursing has taken a more holistic and assertive approach to patient care.  Nurses are no longer passive members of the medical team.  They now take a more proactive role in the care of their patients, anticipate the needs of both patients and physicians by understanding the pathophysiology of disease to facilitate the efficient delivery of healthcare, and serve as a final checkpoint against the errors we all hope to eliminate from medicine.

The similarities between doctors and nurses are obvious from their historical origins as recounted above.  Both professions got their start from highly dedicated and diligent people, and both are held to the highest standard by oaths that pledge their commitment to a noble cause.  But the most striking difference in their stories is the millennia of ancestry behind the physician’s trade.  Doctors have frankly been around for much longer.  Nursing is by far the younger profession and, thus, is still experiencing an evolution of its role in healthcare.  In the last few decades we have seen nurses leave the hospital wards and enter the role of primary care provider as nurse practitioners, helping to fill the growing shortage of general practitioners needed to care for our populous country.  We have seen other nurses leave the bedside altogether to join the ranks of hospital administration or to manage large research projects that produce some of the breakthrough discoveries guiding our new therapies.  For young doctors, the legends of eras past like Hippocrates and Semmelweis serve as a reminder of the nobility of their profession and the timeless care it has given to its patients.  So much of what we do as physicians is based on habits of the past, because that is what we know.  And what we know comes from these legends who changed the way we deliver healthcare.

Nurses see things differently.  Because of their origins and perspectives on patient care, they have looked across disciplines as a means of strengthening who they are and what they can provide.  They bring in physical therapists, nutritionists, social workers, substance abuse counselors, clergy, and cafeteria workers to deliver a complete package of healthcare to the patient along with the pills that doctors prescribe.  They see success not in giving the right steroid or the most proven form of chemotherapy, but instead in keeping a patient mobile during a hospital stay to prevent blood clots in the legs; in turning an immobile patient frequently enough to prevent pressure ulcers; in helping a diabetic learn to pick healthy options from a cafeteria menu; in spending more than the two minutes a doctor spends convincing a smoker to lay off the cigarettes; in talking to patients about social support at home and why this was their third time in the hospital this month; and in asking patients whether they are sure there is nothing else that can be done for them.  With the reduction in physician reimbursements and the need for more detailed documentation in our current litigious environment, doctors are spending more time behind a computer or in a chart than they are at the bedside.  Efficiency is great, but a hospital’s drive to coordinate multiple tests on a single day in order to speed up discharges also has the effect of keeping patients out of their rooms on morning rounds, when medical students and residents sometimes have their only chance to learn the practice of medicine at the bedside.  With all of these trends, nurses will soon find themselves the only part of the medical team delving into the lives of our patients to explore the intricacies of the social world that blends with so much of their physical health.  And with this comes more responsibility to effect change in these realms for the betterment of patient care.  It is well known that wards staffed with nurses who are good at what they do have documented improvements in patient outcomes.[3]

On my medical ward at the hospital, doctors and nurses had a very collegial relationship.  There was a team approach to medicine that I seldom saw on other wards.  But there is one thing I still observe that I wish would change.  I wish that nurses would stop having to refer to doctors as “Doctor.”  Most patients will refer to their physician as “Doctor,” and I am no different.  I enjoy the peace and trust that is implied when referring to the person to whom I am surrendering control of my health and well-being.  But when I am working with a team of individuals who are not the recipients of these gifts but who instead help deliver the very same to my patients, the title has less relevance and instead creates a rift that I believe is destructive to the team.

Let’s take a closer look at the word itself.  The word “doctor” comes from the Latin doctor, meaning “teacher” (from docere, “to teach”).  It dates back to 1088 AD, when Western civilization’s first university, the University of Bologna in Italy, conferred the title on its graduates, specifically those of law.  The title was then extended to graduates of other fields such as philosophy and medicine in the 13th century.  Although graduates of law in some countries such as Italy, Spain, and Portugal continue to refer to themselves as “doctors”, the title was lost in England and America because the practice of law was taught as an informal and undocumented apprenticeship until the 19th century, when lawyers were once again required to hold a formal university degree.  Even then, the title never caught on, and lawyers here are seldom referred to as “doctor” outside of certain conferences or academic meetings.  In its stead, the “doctor” title has been extended to many other degrees, such as optometry, pharmacy, chiropractic, and even ayurvedic and homeopathic medicine.  What it means to be a “doctor” has become vaguer and more diffuse than its roots in 11th-century Italy.

The doctor-nurse relationship will continue to evolve through this era of medicine.  We are sure to see new roles develop as healthcare becomes more streamlined with electronic medical records and genetic predictions of disease.  But one thing is for sure – the 21st-century-trained physician is more dependent than ever on the nurse to supply top-notch care to his patients.  And the only way to do this is to work as members of a team, with the patient as its leader.

Doc Veritas

[1] Cook, E. T. The Life of Florence Nightingale. Vol. 1 (1913), p. 237.
[2] Nightingale, Florence. Florence Nightingale: Measuring Hospital Care Outcomes (1999). ISBN 0866885595.
[3] Blegen M, Goode C, Reed L. “Nurse Staffing and Patient Outcomes.” Nursing Research 1998;47(1):43-50.