Monday, October 23, 2017

Technology is unkind to the elderly


In about 1990 we got our first computer. I say "we" because my husband and I shared a desktop which I used infrequently, mostly for word processing; the internet was very young. At the turn of the millennium my children were using computers, and they were limited to one hour of computer time a day on a shared desktop. By 2001 I was communicating with my grandmother, then in her late 80s, by email.

My grandmother had received, probably just prior to the millennium, an email machine from her son. It was a little thing on which she could type messages to a dear friend who lived in London. She loved the ability to send him a spontaneous message and get an answer in a day or less. She was a retired reference librarian and had worked in the Bay Area school system, where an exhaustive knowledge of the Dewey Decimal System allowed her to connect students to the resources they needed. We thought she would be delighted to have an Apple Macintosh. With such a thing she could query the world of data and feed her insatiably curious mind.

This gift was a failure. Nothing about it was intuitive. Scrolling, clicking, using a mouse, returning to a previous screen, all were lessons that she had trouble learning. She would call one of us or her son when the screen inexplicably looked different than it had and she couldn't find her way back to something familiar. I think she started to die a little when she couldn't make that pretty white computer work. In retrospect she just needed the email machine.

A few years ago my father, now in his 80s, gave me an iPad which he had bought and didn't use. By guess and by golly (as my grandfather would have said) we managed to navigate its passwords and get it transferred over to me. It was cute, but certainly no more useful than a laptop, and rather delicate, so I gradually broke its screen, and then it was stolen when I was in South Sudan. On a recent trip, my father showed me his new iPad, which he also didn't use and thought I might have use for. He had been seduced into buying it at an Apple store and, likely with the help of some bright millennial, had entered a new password and the answers to some security questions. Being wise, he didn't write the password on the machine itself, since he knows how important password security is, so it is gone. Also, because he has had a long and very complex life, the answers to the security questions were subject to shifting interpretation. Although he and I contacted customer service, there is no remedy. The pretty iPad with the Retina display is now an attractive coaster or possibly something under which to press flowers. (Those of you with a penchant for problem solving will ask about "return to factory settings" or even "jailbreaking." I will just tell you that, after trying these things for two hours with someone of legendary computer cleverness, I can report that Apple has those options pretty well blocked.)

Technology, by which I mean computers of all sizes including phones and tablets and the like, offers incredible potential to people as they age. Music can fill their ears, raise their spirits and help them to frame their lives as brave and glorious. They can see pictures of far off places which they might not be able to visit again, talk to children and grandchildren while seeing their faces, access reminder notes, pay bills, review bank accounts, donate to charity, play games that tweak their brains in good ways. Computers, at their best, make our worlds larger and extend the capacity of our minds. This is just what we need as we get older. But computers, with their infernal passwords and vulnerabilities to security breaches, their little bitty buttons and sometimes tiny screens, their failing wireless modems and misleading advertising, are making the old feel older.

By the age of 85, about one third of people have Alzheimer's disease, per the Alzheimer's Association. This figure, of course, vastly underestimates the proportion of elderly people with some kind of impairment in their memory, problem solving or ability to learn new tasks. This group needs, more than we younger folk do, access to their medical records and to the wealth of online resources that can help them remain healthy or monitor diseases. It is precisely the group whose health could most benefit from computers and the internet that is left out. Almost all of the elderly patients I see in clinic decline to use the computerized patient portal. Although I think the portal itself is pretty easy to use, the many steps involved in getting to it are daunting, so much so that our older patients hesitate to even try.

In the UK in 2012, the Prime Minister issued a challenge to make the country more friendly to patients with dementia. This included a Dementia Friendly Technology Charter. The Challenge includes making workplaces and communities kinder to people as their brains age, but also helping them get some benefit from technology. There are quite a lot of technological solutions to the problems of dementia, especially for caregivers, but I don't think that producers of hardware such as computers and tablets are stepping up to the plate. It is perfectly possible to create an iPad that doesn't depend on remembering passwords and that reduces vulnerability to abuse while still allowing users to access music, video chatting, photos and information. The UK has made some headway toward dementia-friendliness. The United States has no such challenge in place, and from my vantage point, people are just becoming more marginalized as they age.

I would like to encourage the hugely successful producers of technology to respect their elders to the extent that they create products that will welcome them. The makers of software that is useful enough to become a necessity should think twice before requiring that users have excellent memories. And until our technology becomes more friendly, companies should develop remedies so that people who develop dementia (or have brain injuries) are not effectively shut out.

Sunday, October 8, 2017

How much do we love guns?

A letter written to JAMA (the Journal of the American Medical Association) by Robert Tessler MD and colleagues at the Harborview Injury Prevention and Research Center in Seattle presented evidence that the United States' approach to guns has significantly increased deaths from terrorism.

Using the Global Terrorism Database from 2002-2015, they found that, compared to Canada, Europe, Australia and New Zealand, the US has a considerably higher percentage of terrorist attacks that used firearms, and that firearm-related terrorist attacks were more deadly than any other method, including bombs. Of the 2,817 attacks in that time period, only a bit over 9% used guns, but those attacks were responsible for more than half of the fatalities.
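A rough back-of-the-envelope calculation, using only the percentages quoted above (and treating "more than half" as a lower bound), gives a sense of how much deadlier the firearm attacks were per incident:

```python
# Per-attack lethality comparison using only the figures quoted above:
# ~9% of 2,817 attacks used firearms, yet they caused at least half of all deaths.
total_attacks = 2817
firearm_share_of_attacks = 0.09   # "a bit over 9%"
firearm_share_of_deaths = 0.50    # "more than half" (taken as a lower bound)

firearm_attacks = total_attacks * firearm_share_of_attacks        # ~254 attacks
other_attacks = total_attacks * (1 - firearm_share_of_attacks)    # ~2,563 attacks

# Deaths per attack relative to the (unknown) total death count, which cancels out.
relative_lethality = (firearm_share_of_deaths / firearm_attacks) / \
                     ((1 - firearm_share_of_deaths) / other_attacks)
print(round(relative_lethality, 1))  # ~10: firearm attacks were roughly ten times deadlier per incident
```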

It's not just terrorism that is more lethal with guns. Suicide attempts are far more likely to be fatal if they are made with a gun: over 80% of suicide attempts made with a gun end in death, compared to only 1.5% of those made with drug or poison ingestion. Over half of suicides in the US involve a firearm. Suicide is the second leading cause of death for Americans between the ages of 15 and 34.

Gun ownership is considerably higher in the US than in any other country in the world: we have 112 guns for every 100 people. The runner-up is Serbia, with 58 guns per 100 people, while Tunisia has the fewest at about 1 gun per 1,000 people.

Citizens of the US appear to love their guns. Not everyone, but as a nation we are clearly very enamored. Our second amendment, standing right behind the first, which grants us free speech, reads: "A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed." In 1939 the Supreme Court interpreted this to mean that there was a collective right to bear arms, as would be required to have a state militia, and so a law making sawed-off shotguns illegal was found to be constitutional. In 2008 the Supreme Court interpreted the second amendment to mean that people have an individual right to bear arms and struck down a Washington DC law prohibiting ownership of handguns. Since that time states have expanded gun rights, including, in some, the right to carry a concealed weapon without a permit.

There are federal laws that limit gun ownership, preventing some criminals, drug abusers, spouse abusers, children and felons from obtaining them, but state laws are spotty and many people who use guns to commit crimes obtain them legally. There are classes of weapons that people are restricted from owning, based on the idea that there is no reason for a law-abiding citizen to need a machine gun or rocket launcher. The strictest laws that prohibit gun ownership in some countries would be found unconstitutional in the US, but most Americans support some sort of increased restriction on gun ownership. Taking people's guns away is neither practical nor legal, even if a majority of citizens felt it was a good idea.

People who love guns do so for various reasons. The primary, quintessentially American reason is the feeling that it is important to have some physical way to prevent our federal government from controlling a helpless population if that government ever goes over to the dark side. I'm not sure this is realistic given the very advanced weapons systems and surveillance that the military has at its disposal, but I suppose we could strategically make trouble in a guerrilla-warfare sort of way.

There are hunters who like to have rifles of various sorts for sport. There are gun enthusiasts who just think that guns are incredibly cool and love the technology. There are Civil War re-enactors who love their classic weapons. There are people who live in dangerous areas who believe that having a gun could deter an intruder. There are people who live in Alaska who very realistically know that a grizzly bear is probably watching them when they hike and may decide to eat them. There are also criminals and violent gang members who want guns so they can shoot and kill people.

For dozens of reasons, people in the US love their guns. Because of this we have lots and lots of guns and the guns get used to kill children, concert goers, rivals, wives, husbands, lovers, innocent bystanders, congressmen, police officers, the unfairly and fairly accused, newlyweds and so on. We may love our guns, but most of these deaths are intolerable tragedies. Do we really love guns so much that we are able to tolerate the over 36,000 deaths per year due to them? It appears that, since we have so many guns, people tend to use them. (Go figure.) Do we really need so many guns? We seem to have agreed that certain dangerous people should not own and carry guns. Can we just enforce those laws more effectively?

Seattle, according to an article I just read, has enacted a tax on guns and ammunition. This is a creative and constitutional way to address the sheer number of guns in circulation. The city charges $25 in tax per gun sold, plus 5 cents per round of ammunition (2 cents per round for .22 caliber or smaller). The tax has been repeatedly challenged in court and has so far stood up. We have also done this with cigarettes (which are responsible for over ten times as many deaths, though usually after protracted and ugly illnesses) with some success. Taxing guns may make ownership go down and perhaps even feed back to reduce production. It seems like a reasonable approach and could spread.
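To make the scale of the tax concrete, here is a minimal sketch of what it would add to a hypothetical purchase; the purchase itself is invented for illustration, and the rates are the ones quoted above:

```python
def seattle_gun_tax(firearms=0, rounds_standard=0, rounds_22=0):
    """Tax per the rates quoted above: $25 per firearm, $0.05 per round
    of ammunition, $0.02 per round of .22 caliber or smaller."""
    return 25.00 * firearms + 0.05 * rounds_standard + 0.02 * rounds_22

# Hypothetical purchase: one handgun and 500 rounds of standard ammunition.
print(seattle_gun_tax(firearms=1, rounds_standard=500))  # 50.0 -> $50 of tax
```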

It seems like we should step away from partisan politics where guns are concerned and decide to engage in moderation. We did that 50 years ago with cigarettes, when the surgeon general told us they caused cancer. We should do it with sugar as well, as our population becomes fatter and more diabetic. It is never easy to give up a thing we think we love that is really bad for us, but we need to ask ourselves, as a nation almost perpetually in mourning over one shooting incident or another, whether it isn't just about time.

Wednesday, October 4, 2017

Physician Burnout and Suicide

Physician burnout and physician suicide have been getting more attention in the last several years. Suicide among physicians is horribly tragic, perhaps more so because of several factors. Suicide is, in a sense, the most preventable of fatal events: to prevent it, the person killing him or herself need only not do it. To anyone who knows the victim/perpetrator it seems that if only the right words had been spoken, the right sentiment expressed, comfort offered, the death would not have happened. Among the family and friends of a person who dies by suicide, this is one of the agonies added to the pain of loss. Physicians have a huge number of close contacts, patients and coworkers, who have a pretty intimate connection with them, all of whom mourn their loss and many of whom question whether they might have had something to do with it. Besides the emotional impact of the loss, there is the very real fact that physicians are responsible for some part of the care of potentially thousands of people, who are left stranded by their abrupt departure. And there is the very sad fact that someone whose job it was to help people was unable to get the help they needed.

It is not clear that physicians commit suicide at a higher rate than people in other professions, according to a report by the CDC last summer. And although suicide is the number one cause of death among male medical residents, per a study released this year, residents' suicide rate is lower than average for their age group. Although burnout is clearly increasing among physicians, I have not seen any data showing that suicide is increasing.

I have been a witness to the kinds of stresses that lead to suicide in physician colleagues. So far, knock on wood, none of the doctors who work closely with me have committed suicide. I have, however, been around some pretty spectacular cases of burnout. According to a Medscape poll, 40-60% of physicians show signs of burnout. Surprisingly, the major complaint was the excessive bureaucratic tasks they had to do. It was not the stress of making life or death decisions but the grinding demands of the computer, the paperwork, satisfying insurance companies, convincing organizations that monitor quality that they were delivering it. Other frequently mentioned complaints included extended work hours and feeling like just a "cog in a wheel." My experience is that it takes more than a bad job to push a person over the edge, though. But life is pretty good at offering that little bit more. The breakdown of a marriage, a child with troubles, an illness can take a person who is competently holding on with her fingernails and plunge her into failure. Alcohol and drugs provide respite and destroy that last pretense of being able to do the work. The colleagues I've seen go through this usually step away from practice and may or may not return.

My worst times came early in my career. During my first year in medical school, I comforted myself with the thought that if things got too bad I could just jump out of the tenth-story window of my dorm. After a while I replaced that with deciding that I would just go live with my sister and cook for her. The first year was bad because there was just too much stuff to learn, and if I stuffed my head full of it, as I needed to if I was going to pass my tests, I couldn't sleep. If I couldn't sleep I couldn't stuff more information into my head, so I walked around gripped by fear of failing. Occasionally I was distracted from my misery by some of my really excellent teachers, and I was eventually saved by a prescription for sleeping pills. These I hoarded and doled out by the fragment so I wouldn't have to ask for more. A boyfriend and increasingly close friendships made the second year almost imperceptibly better. By the third year the opportunity to interact with real patients and be of use cured me. Training continued to be stressful, but there was always something rewarding that came back to me from grateful patients or collegial professors, which gave me the joy I needed to make the process sustainable.

After completing my residency, I took some time off to find the right job. The man who would eventually become my husband and I got a house and a big yellow dog. The position I finally found was good, though demanding, and I enjoyed learning from other physicians at my work who had different skill sets than I did. I was able to keep up and felt I did a good job. Burnout threatened when my workload increased and I felt like I couldn't keep up. There was always more that I needed to do at work, but home needed me too. Having a baby actually helped, because the woman we hired to help take care of her was wonderful and made me feel like home was well taken care of.

Six years ago I transitioned from a pretty sustainable to a very sustainable lifestyle, doing shift work as a hospitalist. My children have fledged and I no longer need to help them with their homework after work or worry about childcare if they get sick. I still do some outpatient medicine, but have not been sucked into the complexity of documenting for merit-based payment or pay-for-performance systems. I did go through the growing pains of adopting several computerized health records, both inpatient and outpatient, and have experienced firsthand how that can make everything seem impossible.

I can see that in a clinic system where an employer was pushing the physician to see more patients in an hour and patients were pushing back to get what they need, administrative tasks could be a big part of burnout. The recipe, I think, for burning out is one cup of impossible and maybe conflicting demands and several tablespoons of feeling like something terrible will happen if you don't meet those demands. When the demands are from both home and work, things get pretty grim pretty fast. If the work is not rewarding, as it would tend not to be when you can't do it properly, then there is no joy to counteract the stress.

Medical offices and hospitals right now are in a time of transition, which makes things particularly bad. We are moving toward making computers do the work that humans find tedious, but the interaction of computers and people is still awkward. We end up doing lots of the work that the computers eventually will be able to do themselves, keeping track of nearly endless and very complex data, remembering schedules invented and tweaked by organizations charged with optimal care for chronic diseases. We are wrestling with computers instead of doing the human job of reading people and helping them solve their problems.

It is not entirely our jobs that lead us to the brink of suicide and beyond. We are humans, with sadness and stories and connections that can be difficult or even crushing. But we can make the job part of this much easier. We need to allow computers to do what they do best and have doctors do doctoring. We need to figure out how to unhook a doctor's monetary compensation from how many patients he or she sees, so we can keep those patients healthy and at home where they belong, rather than in our offices and hospitals. We need to not take on more than we can do well, even if that means saying "no" to the person who writes our paychecks.

Sunday, August 20, 2017

How a pocket sized ultrasound pays for itself--every week

I bought a pocket ultrasound in 2011, determined to learn how to perform and interpret ultrasound at the bedside and thus transform my internal medicine practice. I bought it new and it cost over $8000. That was a staggering amount of money to spend on something I knew very little about. In 2015, after having performed many thousands of ultrasound exams with my little GE Vscan with its phased array transducer, I replaced it with the new model, which had a dual transducer, one side for deep structures and one for superficial structures such as bones and blood vessels. It cost around $10,000. This was an even more staggering amount of money, but more of a sure thing. I knew that it made a difference and that the cost of the machine was a very small portion of the benefit I would get from using it.

Since I bought the new machine, GE has come out with an even fancier model that is just a wee bit faster and has internet connectivity and a touch screen. Because everyone needs the newest thing, earlier models like mine are much more reasonably priced. Without even bargaining, the first machine I bought is available on eBay for many thousands of dollars less than I paid. I am not trying to sell Vscans. In fact, Philips has a very lightweight tablet model that gives even better pictures than mine, and Sonosite has the iVIZ, which also has gorgeous images. These machines are not yet inexpensive, but some day they will be. There are bluetooth transducers that interface with tablets. There are very small Chinese machines that are quite inexpensive, but I haven't played with them and can't vouch for their quality.

I think of my Vscan as an $8000 machine. Now it's more like a $6000 machine per eBay, but it still isn't a small expenditure. I like to believe that it's worth it. Since a day in the hospital in the US costs about $2500, when I avoid 3 hospital days by doing ultrasound I consider the machine paid for. Every time using it saves someone's life, I consider that it paid for itself several times over. In the small picture, I don't actually get that money, but in the big picture I do, since all healthcare dollars come out of the same pot eventually.
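Put as simple arithmetic, using the ballpark figures above (actual charges and savings vary from hospital to hospital):

```python
# Break-even calculation with the ballpark numbers from this post.
machine_cost = 8000            # roughly what I paid new; ~$6000 used per eBay
cost_per_hospital_day = 2500   # rough average cost of a US hospital day

days_to_break_even = machine_cost / cost_per_hospital_day
print(round(days_to_break_even, 1))  # 3.2 -- about three avoided hospital days pays for the machine
```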

Here are the ways bedside ultrasound paid for itself this week:

1. A 45 year old man was admitted with alcoholic hepatitis on top of known cirrhosis. He starts to improve but his abdomen is painfully large and so he is sent by my colleague for a paracentesis, to have the fluid in his abdomen drained. They are able to remove a liter of fluid but a couple of days later he is feeling full again and wants the procedure repeated. I look at his abdomen with my bedside machine and am able to reassure him that there is very little fluid to drain and that his discomfort is caused by his huge liver which will gradually return to a more normal size if he stays off alcohol. One procedure and one hospital day saved.

2. A 90 year old woman whose small bowel obstruction has resolved is ready to go home. I notice that she is a little bit short of breath and I wonder if she has developed congestive heart failure. Her lung exam shows some crackles. I ultrasound her lungs and find that she has just a few "B lines" (indicative of wetness of the lung tissue) in the lower right lung, most consistent with the mild changes often present when a person has been at bedrest. She can go home. She is happy. One hospital day saved.

3. A 50 year old man is recovering from surgery for a perforated colon. He has developed abdominal distension and pain. The surgeon orders a CT scan with oral contrast. The patient is sitting up in bed with a bottle of contrast solution beside him. He is very unhappy. He can't imagine drinking the 500 ml of liquid and feels he might vomit it. I ultrasound his abdomen and find that his stomach is huge and fluid filled and his intestines are swollen and completely full of fluid, filling his abdominal cavity. With this information the surgeon, radiologist and I come to the consensus that having him drink the contrast medium will be useless since it will go nowhere, and what he really needs is a nasogastric tube to drain his stomach and small intestine. The patient is spared the bad things that might have occurred had we attempted to add more fluid to a tense water balloon and appropriate therapy is not delayed. Monetary value=hard to say.

4. A 60 year old man is in the hospital after a hip fracture. He is on many pills for pain and for blood pressure which have been re-started after his hip surgery. I am called to the bedside because his blood pressure is very low and he won't respond. Bedside ultrasound shows that his heart, lungs and abdomen are all normal, with no evidence of a heart attack or a blood clot to the lung. His inferior vena cava, which brings blood to his heart from the lower part of his body, is so small that it is invisible. He responds well to a liter of IV fluid and a little bit of oxygen and is sitting up eating dinner a couple of hours later. Ultrasound allowed me to rule out complications that would have required further testing or intensive care. In retrospect, he had very little money and no way to pay for most of his medications, so he had not been taking all the pills on his list. The many sedatives and blood pressure pills hit him hard. Besides avoiding an intensive care unit transfer and complex testing, he was also able to be discharged the following day, since he felt fine on fewer pills.

It's not just the money. (Though, in my experience, it does save money.) Knowing more about what's going on by way of bedside ultrasound allows for more appropriate and compassionate care. It's also much more gratifying to a doctor than guessing.

Thursday, August 17, 2017

The demise of the lecture--the rise of real education?

Today in the New England Journal of Medicine I read an editorial that discussed how lectures are being phased out in medical school education. I was, at first, a little bit appalled. Why would they eliminate an educational method that worked so well for me and my generation of doctors?

Or did it? I actually remember only a few things from lectures now, and none of them supports the idea that lectures were an effective way of teaching. I remember vividly how I would fall asleep and write progressively more poetic and less linear notes in my binder. How I would startle myself awake, sending heavy textbooks flying into the air. I remember the time the professor showed us the structure of vitamin B12 and I considered learning it, just for grins, and decided not to. I remember formulating questions for the lecturer that would display such minuscule understanding of the material that he or she would actually understand how deeply we students had been left in the dust. But I don't remember learning anything. I'm sure I did, at least eventually, when I highlighted and rewrote my lecture notes and read the material in the book. I'm just not sure lectures were a good use of my time, or of that of the eminent scientists and clinicians who were trying to teach us.

I do remember learning things in the laboratory. I remember learning about diphtheria as we carefully sucked virulent Corynebacterium diphtheriae into glass pipettes to examine it under the microscope. I remember using machines to understand sine waves and the concept of gain in order to learn how monitoring of vital signs could go wrong. I remember working with a group of four medical students to dissect a human body, and how I worked with my professor-attending to reveal obscure diagnoses of real people. I particularly remember how a classmate and I decided to each learn half the material in a certain class really well and teach it to the other, creating typed handouts with jokes and cartoons and completely acing the essay exam on that subject.

What particularly bothered me about the idea of getting rid of lectures was the thought that students would have no structure to their learning, that they would just bop around aimlessly trying to absorb the vast body of medical science. Reading on, however, I realized that what is intended to replace the lecture is shorter and smaller doses of facts interspersed with questions, group work and cases that integrate the facts with problem solving. Medical students will still need to get up in the morning and come together in classes, but the classes will be different. The author mentions that students who hear an eloquently presented lecture may feel that they understand the material, but on further questioning realize that they have only a very superficial grasp. This is intuitively true, and I know I have seen it, meaning that even the most clearly delivered lecture probably isn't very useful from a practical standpoint.

A few years ago I attended a talk about how to give a talk. In the talk the speaker said that most people remember only 1 (or is it 3?) things from a lecture. I also remember that he said to practice in front of a mirror which I tried but will never do. I don't remember what else he said, except that he thought Steve Jobs gave a great talk. He was definitely right about the number of things most people remember, though I don't quite remember what he said.

The conclusion of the article about saying goodbye to lectures was that they really are going away, at least in their long, fact-filled, monologuing glory. Good teaching may still involve a speaker and a large group of listeners, but it will include shorter and more easily absorbed facts interspersed with questions to ascertain understanding.

New methods of learning are based not only on the fact that humans have limitations in their ability to absorb information, but also on the exponentially increasing amount of it as communication and technology co-evolve to deepen our potential understanding of the natural world. It is no longer practical to expect a person to keep an adequate body of knowledge for practicing medicine in his or her brain. A couple of teachers of bedside ultrasound, Mike Mallin and Matt Dawson, spoke about "just in time" rather than "just in case" learning at a meeting a few years ago, arguing that we remember and learn things better when we access the information at a time when it is relevant. They created a phone app called "1 minute ultrasound" which gives a person just the information they need to perform a bedside ultrasound exam right before they go into a patient's room. "Just in time" learning. I know that I would never have remembered the basic science behind Acute Intermittent Porphyria had I not had a patient suffering from it who needed me to mix up an ink-black orphan drug to abort her painful episode. In fact, the disease was so complex and obscure that I had sworn NOT to learn about it, since I would likely never use the information in practice.

Not all learning can happen "just in time," since a certain knowledge base is necessary to filter the information a patient provides in order to be thinking in the right general area. Also, some emergency conditions require immediate action, though I'm often surprised how easy it is to brush up on a condition using my cell phone, even in dire situations. Fifteen years ago a fourth-year medical student pulled out a Palm Pilot when a patient asked about a drug interaction; even as I was promising to check a reference, she had the answer. I am eternally grateful for that first introduction to a peripheral brain that expanded my own. Now I have volumes of updated information on any condition known to man in my pocket.

I know that we will cling to the lecture for many years, in medicine and in other learning situations. Big changes happen slowly. As I partake of lectures I will appreciate the art and the effort that goes into their creation and sense that they are a noble tradition. I will try to learn more than 1 (or is it 3?) things from each one, but I won't beat myself up when I don't. To the lecture as a tool for learning or teaching, though, I may be about ready to say "Goodbye."


Sunday, July 2, 2017

Agreement and division--the American Health Care Act and what we all want

It's been hard to be a concerned American citizen lately. We are facing huge problems which will become larger in our lifetimes, including the need to take care of our increasing global population and the medical complexity of taking care of people who are becoming older and sicker. There is global climate change, which is hard for all but the most stalwart of partisans to ignore. There is an increasing gap between rich and poor in our nation and in many others, which places the rich and powerful at odds with the much more numerous and therefore potentially powerful poor.

To help guide us through these challenges we have a government so deeply divided on democrat/republican party lines that it is mostly unable to do anything creative at all. And we all pay them lots of money to be dysfunctional.

I have been following the activities surrounding repealing and replacing the Affordable Care Act. The ACA (Obamacare) was passed without a single republican "yes" vote in the senate. The American Health Care Act (AHCA, or Trumpcare), if it passes, will do so without a single democrat voting for it. It has been difficult to write because it has to please all republicans, including those who feel that healthcare should just take care of itself through the free market and who would happily get rid of any federal subsidies. There does not appear to have been any attempt to make the bill palatable to democrats or even to relatively conservative healthcare organizations such as the AMA. The most recent iteration abolishes taxes on investment income, which is effectively a tax cut for the rich. It takes away all federal money from providers of abortions, even if the vast majority of what they do prevents abortions, and so may de-fund Planned Parenthood. It offers block grants for Medicaid instead of paying a percentage of Medicaid costs, which leaves states either to pay more for the program or to cut services if medical prices rise faster than the consumer price index (which they have done historically). It reduces subsidies for insurance for many people who are poor, which means that many of them will drop health insurance they can no longer afford.

The AHCA, as it was written, also would have provided some subsidies for insurance companies, which have lost money under the ACA and many of which have either withdrawn from exchanges or increased their rates. In Idaho, I read in our local newspaper, insurance costs are set to increase by 22% this year, which will be very painful for many people. The insurance companies have been hit hard since passage of the ACA because a republican-dominated congress did not appropriate the money promised to them in case of shortfalls. People buying health insurance through the exchanges may already be priced out of it, even if nothing is done to "repeal and replace" the ACA. Not only will this leave more people uninsured, but rising health insurance costs affect all businesses that are required to buy insurance for their workers, which will either impact their employees' paychecks or even cause the businesses to fail.

The ACA, our present health care system, is like a house whose roof is leaking, and has been leaking for a while. Instead of fixing the roof in the first place, we are now wrangling about how to build a new and crappier house. If we don't either fix the roof (which is vanishingly unlikely with a republican-held legislature) or build the new crappy house, we will all be shivering in the corners pretty soon.

But there has been a bright spot in my thoughts about the future. I have been reading Srdja Popovic's book Blueprint for Revolution. He was a member of the group Otpor!, which was partly responsible for mobilizing the people of Serbia to oust their dictator Slobodan Milosevic. He talks about some of the ways that people can work together to get big things done. The most important step is to find out which issues virtually everyone agrees about and to move on those. Another is to maintain a sense of fun and positivity, because that is what feeds people and helps them stay active.

Our communication via the internet, with a new addiction among some of us to reading what we think is "the news," has been both good and bad. One consequence is that the economy of the internet, driven by ads that translate into real money and resources, pushes conflict. There are natural conflicts, but increasingly we are pulled in by more petty ones. People who basically agree, sharing a political party and a vast number of values, enter Twitter or Facebook wars over smaller points and end up mortal enemies. This is exactly how you get more clicks on your comment or your news story, and not at all how you unite to make good things happen.

There are many things that a majority of Americans agree upon. We want to be paid fairly for our work. We want our children to grow up safe and responsible and useful. We want to breathe clean air and have healthy food to eat. We want adequate health care that doesn't stress us financially. We enjoy beauty. We want to end the divisiveness that creates inefficiency in our government so we can further our shared values.

It is likely that if there were leaders who stood up and insisted on ending divisiveness in government, they would have followers of all kinds who would come out in force. Democrats and republicans, churched and unchurched, black, white and other rainbow colors of people would be willing to march in the streets or sit down to a picnic together.

In a congress that was not divided along party lines a healthcare bill could be designed that would serve most of our needs. Legislators who populate the fringe would have to convince others of the wisdom of their ideas, but they would not control outcomes as they do now. Bernie Sanders just sent a letter to me and his 50 million other best friends and suggested "Medicare for All" as an option. This will never pass in a divided congress, but might just gain traction if combined with cost saving ideas that would make it palatable to republicans.

In our present political environment I do not know what to do about the AHCA. The progressive organizations that contact me daily by email urge me to write letters and make calls to my congressmen to oppose it. But I don't know that we have any option at this point other than a bill that, if left unchanged, will have the long-term consequence of reducing health care for vulnerable populations. Left unfixed, the ACA is going to have some of the same problems, with bloated but cash-strapped insurance companies pricing many people out of the market. If the AHCA is terrible, maybe we will get more substantial improvements as people stand up together to insist that they get what they need. Two states (California and Nevada) have already begun the process of assuring their people adequate health care. We need more action like this.

Most of all we need to realize that we are all in this together and that we agree on many more things than we disagree on. The ways in which we disagree are important. Debate, change and consensus-making are a valuable use of our energy, but right now we need to also pull together and gently but forcefully insist that our government do the same.

I recognize and respect people who say that Mr. Trump, our frighteningly incompetent president, should not be "normalized" by cooperation. I do not trust that the election which put him in that position represented the wishes of the American people. But deep divisions and lack of cooperation preceded his presidency and brought us to where we all are. It is time that we all, as citizens, begin to visualize what we all want rather than feel complacent in our resistance.

Tuesday, June 20, 2017

Should a type 2 diabetic monitor blood sugars? Maybe not!

Today in JAMA (the Journal of the American Medical Association) I read that a group out of the University of North Carolina had actually done a randomized study of whether non-insulin-treated type 2 diabetics (usually the adult-onset ones) achieve better control of their blood sugars if they do a finger-stick test of their blood glucose daily. It turns out that they do not. Blood sugars were no better in the group of patients who monitored their blood sugar once daily than in patients who did not monitor it at all. Combining the blood sugar testing with an automatic message from the machine telling patients how to interpret that blood sugar did not improve control either.

Since 75% of patients with type 2 diabetes are estimated to check their blood sugar, there are over 29 million Americans with type 2 diabetes, and blood sugar monitoring is moderately expensive (though less so than it used to be), not checking blood sugars could save billions of dollars a year. But that's not all. The energy used to focus on those numbers, by patients, doctors and nurses, could be focused on something that might actually matter, like increasing physical exercise or eating a more healthy diet.
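As a rough illustration of where "billions" comes from: the patient numbers are the ones cited above, but the per-strip price is my own assumption for illustration, and real prices vary widely.

```python
# Rough annual cost of routine once-daily testing among type 2 diabetics.
patients_with_type2 = 29_000_000   # "over 29 million Americans" with type 2 diabetes
fraction_who_test = 0.75           # ~75% estimated to check their blood sugar
cost_per_test_strip = 0.50         # assumed average price per strip (illustrative only)

annual_cost = patients_with_type2 * fraction_who_test * cost_per_test_strip * 365
print(f"${annual_cost / 1e9:.1f} billion per year")  # ~$4.0 billion per year
```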

To be absolutely clear, this information does not apply to all diabetics. Insulin-dependent (type 1) diabetics, who usually get their disease as children and absolutely require insulin to survive, do need to check their sugars. For those patients it's vital to know the blood sugar so that an appropriate amount of insulin can be administered to keep sugars as close to normal as possible. Even type 2 diabetics who use insulin often need to know their blood sugar levels in order to adjust their insulin dosages. Some type 2 diabetics take medication plus a regular dose of long-acting insulin, and it would be interesting to know whether they, too, could forgo testing.

Checking blood sugars is not simple, though it is a procedure that most people learn pretty quickly. It involves pricking the finger with a lancet to draw a drop of blood, placing the blood on a paper or plastic strip which is then read by a little machine which displays a number. There are talking machines for patients who are blind, there are machines with fancy functions, expensive machines, cheap machines...You can buy a machine without a prescription at places like Walmart and even buy the test strips over the counter now. It is, however, just one more thing to fit into a busy day and the numbers can make a person feel like a failure if they are high. The monitors require a certain amount of maintenance and sometimes malfunction, leading a person to make unnecessary adjustments or phone calls to health care providers.

This study does have some caveats. Many of the patients in the group that did not test blood sugars had been testing their blood sugars already, so it is possible that they had already gotten valuable information from testing. The patients were told to check their blood sugars once daily; it could be that testing more frequently would have given better information and been more effective. For instance, if patients didn't know that a lunch of yogurt and a ham sandwich led to a higher blood sugar in the evening than a lunch of soup and salad, they might not change their diet appropriately.

Despite these issues, this study does indicate that we can safely allow many of our type 2 diabetics to stop routine monitoring. Previous studies have alluded to this, and many physicians are already backing away from badgering patients with type 2 diabetes to check their blood sugars. Nevertheless it remains common, and it is one way a patient might misallocate time away from something active and directly beneficial to their health. It is probably time to allow many of our patients to relegate that blood-smeared glucose meter to the back of the bathroom cabinet.