Monday, March 1, 2010

Health IT Safety Hearing Transcript

Certification/Adoption Workgroup Testimony

Draft Transcript

February 25, 2010


Presentation


Judy Sparrow – Office of the National Coordinator – Executive Director
Good morning. Welcome, everybody, to the HIT Policy Committee‘s Certification/Adoption Workgroup hearing on health information technology safety. Just a reminder that this is a federal advisory committee. This is being broadcast to the public. There will be opportunity at the close of the meeting for the public to make comments, and minutes of the meeting will be posted on the ONC Web site. Committee members, please remember to identify yourselves when speaking. With that, we‘ll go around the table, and the committee members who are present here in the room will introduce themselves beginning on my right.


Jodi Daniel – ONC – Director Office of Policy & Research
Jodi Daniel, ONC.


Carl Dvorak – Epic Systems – EVP
Carl Dvorak, Epic Systems.


Scott White – 1199 SEIU – Assistant Director & Technology Project Director
Scott White, 1199 SEIU.


Adam Clark – Lance Armstrong Foundation – Director for Health Policy
Adam Clark, Live Strong.


Joseph Heyman – AMA – Board Chairman
I‘m Joe Heyman. I‘m a gynecologist from Massachusetts.


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Paul Tang, Palo Alto Medical Foundation.


Paul Egerman – eScription – CEO
Paul Egerman, software entrepreneur.


Marc Probst – Intermountain Healthcare – CIO
Marc Probst with Intermountain Healthcare.


Latanya Sweeney – Laboratory for International Data Privacy – Director
Latanya Sweeney, Harvard and MIT.


Joan Ash – Oregon Health & Science University – Associate Professor
Joan Ash, Oregon Health & Science University.


Judy Sparrow – Office of the National Coordinator – Executive Director
And I believe we have a number of workgroup members on the telephone. Could you please identify yourselves? They might not have dialed in yet. With that, I'll turn it over to Paul Egerman and Marc Probst.


Marc Probst – Intermountain Healthcare – CIO
Okay. Thank you, and thanks for being here. I understand the weather in the Northeast is not so good, so I don‘t know how that‘s impacting some of the folks that are trying to be here. But hopefully they‘ll be able to travel all right. For what it‘s worth, I‘m from Salt Lake City, Utah, and we‘d just as soon have that weather out where we are, as probably Vancouver would.


I understand that Dr. Blumenthal will be here. We‘re hoping to have that around 9:30 or so, and so he‘ll be with us today, and we look forward to that.


This issue was brought up several weeks ago when we started talking about HIT safety within the policy committee, and we were asked at the adoption and certification workgroup to hold this hearing today, and to get a better understanding of the issues that surround HIT and safety. We are very appreciative. This came together quite quickly, and we‘re very appreciative to the people that have given their time, that have traveled here, to give us what we think is just going to be excellent testimony today. Again, thank you for being here.


I think it‘s a very interesting topic, and look forward to the hearings today. That will give us an ability as a workgroup then to work through some recommendations as to what we want to take to the HIT Policy Committee and then to ONC regarding this issue of HIT safety. Paul Egerman had a few logistics that he wanted to get through, and then we ought to be moving into our panels.


Paul Egerman – eScription – CEO
Good morning. I‘m Paul Egerman. First, I want to welcome the members of the public who braved the elements to be here, and also the members of the public who are braving the elements and are listening on the telephone or on the Internet. We very much appreciate your participation.

To describe the logistics quickly, we will have three panels presenting. Our very first panel will really be giving us an overview, identifying the issues involved with patient safety concerns. Then the second panel is what's called stakeholders, which is people who will be looking at this from a number of different perspectives. The third panel is called possible approaches.


The way the hearing works is each of the presenters will be presenting for about five minutes, and they have already submitted to us some material in advance that the people here have read through very carefully. They knew we were going to have a discussion, a question and answer period. At the end of the entire day, a little bit before 3:00, I think it‘s 2:45 in the agenda, then we will have a period of time for public comments, and so that‘ll be a time for the people here in the room and the people who are listening on the Internet or on the telephone to also make whatever comments they want, and those comments are very much encouraged.


I also want to make sure that we explain to everybody who we are. We are a workgroup of the HIT Policy Committee, so the policy committee was established by ARRA, the legislation that some people call the stimulus legislation, and really what our function is, is to make recommendations to the National Coordinator, David Blumenthal. So we‘re a workgroup of that group, and so what will be happening here is we will be listening to the hearing, the presentations that are made today. We will be listening to the public comments that are made, and then the workgroup itself will then have some discussions in terms of what comments or recommendations we want to make. And those discussions will also be done in the public, and so that will be published.


I think it's March 12th when we're scheduled, but the time will be published, and that's a telephone conference call. People can listen in to that if they want. Then there's a policy committee meeting, which I believe is March 17th, at which point we will present to the policy committee whatever comments we have. If we have a consensus on any recommendations, we will present them, and there will also be a public discussion of those comments at that time. That's basically the process. Again, I welcome everybody. It's an exciting issue.


To get started, our very first panel is called "Identifying the Issues," and so I'd ask the three panelists who are here to step forward onto the—


Judy Sparrow – Office of the National Coordinator – Executive Director
Rick Chapman is on the phone.


Paul Egerman – eScription – CEO
Yes, and Rick Chapman is on the phone. Okay. Terrific. If we could ask the three panelists, Ross Koppel, David Classen, and Alan Morris to come forward. Rick Chapman, who is the CIO of Kindred Health, is on the telephone. He is one of the people who, because of weather, was unable to make it here. Rick, do you have any introductory comments about the first panel?


Rick Chapman – Kindred Healthcare – Chief Administrative Officer/CIO/EVP
Yes, I would, and thank you very much. Which panelist did you say is not present?


Paul Egerman – eScription – CEO
Gil Kuperman was unable to make it due to illness, which has, I understand, nothing to do with the computer system.


Rick Chapman – Kindred Healthcare – Chief Administrative Officer/CIO/EVP
All right. Great. Sorry for not being able to make it in person today. We have a very good panel assembled today for the first topic, identifying the issues, as Paul has said. We did ask the first panel to kick off today's meeting by identifying the issues, and we asked them six questions that we'd like them to address, and they have done so in the papers that were submitted. Those were: What are patient safety risks that may be introduced inadvertently through the use of electronic health records or other HIT products? Are there specific types of risks that are more common than others? What are the causes of these risks? What are the ways to prevent and mitigate these risks? How would you weigh the benefits and risks of using EHRs in patient care? How might data on risk best be identified as greater HIT adoption occurs? And what are the factors that might impact an organization from reporting adverse events or known concerns about HIT products?


We are fortunate today to have three of the four very distinguished panelists. The first is Dr. Ross Koppel from the sociology department and the graduate school of medicine at the University of Pennsylvania. Dr. Koppel is on the faculty there, and he's a principal investigator at the Center for Clinical Epidemiology as well. Professor Koppel's work in medical informatics reflects his 40-year career as a researcher and professor of the sociology of work and organizations, statistics, ethnographic research, survey research, and medical sociology. And he has been the principal investigator of Penn's study on hospital workplace culture and medication errors.


Also with us today is Dr. David Classen, who is a senior partner at CSC and leads CSC's safety and quality of healthcare initiatives. Dr. Classen is also an associate professor of medicine at the University of Utah and an active practicing consultant in infectious diseases at the University of Utah School of Medicine in Salt Lake City.


And the third panelist that we have today is Dr. Alan Morris, who is a professor of medicine and adjunct professor of medical informatics at the University of Utah, and director of research and associate medical director of pulmonary function and blood gas laboratories at the LDS Hospital in Salt Lake City, Utah.


I would remind the panelists, as Paul mentioned, that we would like to limit comments to five to seven minutes, so others can ask questions as we proceed. With that, I'll turn it back over to Paul, and I assume we're going to ask Dr. Koppel to begin.

Paul Egerman – eScription – CEO
Yes. Thank you very much, Rick. As Rick said, we asked the panel to be mindful of the timer that you see on the lower right-hand corner of the screen, so Ross.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Thank you. Good morning. Thank you for inviting me, and thank you for holding this committee meeting on this topic. I believe that America should commit itself to using EHRs and other forms of HIT, and I think we can do it within the decade if, and I would argue only if, we're willing to systematically analyze HIT's problems in addition to their benefits. We must apply a scientific approach to informatics and usability analysis; faith-based EHRs will not get us into heaven.


Like everyone else, I want HIT to increase patient safety, care efficiency, treatment quality, savings, and drug-ordering guidance. I want HIT to provide coherent structures for test results and other data, and I want it to provide better visualization of complex clinical data. Unlike many of my colleagues here who are HIT scholars and advocates, however, perhaps because of my training, I've studied the surveys that have been used to guide and explain the current HIT strategy.


These surveys explored why doctors and hospitals have not embraced HIT's benefits. The findings pointed to the cost of HIT and to physicians' resistance. They called them technophobic, hidebound, and, perhaps most gruesome, too old. They also talked about overwhelmed hospital IT staff and other user pathologies or user inadequacies.


Now, the years of research on HIT suggested that those answers could not be complete. They weren't right. To understand why the surveys found this, I investigated the questions that were asked of the doctors and hospitals. The only answer options dealt with the problems of hospitals and doctors. In other words, the surveys found only what they asked about: physician difficulties, doctor difficulties. They didn't look for any other options, and they didn't even offer answer options that would surface the following issues.


So they could have asked, does HIT slow or speed your clinical work? Are EHR data presented in helpful ways, or do they generate unnecessary cognitive burdens because, for example, the data that should be contiguous are in five separate screens, where you‘re scrolling across vast wastelands of rows and columns looking for the needed information. How many information displays are understandable or are disarticulated, confusing, or missing key data? Does HIT distract from your patient care or improve it? And the last one I‘ll ask, although I have about 50 others, how responsive are HIT vendors to acknowledging and repairing defects? How quickly are these defects repaired?


The absence of the relevant questions diverts us from understanding the actual HIT needs of clinicians and patients. Now I'm certain that the people who asked these questions, who designed these surveys, were not intentionally deceptive. Well, I'm certain most were not intentionally deceptive. But the restricted options reflect a series of assumptions or … that says HIT is intrinsically beneficial. Anything that encourages HIT is good for patient safety. Anything that discourages it or retards it is bad by definition. With that framework, let's look at your questions.

The first one talks about patient safety risks. Well, in my work, I've presented about 80 specific patient safety risks related to HIT. These include inability to see the list of current medications; abnormal lab results hiding in sheep's clothing, not being flagged as abnormal; drug-canceling routines that triple the dosage; and CPOE orders that appear on the eMAR, what the nurse sees, as either not there or strangely twisted. If you force me to choose, you know, general categories, I'd say poor usability; a primary focus on back-office functions, on accounting functions, which are the bones of most store-bought HIT software, to the detriment of the clinical functions. In other words, we've grafted a clinical function onto what is basically VisiCalc (for those of you too young to know what that is, I'm sorry). And, finally, clinicians' inability to obtain constructive responses to reports of defects.


On the usability issues, I'm reminded of the human factors professor Stephanie Guerlain, who, when she examined a whole bunch of EHRs, said, you know, if HIT vendors designed a car, you'd have to click on the speedometer icon to see how fast you were going. That's Professor Guerlain, a professor of human factors.


What are the causes? The first two would be the rush to market and the locked-in customer syndrome. If you buy a toaster and it doesn't work, you throw it out. If you buy a $70 million to $100 million piece of software, and then you spend five times that amount installing it, implementing it, you're wed to it for years. You can't change. The high cost of switching means that getting to market is critical for the vendors, and it means that they have you locked in. The extraordinary switching costs mean that there's almost no motivation for the vendors to fix problems or to update the installed base. I know a CMIO who calls his hospital HIT-incarcerated by the vendor.


I‘ve only got a few minutes, so let me give some specific examples. Doses requiring hand conversions of units because the CPOE matches neither the EMAR nor the pharmacy, even though they‘re both from the same vendors. When entering data into the EHR, the physician clicks no allergies, then she sees no known allergies. Maybe she‘s had some training in epistemology, so she clicks that too. The next doctor sees multiple allergies, which everyone knows means no allergies. Well, it turns out that there are patients with multiple allergies, so everybody then sees multiple allergies, which they know means no allergies, and they ignore it, and that‘s not good for the next patient….


Paul Egerman – eScription – CEO
Excuse me, Dr. Koppel, but….


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Yes. Okay. CPOE systems accept doses, units, or drip rates that don't exist or are 1,000 times too high. For all drugs that involve two elements, which are about 25%, the orders to outpatient pharmacies became completely garbled. Sometimes it was word salad. Sometimes it was a dose 70 times what it should have been. In all cases, the vendors responded by either saying, golly, we haven't heard of that one before, where's Jim, or it was the user's fault. Eventually they said, we can fix it, but it'll take several months, or it'll come in the next upgrade. In the last case, the CMIO taped papers with the names of about 300 multi-element drugs to every screen in his or her hospital.


So what can we do? We need to stop the nondisclosure and confidentiality clauses that prohibit doctors from talking with other doctors about problems. We need openness and responsiveness when people report defects. Right now we have a culture of silence. We don't talk about errors. We don't learn from them. We're used to dealing with workarounds. We're used to defining them as user errors or as problems of insufficient training. You can't train your way out of it on the basis of the existing software.


My summary: HIT is not like any other device. It's not a tool. It's the living core of the organization. It's the information ecosystem. The reason the best EHRs are homegrown, like the Marshfield Clinic's here, is that they evolve with the organization. To improve out-of-the-box EHRs, organizations need systematic evaluation, and vendors must listen and respond to the clinicians. Analysis of HIT use shows us it's not the doctors who are resistant to that evolution; it's those calling them resistant. Thank you.


Paul Egerman – eScription – CEO
Thank you. Dr. Classen.


David Classen – University of Utah – Associate Professor
Great. I just want to thank the committee for inviting me. It‘s a pleasure and honor to be here. And I also want to thank the committee for focusing on this very important area.


Included in my testimony are a series of embedded studies, and I know some people have had trouble opening them within the document, so I have given Judy a zip file of all the studies, as they are cited exactly in the document. If you have trouble opening them, Judy will be posting, I think, Judy, very shortly, the zip file of all of them.


Obviously the safety of health information technology systems, as designed, implemented, and operated, in the complex environment of healthcare is of paramount importance. Whereas other industries such as aviation have extensive monitoring and oversight of the safety of these complex systems, including aviation IT systems, healthcare, as yet, does not have this level of oversight or monitoring. As such, safety problems have developed in HIT systems in healthcare that have raised serious concerns.

These problems have been noted not only in blood banks, but in other areas as well. Clearly these problems can occur in healthcare delivery organizations, as demonstrated by this study from the Annals of Internal Medicine from Clem McDonald on a serious barcoding problem that led to a near miss: a patient was almost overdosed with insulin. And I think Clem McDonald's conclusion probably bears a lot of relevance to our focus today. He said, this near miss shows that computer systems, although having the potential to improve safety, may create new kinds of errors if not accompanied by well-designed, well-implemented crosscheck processes and a culture of safety. Moreover, and I think this was a very important observation, computer systems may have the pernicious effect of weakening human vigilance, removing an important safety protection.


HIT systems clearly can prevent safety problems, as in the study cited here from LDS Hospital where a reduction in adverse drug events on the order of 70% was accomplished with a complex clinical decision support system. And that, as yet, is the largest reduction yet seen, but that was the result of a very complex EHR with very carefully developed clinical decision support on top of it.


The problem, and the great challenge, is that just putting these systems in place does not guarantee that kind of impact on safety. And I think that's really nicely demonstrated by the next article cited, by Jonathan Nebeker, which showed that in a hospital with a completely implemented EHR, with completely electronic dispensing and barcoding for administration, with all that in place, there was still a 25% incidence of adverse drug events in the patients in that hospital.


That study traced the problem, and the failure to improve safety in that setting, to the lack of effective use of clinical decision support. And this raised the concern that many organizations that might have implemented EHRs may not have seen the improved safety of medication use that was documented at LDS Hospital. To evaluate this concern, AHRQ, the Robert Wood Johnson Foundation, and the California HealthCare Foundation have funded the development of an EHR flight simulator that's been used in hundreds of hospitals in the United States, and now in the U.K., to evaluate the safety of these systems after implementation. And the results of that work, included in the article embedded, suggest that there is enormous variability among actual operating, functional EHR systems in their ability to improve medication safety.


In addition, HIT systems can help detect safety problems, as outlined in this study from LDS Hospital that improved the detection of adverse drug events by over 60 fold when an HIT system was used to detect these problems versus a voluntary reporting system. This type of approach, using EHRs to improve the detection of safety problems, has now been broadened to a surveillance system in a large health system that detects all inpatient safety problems in real time, built on top of a commercial EHR system, not a homegrown system.


With that perspective, what are the patient safety risks that may be introduced inadvertently through the use of electronic health records and other HIT products? Unfortunately, there's no overall reporting database of patient safety risks or problems that can occur from HIT, although there are several local studies of this problem, and many of the panel members will speak to this. From a national perspective, we can look to the MEDMARX system, which has been tracking medication errors, to look at the impact of EHR and CPOE systems. And it bears noting that in 2006, of approximately 176,000 medication errors reported to MEDMARX, 43,000, that's 25%, were in some way related to the use of HIT, to give us some sort of sense. But unfortunately, there are no overall studies that have examined that in detail.


In addition, several researchers have conducted broad assessments of the safety risks of CPOE, and one of the best is included in an article published in JAMIA by Emily Campbell and associates that looks at all the unintended consequences of a CPOE system, and I think that really is a very helpful article looking at many of the issues that CPOE, as a manifestation of HIT, can generate.


What are the causes of these risks? Well, the causes of these HIT risks are multiple and related to software development, design, the method of implementation, the method of updating, and the method of testing. And I‘ve cited several studies here that look in detail at the cause of HIT risks, and those particular studies have found, as you would not be surprised, that these risks are usually multiple, that there are multiple holes in the Swiss cheese model that lead to failures in these systems, and it‘s usually not one single cause.


How might we prevent or mitigate these risks? Well, I‘ve attached a paper from … that outlines eight steps within a socio-technical model that might be used to mitigate these risks, and I think it‘s a nice framework to view this perspective.


How would I weigh benefits and risks of using EHRs in patient care? Well, unfortunately, the answer to this question remains unknown at the moment. No study that I'm aware of has evaluated both the risks associated with EHR systems as well as their concurrent benefits. Several studies suggest that HIT benefits are not being achieved as expected, and it may take several years to achieve these benefits, as outlined in the study of….


How might data on risks best be identified as greater HIT adoption occurs? One suggestion is a recent article that we published in JAMA, which suggested that we might be able to create an overall monitoring framework for safety, and that framework could rest on basically five different components: creating a system that enables practitioner organizations to report patient safety events or hazards related to EHRs; enhanced EHR certification that includes specific assurance that good software development procedures have been followed, along with evidence that previously reported adverse events and hazards have been addressed; creating self-assessment, attestation, and testing tools for local organizations; creating local, state, and national oversight in the form of onsite, accreditation-like activities for EHRs; and, finally, the creation of a national EHR-related adverse event investigation board that could review incidents and have the authority to investigate, much like the NTSB.


What are the factors that might impact an organization from reporting adverse events or known concerns about HIT products? Clearly at the organizational level, the hospital level, concerns over liability, reputation, confidentiality, and contractual issues will all likely deter organizations from reporting adverse events or serious safety risks. Clearly, approaches need to … mitigate these problems. One approach suggested by Jim Walker, et al, which I've attached to this testimony, I think is a very reasonable one, and I'm sure Jim will speak to it.


However, organizations will need self-assessment tools that can allow them to evaluate safety risks in their operational systems, especially with the complex, yearly, EHR updating process that goes on in all organizations. Such self-assessment tools are described in the article attached and might offer organizations another method to evaluate the safety of their systems. And given the diversity of implementations of these systems, these self-assessment tools at the organization level, I think, will be actually critical. Thank you.


Paul Egerman – eScription – CEO
Thank you. Dr. Morris?


Alan Morris – University of Utah – Professor
I appreciate being able to participate with you in this important event. Thank you for inviting me. I thought the best contribution I could make with my verbal comments would be to talk about an approach that I think would complement the major emphasis that exists on systems, that is, on system problems and system solutions. To that end, I would like to emphasize two propositions from a list I included in my handout.


First, like any tool, electronic health records and decision support protocols can be used appropriately or inappropriately, and can lead to favorable or unfavorable outcomes. Fundamental in this is that the reason for collecting information and aggregating it into records of data is to influence, at one or more levels, human decision making. So I want to focus on what I think could be a nice complement to systematic approaches, and that is the patient/clinician encounter level as a complement to the systems approach. This would complement the eight steps alluded to a moment ago by Dr. Classen.


I think we should recognize that decision support is too vague a term to be ultimately as helpful as we wish. It's quite important to distinguish between a guideline, a protocol, and an adequately explicit protocol, that is, a protocol with enough detail that it will lead multiple clinicians, given the same patient data, the same patient states, to the same decision. In this regard, it's important to avoid parochialism and to emphasize generalizability. So I want to make a few comments about a few of the elements. I won't try to touch every one of the questions; they're addressed, at least briefly, in my written handout.


One of the reasons we have risks is that we confound different levels of inquiry. Just as extrapolating from reductionist science results to clinician/patient encounter decisions can be profoundly misleading, trying to extrapolate from the more systematic approach to analysis of HIT system behavior to clinical, individual patient decision making can also be misleading. In that regard, I want to focus on the importance of complementing the emphasis we have on systems approaches with a parallel effort to emphasize the patient/clinician encounter: the use, development, and study of decision support tools in HIT applications.


National guidelines are generally broadly defined suggestions for care that leave much to be decided through the judgment of clinicians viewing the guidelines and interacting with patient problems. This means that we cannot achieve the kind of uniformity that would be best for healthcare delivery, both for research and for clinical care purposes. Now my colleagues and I have been engaged for 25 years in the generation of detailed protocols that achieve such uniformity in different institutions and in different cultures. And although they standardize decision making, because they're adaptive they retain patient-specific instructions for care. So I want to emphasize that as a complement to systems approaches, because I think it will inform the development of systems if we in fact do the right kind of work at the clinician/patient encounter and involve interested clinicians in the pursuit of problems that raise their concerns regarding patient outcomes.


One of the important issues I think we need to address with regard to HIT applications is to avoid a focus on parochial development. Most uses of which I'm aware emphasize development in the local environment for the purpose of aiding local physicians, or other clinicians (nurses, therapists, and others), to do their jobs. This is useful at the local level and unquestionably valuable. But at the broader community level, what this is likely to do is to formalize the unnecessary variation first decried by John Wennberg in the 1970s and still a major concern to all of us.

What we don‘t want is ten different institutions developing ten very effective ways of managing diabetes, each tailored to their own institution so that we cannot use the community-wide data to ask, answer, and improve our approach to clinical problems. Generalizability becomes quite crucial here, and here the emphasis on clinician/patient encounter and clinicians crossing institutions to develop tools that they would use for inquiring about the effectiveness of treatments, for example, and then thereafter using these tools for translation is quite important.


Finally, I'd like to discuss briefly accountability. Our regulatory agencies provide a very important service to the community and a high level of protection that we all enjoy and value. But we also must ask about the potential harm done by regulatory agencies through oversight that impedes research, development, and innovation. My own activities have been impeded for almost a year and a half by regulatory agency oversight, by one of the FDA groups, and I'm pleased to see that that group is represented here. OHRP has interfered with the work of the Acute Respiratory Distress Syndrome Network and shut them down for a year, and interfered recently with the work of Peter Pronovost in his efforts in Michigan to improve quality. So we need to assess not only the benefits of regulatory intervention, but also the harmful impacts that result from agencies' discharge of their congressional responsibilities.


Finally, how to weigh the benefits and risks of using electronic health records is a problem of major proportion. Dr. Classen, I think, is right on when he says it has to be evaluated systematically. I fully agree with that, and I submitted a table to you of the potential impacts of reproducible, that is, adequately explicit, decision support tools; it has 14 items, 11 of which have both potential favorable and potential unfavorable impacts. The reason I put that in is that I think it's a case example of what's operative at the higher system levels for HIT in general: there can be positive and negative impacts and, therefore, it's extremely difficult to predict, and extremely important to assess systematically through experimental studies, the impacts of such systems. Thank you very much.


Paul Egerman – eScription – CEO
Thank you. And thank you, all three of you, for excellent presentations. We have some questions. Actually, for the first question, I was very interested in what you said, Dr. Morris, about some of the benefits and risks of regulation. What I find very interesting, as I look at how people are discussing this issue, is that they're using a lot of different words almost interchangeably. They talk about HIT sometimes. Sometimes they talk about EHRs, which is not quite the same as HIT. And a lot of the issues, especially many of those that you raised, Dr. Koppel, were really issues related to either CPOE systems or medication systems. That's different.

My question is, how should we view this? Should we look at all of HIT with a broad brush? Should we be looking at specific areas, applications like CPOE that may have higher patient risks? To use like an FDA terminology, is HIT one device, or is this multiple devices? Dr. Morris?


Alan Morris – University of Utah – Professor
May I take a stab at that? Alan Morris. What we should look at depends upon what question we want to raise. What‘s the scale of inquiry, the scale of interest? For example, if someone is interested in studying the heavens, they address the question to astronomical levels of inquiry, and use telescopes and not microscopes. If someone is interested in studying cell biology issues, they address the question to that scale of inquiry, and use microscopes, and not telescopes. The questions and the instruments, the techniques, the analytical approaches are quite different for different scales of inquiry.


It seems to me that there are a number of scales involved here. We have the national level, systematic approaches, questions that have to do with how we would implement electronic medical records, health information technology, broadly across institutions or large numbers of institutions. But there are complementary scales that are, in some ways, even more important, and that is how would we use this technology to answer questions for a practicing physician who has to respond in her clinic, in her office, right now to a patient‘s needs? And I see these as complementary, and they‘re mutually informative. I think it would be a serious mistake for us to engage exclusively in system approach questions, system level questions without the complementary inquiry at the patient/clinician encounter because, ultimately, the benefits of what we realize, the benefits that we realize from a systems approach will be realized only because of actions at the clinician/patient encounter level for many, if not most of the issues we have in mind.


Paul Egerman – eScription – CEO
State your name.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
It's going to be darn hard to disaggregate one from the other. A CPOE system that doesn't interact with the pharmacy system or the EMAR is a paperweight. An EHR that doesn't incorporate data from the lab reports is a paper chart. They are all of a piece, and I don't know how you can separate them.

Some of the issues that are relevant to all of them, like usability, like defect reporting, like responsiveness to problems, are pan-system issues. CDS, computerized decision support, interacts with both the EHR and the CPOE system. Many people consider it part of those systems, although I disaggregate it. Currently, CDS alerts are overridden or ignored 80% to 96% of the time. How can you separate the usability of CDS from that of the EHR or the CPOE system? I see them as parts of a whole. And, by the way, I'd include in that the EMAR, the nurses' administration system for medication.


Paul Egerman – eScription – CEO
Let me just ask about that, because you look at usability. The issue that we're addressing is patient safety risk because, of course, you can always argue usability is important. I understand why usability is important with CPOE systems, but is it that important for the government to be interested in for, say, an inventory control system or a nurse scheduling system where the patient risk … essentially much lower?


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Well, if by nurse scheduling, you mean scheduling of medications, then I think you….


Paul Egerman – eScription – CEO
No. I meant like scheduling the staff.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
No, I don‘t think we‘re … we‘re not interested, I agree totally, with inventory and nurse scheduling. That‘s not our problem. That doesn‘t have direct patient – well, it doesn‘t have direct patient safety risk in the same way that an EHR or an EMAR does. So I would include the IT functions that we‘ve talked about and not the others.


Alan Morris – University of Utah – Professor
Yes. Dr. Koppel, I think, makes a very important point, focused at the system level. He's absolutely correct, if you're asking about the system. I don't know if there are any climbers in the room, but if you're a climber, and the rope breaks, it's hardly important what section of the rope broke as you're falling toward the ground. So if you're talking about the system, he's absolutely right. They're of a piece, and they're integrated, and they must all work together.

He points out that decision support is not very frequently followed; in fact, it is almost always not followed. But I would complement those comments at the system level by pointing out that when we focus at the clinician/patient encounter level, we have 94% compliance in an open loop servo control system at multiple U.S. sites, and also at a Singapore site, crossing cultures. So it is possible to develop the rules that enable clinicians to respond uniformly, and it is also possible, as we have demonstrated, to use those tools, those decision support rules, the protocols, to effect translation of research results to practice by moving them into the practice mode, as we've done at Intermountain Healthcare.

I see these as complementary. You‘re absolutely right from the system perspective, this is a huge and extremely complicated problem that is daunting from many perspectives, including the financial investment perspective with which you opened your comments. But on the other hand, at the clinician/patient encounter level, we can learn things and gain insights that could inform the system approach very, very importantly if this were a parallel emphasized activity, but I am sorry to say, from what I see, it is not a parallel emphasized activity. I think it will be a national risk to continue without such a complementary activity.


David Classen – University of Utah – Associate Professor
This is David Classen speaking. Imagine if we had attempted to regulate the air traffic control system as a medical device. I think it would ultimately have failed. So I would agree with Alan. I don't think we can afford to regulate HIT solely as a medical device, thinking that we can put boundaries and a box around it easily, because HIT, and especially the EHR, is so pervasive, so complex, and so ubiquitous, especially as we move to more and more adoption. I think we need to focus regulation both at the system level and also at the local level. And Alan's construct of doing it at the physician/patient, or clinician/patient, interaction, I think, is a very reasonable one.


Alan Morris – University of Utah – Professor
With regard to regulation, I'd like to make one comment, and that is, I think it would help us enormously, and would catalyze and enable competent innovation, if the FDA would consider reposting the 1989 guideline on decision support oversight and exemptions that was, unfortunately from our perspective, removed. That guideline would help everybody. Right now we are being impeded enormously, and I think that deserves redress.


Paul Egerman – eScription – CEO
That‘s great. Let me open up for the panel for questions. Did you have a question, Paul?


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Yes. Paul Tang, and I wanted to jump away from the solution, perhaps, and go to the context first, because this is a great panel to do that. First, thanks to the panel for all the work you've been doing in raising awareness of the potential risks of using this technology, because they're certainly well appreciated at this point. As we look toward solutions, whether regulation or just policy, we've got to, of course, balance the benefits of using the technology, or any intervention, against the risks. You talked about the potential impact on innovation and the costs of going through that.


Perhaps if I could get you to weigh, using medication errors as an example. David had mentioned that 25% of med errors that were reported were associated with the use of an HIT. How do you weigh the use of HIT and its association with some of these harms against the former or continuing, unfortunately, use of the pen and use of the human brain? In other words, how do we balance the intermediary versus the entire context of making decisions and how do you make better decisions, and how do you avoid harmful decisions and put it in that context rather than just associating it with an intervention?


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
I think HIT is better than paper, and I think it helps many elements of patient safety. I think HIT is also better than cuneiform tablets, smoke signals, semaphores, and carrier pigeons. But the question is, is it good enough? And, more importantly, are we on a trajectory to improve it? If we are obliging people to buy it now, when the systems are rather primitive, are we ossifying a system that is not very good and sort of locking that in place? I worry about the innovation from this point forward. We know about the innovation to date.


The other side of that is that good EHRs will help decision support dramatically. Right now we have studies based on a 25-year-old with one problem taking one drug, whereas your average patient is a 76-year-old with 5 comorbidities taking 12 drugs. EHRs could feed back that information and make CDS truly meaningful such that we don‘t override 96% of the annoying alerts. So it‘s both. I think we have to get better EHRs, simply not more EHRs, and that‘s my plea.

Alan Morris – University of Utah – Professor
Humans are not capable of designing the system we would want, implementing it, and having it work. I just don't think that's going to happen. But humans are capable of starting with a design and iteratively refining the system. That certainly has been characteristic of the work my colleagues and I have done for 25 years. We've never designed a protocol that worked well when we first initiated the rule set. It's always an iterative refinement process.

It seems to me, one of the big challenges, and I'm not quite sure how to deal with this, I'm not very knowledgeable about business, but one of the big challenges, given the fact that we are a market driven economy, is how do we do this in a manner that allows iterative refinement, faced with multiple different corporate interests and private patents and so forth? How do we iteratively refine this process so that we eventually get to where we want to go? If we wait until we have a system that works as well as everybody would like, I think we'll have perhaps another 30 years like the past 30 years, waiting for a comprehensive dictionary and set of standards and terminology and so forth that have been occupying a lot of people who have an interest in the top down systems approach. I think part of the answer is to get it running from clinicians with the passion to address certain problems, resolve and improve those problems, and do it with a mindset of iterative refinement, but I'm going to have to be advised by others about how that can be done in a market economy.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Just to respond to Alan's point, a lot of it could be addressed by having greater responsiveness to reported defects. Right now they go into the vendor's dungeon, where they may or may not be addressed depending on the development schedule, the iterations, the patches, and the like. We need, as I think Jim Walker will discuss, a coherent, publicly available reporting system for defects that must be addressed, even if only to say the user misunderstood and it's not our fault. But we can't hide the defects. We can't punish people who report them, as the current system does. We have to relish those reports because they are the tools for systematic improvement. And we can't wait for the perfect, obviously, but we need a system to identify the problems and then fix them.


Paul Egerman – eScription – CEO
Do you have something to say, David?

David Classen – University of Utah – Associate Professor
Yes. I would just say, Paul, that it's sort of like moving from the horse to the car. Everyone is saying we're not going to go back to the horse, but look how many years it's taken us to realize all the potential capabilities of the car over the horse, and we obviously had new risks with the car that we never had to face with the horse, hence seatbelts and airbags. And I think that's very much where we are here. The great challenge here is the failure to realize the benefits that many of the leading centers have shown us, and our work with the EHR flight simulator has shown enormous variation in achieving those benefits across the country in different hospitals.


I come back to Alan's point that the people who have done this well, who have realized these benefits, have done it iteratively. Right, Alan, over years and years? And I think that's going to be a key part, much like with the automobile. It'll take a number of years to realize those benefits.

Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Thank you. This was a very helpful discussion.


Latanya Sweeney – Laboratory for International Data Privacy – Director
…one more question?


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
No … follow up….


Paul Egerman – eScription – CEO
….


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
To follow up on what Ross said: we need something good enough to get started, and I think all three of you said, and it is extraordinarily important, that we may or may not now have an ability to iterate and continuously improve and make the systems safer. So I would just appreciate the panel's opinion: is it good enough to get started at this point? And then we'll talk about the iteration, which I think we all agree about.


Alan Morris – University of Utah – Professor
Which scale are you addressing when you ask is it good enough to get started?


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Let‘s talk about the EHR since that's what we are trying to push.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Well, I wouldn't rip out every EHR that's there and start over with paper and pencil, but I would say that we must not base our belief in EHRs on faith and say that if one is there, it is, by definition, better. I think we need to, as I argued before, just keep on working on them, and the key to that is to use informatics as a science and understand what's going on, to study the problems. You're not going to study them through voluntary reports, because no doctor that I know of makes an error intentionally and knows about it. If they start writing the wrong script, or they start ordering the wrong lab, they stop and fix it.


It's the ones they don't know about that we need to study, and we need to study them using a variety of evaluation techniques. Yes, I am the incoming chair of the AMIA evaluation working group. We need that kind of on-the-floor, constant analysis, and we need to encourage those reports; right now we have no idea about 99% of the errors. We need to find them and fix them iteratively.


It's not an accident that all of the good EHRs are the ones that have grown organically at places like Harvard, the BI, and the Marshfield Clinic. They're homegrown systems, as Richard calls them, because they have evolved iteratively. It's the out-of-the-box systems that are so problematic, in part because of a market structure that wants to see them sold and resold and updated, and the vendors can't deal with, or won't deal with, the problems that are reported on a daily basis. We need to change that so that we can learn. But we sure as heck don't want to pull them all out and start with paper and pencil again.


David Classen – University of Utah – Associate Professor
I think, Paul, you can't stop it. It might be nice to try to stop it, but the horse is out of the barn, and even if you were to try to stop it, I don't think it's possible to design the perfect system on the first go anyway. So I think we must start, and we must learn. What we really need to do is build a learning system and a monitoring system; if this is going to be a reality, much like cars replacing horses, we need to do a much quicker job than we did with the car in realizing its capabilities and building a safe system. But I think the horse is out of the barn.


Alan Morris – University of Utah – Professor
Yes. Alan Morris. I concur with David Classen, and I think many systems are ready to be implemented. I would like to express a brief concern. I don't want bugs in programs, and I don't want defects in systems, any more than anybody else does. However, I'm a little concerned by the emphasis on anecdotal descriptions of an unfavorable event without a systematic approach to the incremental benefit or harm done by the system. During a discussion with the FDA, I pointed out to colleagues there that I don't want bugs in our programs.

But my colleagues and I are quite happy to accept a program with bugs that produces more favorable outcomes for our patients than we would have without the program. In other words, if the car is not quite perfect, but it works a heck of a lot better than anything else you had without it, you've got to look at the whole. So the emphasis that seems to be somewhat common right now on anecdotal descriptions of particular adverse events needs to be tempered by systematic study of risks and benefits, and that complements what David Classen just said about evaluation.


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Are you all set? Great. Thank you very much. Latanya?


Latanya Sweeney – Laboratory for International Data Privacy – Director
Yes. I just wanted to make sure that I understood the summary of the issues that I heard, from sort of a computer science standpoint. I heard about usability factors and issues with respect to workflow conflicts. I heard about a lack of invariant and consistency testing in the software. And I heard about openness of reporting, sort of like what we use with CERT for computer security problems, where you can report a problem, and it becomes generally known, and that puts pressure on the manufacturer to find a remedy, and then the remedy gets pushed in as patches.


First of all, I wasn't sure I captured everything, so I wanted a response on that. And the second part is, I wanted to push a little more on this reporting system. CERT works because we don't have a million operating systems. We only have a few operating systems, and they're pretty standardized. But my take on medical systems is that, in the clinical decision making aspects in particular, there's far more variability. So how would generalized reporting actually work? Those are the two things: do I have the issues right, and how would standardized reporting work?


Alan Morris – University of Utah – Professor
I didn‘t understand fully the questions. I had a little trouble hearing because of the sound. Would you repeat the two questions, please?


Latanya Sweeney – Laboratory for International Data Privacy – Director
Sure. Summarizing my notes: the first question is whether I can summarize the issues I heard from a computer science standpoint. I would say that there was a lack of understanding of usability factors and workflow conflicts, that there was a lack of invariant testing and a lack of inconsistency testing in the software, and that many of you spoke to the openness of having a reporting system. And you, Dr. Morris, also spoke with respect to perhaps an … testing with respect to the patient/doctor encounter. I wanted to first make sure that that's a good summary of the issues from this panel. Why don't we do the first question, and I'll come back to the second?


Alan Morris – University of Utah – Professor
Yes. I‘m not the most expert here, but—


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Yes, the only other point I'd add is vendors' need to capture market share now, with the stimulus money, to lock in facilities and physicians' offices, because once they do that, the cost of switching is so vast that the vendor has limited motivation to truly improve the system and to respond to problems. We need to get that responsiveness restored. If we need an FDA-involved reporting structure, then that's what we need. Or we need a button on every single screen in an EHR that says, I've got a problem with this, and we capture it, and then we get a chance to examine it.


We don't say, you know, this is a problem, stop the system. We say, let's look at this. Four thousand physicians have said, look, this is a problem; maybe there's a problem there. We're only asking for analysis of the reported problems, not discounting them as user error or insufficient training. That's it.
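As an illustrative sketch of the kind of report aggregation being described here, assuming a hypothetical "I've got a problem" button that tags each report with the screen it came from (all names are invented for illustration, not from any actual EHR):

```python
# Hypothetical sketch: aggregate in-product defect reports so that
# frequently reported screens surface for analysis, rather than each
# report being dismissed individually as user error.
from collections import Counter
from dataclasses import dataclass

@dataclass
class DefectReport:
    screen_id: str     # which screen the "I've got a problem" button was on
    description: str   # free-text note from the clinician

def screens_needing_review(reports, threshold):
    """Return the screen IDs whose report count meets the threshold."""
    counts = Counter(r.screen_id for r in reports)
    return sorted(s for s, n in counts.items() if n >= threshold)

reports = [
    DefectReport("med-order", "dose field accepted 10x the usual dose"),
    DefectReport("med-order", "alert fired for a discontinued drug"),
    DefectReport("allergy-list", "duplicate entries after a chart merge"),
]
print(screens_needing_review(reports, threshold=2))  # ['med-order']
```

The point of the threshold is exactly Dr. Koppel's: one report may be noise, but many reports against the same screen warrant analysis.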


Latanya Sweeney – Laboratory for International Data Privacy – Director
My second question, and I‘ll be – sorry.


Alan Morris – University of Utah – Professor
I‘m puzzled by your question about inconsistent evaluation. My understanding is that vendors and developers are quite consistent in doing code testing and evaluation in order to meet federal requirements, regulatory oversight. Am I misinterpreting what you said?


Latanya Sweeney – Laboratory for International Data Privacy – Director
Normally, if you go to a Web form, and you're typing in a form, and it asks for a date of birth, and I give it a date that's not possible, if it accepts that, it's not doing a proper test. It's not doing a semantic test on allowable values. Some of the examples I heard suggested that that kind of testing, especially in the medication systems, in the CPOEs and so forth, is not occurring. That's something that's somewhat reasonable to repair or fix.
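The allowable-value check being described might look like the following minimal Python sketch; the function name and the 130-year age bound are illustrative assumptions, not drawn from any actual system (the reference date used is the hearing date):

```python
# Illustrative semantic test on allowable values: a form should reject
# a date of birth that is impossible, not silently accept it.
from datetime import date

def valid_date_of_birth(year, month, day, today=date(2010, 2, 25)):
    """Accept only real calendar dates that are not in the future
    and imply a plausible age (under an assumed 130-year bound)."""
    try:
        dob = date(year, month, day)    # rejects Feb 30, month 13, etc.
    except ValueError:
        return False
    if dob > today:
        return False                    # patient not born yet
    return today.year - dob.year < 130  # reject implausibly old ages

print(valid_date_of_birth(1976, 2, 29))  # True  (1976 was a leap year)
print(valid_date_of_birth(1977, 2, 29))  # False (no Feb 29 in 1977)
print(valid_date_of_birth(2035, 1, 1))   # False (future date)
```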


Alan Morris – University of Utah – Professor
Right. Just to echo Dr. Koppel's comments, in our clinician/patient encounter protocols, we do capture every time a clinician declines an instruction. That occurs about 5% or 6% of the time. We capture the decline and the reason, and we feed that back in an iterative review process. That's part of the iterative refinement that's crucial. Your second question…?
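A minimal sketch of the decline-capture loop described here, with invented names and data, not an actual Intermountain protocol:

```python
# Hypothetical sketch: log every declined protocol instruction with its
# stated reason, so the declines can drive iterative protocol review.
decline_log = []

def record_response(instruction_id, accepted, reason=None):
    """Accepted instructions need no reason; declines are logged with one."""
    if not accepted:
        decline_log.append({"instruction": instruction_id, "reason": reason})

def decline_rate(total_instructions):
    """Fraction of instructions declined, for the review meeting."""
    return len(decline_log) / total_instructions

record_response("insulin-drip-rate", accepted=True)
record_response("glucose-check-q1h", accepted=False, reason="patient off unit")
print(decline_rate(total_instructions=20))  # 0.05, i.e. the ~5% decline rate
```

Because every accept/decline passes through the protocol, the log doubles as the review dataset; no separate surveillance step is needed, which is the point made again later in this testimony.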


Latanya Sweeney – Laboratory for International Data Privacy – Director
Yes, and I'll be quick. I'm looking for a sense from you: how generalizable will reporting be? In the computer world, we would look at CERT, which is what has generated that whole idea that, gee, Microsoft or this operating system has an error, and it gets reported by people who say somebody broke into my system, or this had a problem, and then it gets fixed and patched along. Over the last five years or so, we've seen those kinds of systems and reports working. They work pretty well in the computer environment because, at the operating system level, there aren't very many choices. A lot of people are using the same systems, so the number of reports can be high, and it's really easy to replicate the described problem on a test system. FDA reporting works similarly; it has a similar kind of containment and generalizability. I'm a little worried; I'm not sure that's true of HIT systems.

David Classen – University of Utah – Associate Professor
I think the HIT systems are more complicated than that. I would agree with you. And there are several initiatives going on that might be helpful here. Jim Walker, who is going to talk to you later, has created a national HIT hazards ontology, if you will. And that might be helpful in reporting these problems. Jim, I think you‘re going to talk about it.

In addition, I cochair the National Quality Forum patient safety common formats committee, where we are trying to create HIT reporting capabilities within the patient safety organization statute. Bill Munier is going to talk about that. What we had thought is that perhaps unifying Jim Walker's effort and the AHRQ common formats might be one way to begin to organize those reports and learn from them in some sort of structured way, so that we don't get a million purely narrative reports. I think that's one approach.


There is another possible approach, based on the work we did at LDS Hospital, which is that we can use these systems as surveillance systems to look for these problems, which is what Alan is talking about, so that we don't have to rely on just voluntary reporting. We can actually use surveillance as a way to detect these problems as well.


When we wrote the Institute of Medicine's Patient Safety: Achieving a New Standard of Care report that Paul was involved in, we thought we needed both voluntary reporting systems and some sort of surveillance system. And I think there's a wonderful opportunity here to envision what that might look like.


Alan Morris – University of Utah – Professor
And with regard to specific protocols that might be used for specific clinical problems, these adequately explicit protocols to which I made reference, there‘s no need for surveillance because all the data are captured for the operation of the protocol. And, therefore, all the data are available for review. And it‘s from those data that we can derive iterative refinement imperatives. I don‘t know if you understand fully what I mean, but a clinician has a screen. She sees something. She either accepts or declines. She puts in the information. All of that is captured in the electronic database and available for study.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
David's research found that cross-vendor systems, where orders go from the EHR to the CPOE to the pharmacy IT system across different vendors, don't work. The level of complexity here is astounding. Hopkins has, I believe, 12 EHRs. We've got three or four at Penn. The combinations and permutations of screw-ups are beyond human capability to appreciate.


The systems could be good in and of themselves, but combined, it‘s close to chaos. Yes, of course, we‘ll fix it eventually. But in the interim, there are profound patient safety risks. That doesn‘t mean we shouldn‘t try, but—


Paul Egerman – eScription – CEO
Great. Are you all set, Latanya?


Latanya Sweeney – Laboratory for International Data Privacy – Director
Yes. Thank you.


Paul Egerman – eScription – CEO
Okay. So I have Joan, Carl, and Marc and George. Before you speak, Joan, we have a lot of people asking questions. To the panelists, I‘d say, I appreciate all your contributions. However, it‘s not necessary that every panelist answer every question. I had to say that. Sorry.


Joan Ash – Oregon Health & Science University – Associate Professor
I'd like to go back to the definition of an EHR and whether we view it as a whole or pick apart the parts. Dr. Morris said something very intriguing about the fact that, I think, a part of the EHR is regulated by the FDA, and I'd like to find out more about that.

Alan Morris – University of Utah – Professor
During one of our research projects, supported by an NIH contract, we were required to submit our decision support tool for FDA review, and the FDA made a series of determinations that varied from, this is not a problem, we have no jurisdiction, go ahead and do this, it's an open loop servo control system, that is, there's a clinician between the instruction generation by the protocol and the actual execution of anything for the patient, up to the requirement of an IDE, an investigational device exemption, for a clinical trial.


We are now overseen by the FDA, and one of the consequences is that a trivial change made in the protocol (for example, we changed the frequency of assessing a blood glucose to make it more consistent with clinical practice in a pediatric ICU and, therefore, safer) has taken us nine months to get approved, and has basically shut down our program for nine months. That's one example of where regulatory oversight can be an impediment to competent innovation.


Joan Ash – Oregon Health & Science University – Associate Professor
Thank you.


Paul Egerman – eScription – CEO
Great. Thank you. Next we have Carl.


Carl Dvorak – Epic Systems – EVP
David‘s placard fell off, but I think you‘re up next, right?


Paul Egerman – eScription – CEO
Pardon me?


Carl Dvorak – Epic Systems – EVP
David‘s placard fell off, but I think he‘s up next.

David Blumenthal – Department of HHS – National Coordinator for Health IT
I sacrificed my placard.


Carl Dvorak – Epic Systems – EVP
Adam or Scott told me not to breathe, and it wouldn't tip over, so I was taking his advice. One of the observations I've had, having worked with multiple organizations that have switched from homegrown to out-of-the-box, commercially provided EHRs, is that they've actually gone on to even more impressive results. Has anyone on the panel studied the level of organizational investment or involvement or engagement?


To me, it seems that an organization willing to create its own EHR is demonstrating the strongest belief that it will likely be a good thing and, therefore, it executes better and focuses more. I wonder whether, instead of homegrown systems versus commercial systems, what we're actually seeing is that the real causal factor is the organizational investment and involvement, and the execution that ensues from that engagement. Has anyone studied that?


David Classen – University of Utah – Associate Professor
Anecdotally, Carl, in our work with the EHR flight simulator, what we've found when we look at the real exemplars is that they're organizations that have a mentality of innovation and ownership. Whether they have a homegrown system or a commercial system, they're likely to take it and customize it iteratively and aggressively. I would say that your observation is correct. Whether they have their own homegrown system or a commercial system, they will take that, evolve it, and customize it, because they have the organizational culture to do that. And so, as mentioned in several of the studies, that culture of innovation and patient safety focus, I think, is a marker of organizations that are likely to take whatever product they have and excel with it.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
This is Ross Koppel. Let‘s not focus on the culture, says the only social scientist on the panel. Let‘s focus on the recursive loops that enable them to keep on iteratively improving them rather than getting a nasty response from a vendor saying, A, it‘s your fault. B, it doesn‘t exist. And, C, it‘ll take seven months to fix. So culture is important, but fixing the darn software is also important.


Paul Egerman – eScription – CEO
Okay. Before I call on David, I just want to make an observation. People are talking about homegrown and vendor solutions as if those are the only two choices. But there is a third choice, which is, there is an open source solution that is available. We have a vibrant open source community, and so I did want to mention them. Did you want to make a comment, David?


David Blumenthal – Department of HHS – National Coordinator for Health IT
First of all, yes, I do. This is David Blumenthal. I also want to thank the panelists, thank them for their work. Making certain that anything we do is as safe as possible and done as well as possible is critical for our activity, so we welcome your perspectives and your comments.


I wanted to reiterate a question that Paul asked in just a slightly different way. This is a question of whether intrinsically an electronic information system is more capable of innovation and improvement than a paper-based system. Do you have any opinions about that?


Alan Morris – University of Utah – Professor
I‘ll start. Alan Morris. Just from the perspective of organizing, collating, and presenting information in a systematic and readable format, the electronic system is light-years ahead of anything we could do with paper. From the perspective of trying to get to a consistent link between clinician decision-making and evidence-based practice, the electronic system has the potential to do things that are just not achievable with paper. So I think the possibilities are pretty clear that the electronic system is able to do more things.


It‘s expense, organization, integration, and so forth that present the challenges, and also the willingness of the clinical community to adopt the kind of guidance that would lead to uniform clinician decision-making for the same patient state. That is not a given. Remember, clinicians are bright, committed people, who have generally entered the field because, among other things, they like being captains of their own ship.


I‘m not quite sure it‘s like herding cats, but it‘s not a trivial challenge to get people to agree to abandon personal style, but in fact that is something we‘ve been able to achieve with the early adopters who have collaborated with us at multiple institutions, that people are willing to look at the information and agree that there are many things we don‘t know that are associated with many different styles of response and care. And for those things that we don‘t know, for which evidence is lacking, our colleagues have been willing to abandon their personal styles and adopt a reasonable approach linked to the best evidence and the best understanding of physiology and so forth. But that is a significant challenge.


David Classen – University of Utah – Associate Professor
David, just in two areas where we‘ve actually measured the ability to impact, one is reducing adverse drug events. Clearly, electronic systems have almost an order of magnitude greater capability to reduce them than paper systems. Then the other area, a detection of adverse events in real time, clearly we‘re talking about more than an order of magnitude in just detecting these things, both with the homegrown systems and … commercial system, and … system.

David Blumenthal – Department of HHS – National Coordinator for Health IT
So if you were designing it….


David Classen – University of Utah – Associate Professor
So the capabilities are incredible. The question is, are those capabilities realized.


David Blumenthal – Department of HHS – National Coordinator for Health IT
Fully realized.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Yes, and are they realized in ways that don‘t introduce additional errors. So if you precode information into a system wherein everybody will get X, Y, and Z drugs, and then you make it very difficult to cancel one of those drugs because it‘s inappropriate, you create problems that are unneeded. In theory, the systems are dramatically more plastic and better. They can be made to respond to the needs of clinical work, but they just have to be fixed.


David Blumenthal – Department of HHS – National Coordinator for Health IT
So the next question I wanted to raise is not so much – it‘s both a question and a comment. Dr. Koppel raised the concern that the design of policy would force an ossification of the current system with primitive technology. And I wanted to get your reactions to the meaningful use framework in light of your concerns about innovation and continued innovation. Just for informational purposes, the notice of proposed rulemaking, which is now available for public comment, outlines a staged approach, not a single standard of use, but a staged approach towards increasingly demanding use of electronic health systems over time. Do you think that that increasing demand on users over time could have the ability to influence vendors‘ willingness to improve their systems over time?


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Thank you for the question. This is Ross Koppel. The staged approach requires that the physicians increasingly order more and more medications via CPOE and the like. It sounds reasonable, but I have a problem with it, and the problem is this: we give the clinicians not-ready-for-primetime software, and we ask them to use it more and more. The burden is on the clinicians to meet the meaningful use standards, not on the vendors to produce usable software that would enable them to do it.


If I were in charge, I would ask the vendors to make better and better systems, given the meaningful use standard. And then I would expect the clinicians to eagerly seek them. I don‘t know any physicians who actually want to use lousy systems and create patient safety risks. The emphasis on meaningful use focuses on the idea that the doctors have to be bribed or coerced to use the systems, and I don‘t think that‘s true. I think doctors want to use good systems, and use them in the way that we‘ve outlined. It‘s the systems that are the problems, not the physicians. And so, the underlying logic of the meaningful use methodology or schema, I see as problematic, although well intended.


Alan Morris – University of Utah – Professor
Dr. Koppel emphasizes, as a danger, the rigid use of rules that would basically be a reflection of what most clinicians were taught in their second year never to do, and that is practice cookbook medicine. He‘s absolutely right. I am emphasizing a different output of knowledge engineering, and that is a detailed enough rule set to provide not rigid responses, but adaptive responses, responses that are driven by patient data and that adapt to patient state.


If we don‘t make that distinction, we will get into a lot of confounded discussion about what needs to be done. He is absolutely correct that rigid rules that are not adaptive are a danger because we don‘t know exactly how they will affect certain patient sets. On the other hand, properly configured knowledge engineering will lead to adaptive programs, and that‘s why I think it‘s so important to complement the top down with the clinician driven bottom up. That‘s where the knowledge engineering will be generated for the issues that interest the clinician and for which she has fire in the belly to do what‘s necessary to get the job done.


David Classen – University of Utah – Associate Professor
I would think that homegrown systems, David, could easily keep up with that innovation approach because it fits right into their iterative loop, making refinements based on a series of goals. I agree with the framework and the set of goals, and I think homegrown systems could meet it. I think what‘s an open question is can vendors work closely with providers to go through that iteration to achieve those goals, and I think, in fairness, that‘s probably an open question right now.


David Blumenthal – Department of HHS – National Coordinator for Health IT
Just as a matter of record, stage two and three are not specified, so to imply that there are already prespecified, rigid rules, I think, is a misinterpretation of the notice of proposed rulemaking. The other point I would make is that I took from your comments an implication that part of what‘s needed is a market for improvement, a set of incentives that value improvement, and that there is an interaction between the user and the vendor that‘s constant over time, and that that constant interaction with vendors, at an individual physician and an organizational level, is part of that process of improvement. If so, the question becomes how to create that market. What I‘d ask you to think about is whether a set of incentives tied toward more improved uses over time, staged, has the capability to create that market.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
I would say that the current structure works against those improvements because once somebody spends, you know, $100 million on a system and $500 million implementing it, the vendor has them totally, and the incentive on the vendor‘s part to keep on making improvements and the like goes not to zero, but asymptotically close to that.


We need to change the incentive process to motivate vendors to be constantly innovative for their locked in customers. For the other ones, the market exists. And the current structure doesn‘t do that. It puts the entire burden on the clinicians to use the systems that they bought, and try to meet those standards. The standards are reasonable for, as David says, homegrown systems where they can keep on fixing them. For the ones that have been locked in it‘s problematic.


Alan Morris – University of Utah – Professor
There‘s another issue here, and that is parochialism. It would be, it seems to me, very difficult for the best-intended vendor to be able to respond to ten different installation sites, all of which request different modulations of the common elements because they fit better with local needs. I see here a coordination challenge that‘s quite important for us, as a community, because it seems unreasonable to expect the vendor to be able to respond to everybody, and even counterproductive if they did because we would wind up just formalizing differences between institutions instead of coordinating and making a distributed laboratory for clinical care from which we could generate community wide conclusions that would be generalizable. Was that clear?


George Hripcsak - Dept. of Biomedical Informatics Columbia University – Chair
Yes. It‘s George Hripcsak. Dr. Morris, that‘s exactly my question.

Paul Egerman – eScription – CEO
Sorry, George. We‘re running a little bit short on time, so I‘d ask everyone to be brief, but go ahead.


George Hripcsak - Dept. of Biomedical Informatics Columbia University – Chair
I agree with the top down versus bottom up framing. I‘m a very strong supporter of the bottom up approach, but, as you also pointed out, these need to be generalizable solutions that are consistent across the nation. So what would a national policy look like that encourages bottom up solutions but keeps them consistent across the nation? How do you do that?


Alan Morris – University of Utah – Professor
Thank you, Dr. Hripcsak. It‘s nice to see you. A challenge, but it seems to me that two parallel pathways that interacted, so that the top down efforts informed the bottom up work, and the bottom up work uncovered and exposed a whole host of issues that needed to be addressed and, in a complementary way, informed the top down effort, would together be much more productive than what I see configured, at least from my vantage point right now. The details of that, like for other things, involve discussions with the devil.


George Hripcsak - Dept. of Biomedical Informatics Columbia University – Chair
Okay.


Paul Egerman – eScription – CEO
Marc?


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Just one comment: Lousy usability is hardly a marketing tool, you know. Some of this stuff can be fixed quickly and easily, and it‘s not.

Marc Probst – Intermountain Healthcare – CIO
This is Marc Probst. First, having your own homegrown system, I don‘t think, is a slam-dunk for achieving meaningful use the way it‘s currently structured. However, I understand your principle that we have a lot more ability to react to it. Dr. Classen, you put the eight rights in the documents that you sent us. We‘ve talked about the varying capabilities of systems. And I think, as we look at it as a panel, we are where we are in the market with the vended products. And I think, as has been pointed out a couple of times, there is value in getting these things automated and getting data into systems, whether they‘re perfect or not. And I understand the safety issues that you talked about.


But to what degree, and as I look at the eight rights, again, a lot of those are kind of there, what we have today, and what we‘re dealing with relative to moving IT into the healthcare system. But how much of it is around implementation? My concern is, you‘ve talked a lot about iteration through the process, and there‘s a lot of education actually through that process. I think, a bigger concern for me is if we try to implement things too rapidly, are we introducing significant safety issues into the process, again, given the structure of the systems that we have today.


David Classen – University of Utah – Associate Professor
Yes. I think it‘s a great question, and one I struggle with because I see us moving how we‘re implementing in terms of timelines. So in the old days, at LDS when we did this, we used to talk about implementing in a multiple, five-year perspective. Then we shrunk to implementing in an 18-month perspective. And now we‘re down to the week perspective. I mean, we‘re really talking about organizations rapidly designing and implementing these systems within weeks, maybe 10, 20 weeks. And so I think, as the systems have gotten more complex, we‘ve been shrinking the timeline to implement them.


I would agree, that causes me a lot of heartburn and great concern, and especially when we‘re talking about highly complex systems that have been evolving rapidly, right? I mean, as you well know. So that‘s the part that really does concern me, and so I think what that‘s going to do is that‘s going to uncover a lot of these problems with these systems more aggressively, more dramatically, and more quickly, as we move to more rapid implementation.


I think it puts more and more import on the way we actually implement to assure safety. So if we‘re going to implement rapidly, are we going to do workflow analysis, process redesign? Are we going to engage the organization, or are we going to go the easy way, which many organizations are contemplating right now, which is just to take a prepackaged, pre-designed software product, and ram it in quickly.


Marc Probst – Intermountain Healthcare – CIO
And the implication of that last approach?


David Classen – University of Utah – Associate Professor
Well, I think … very troubling, and we have some research data to support that, so that does cause me a lot of heartburn. And I know that‘s so seductive to organizations because it‘s viewed as simpler, easier, and cheaper. And it avoids many of the eight rights, which are viewed as expensive rights. So my great concern is that it will be very seductive to say, here‘s a prepackaged system we‘ve implemented at 25 other organizations. Just take it and put it in yours, and don‘t do any local customization. And I think we‘re going to learn a lot from that process, but it does cause me a great deal of heartburn.

Paul Egerman – eScription – CEO
Thank you. We‘re just about out of time, but, Jodi, try to keep it quick.


Jodi Daniel – ONC – Director Office of Policy & Research
Thank you. Thank you for your testimony. In addition to the meaningful use requirements and the requirements on providers and hospitals, we have established standards, certification criteria, and functionality that would have to be included in products for them to be certified. And we are developing a certification program to certify those EHR products. My question to you all is whether and how we may be able to leverage a certification program for EHR products to drive some of the improvements you‘re talking about and to make iterative improvements that address some of the safety concerns or issues that have been coming up.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Some of us view the committees that have created the certification as regulatory capture, the economics term, or, in English, the foxes being the architects of the hen house.


Paul Egerman – eScription – CEO
I assume you‘re talking about CCHIT and not this group.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Right. That‘s exactly.


Paul Egerman – eScription – CEO
I feel better now.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
That is, Mr. Cochair, exactly what I‘m referring to. Remember the procrustean bed, you know? Well, we see the CCHIT standards as fitting what the vendors‘ lowest common denominator can be and then calling that perfect. I would argue for meaningful standards for vendors that respond to what this committee is now clearly focused on, not to what the vendors can pop out of the box and sell quickly.


David Classen – University of Utah – Associate Professor
I would add, from our work with the EHR flight simulator, looking at these certified systems after they‘re implemented: it‘s almost unrecognizable. So the idea that just because we certify them means this is the way they‘ll look when they‘re actually implemented is not reality. So I would argue that certification is important and necessary, but not sufficient. There has to be testing of these systems, as they‘re implemented, because I‘ve seen wonderfully certified systems implemented very poorly, and that gets back to Marc‘s comment about the implementation.

Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
Agreed.


Jodi Daniel – ONC – Director Office of Policy & Research
Can I just follow up on that: would having a certification body conduct after-the-fact surveillance address that question or not?


David Classen – University of Utah – Associate Professor
No, I don‘t think it will address that. I‘ve seen those proposals. They do limited testing. That doesn‘t work. It really needs to be more extensive testing within the organization, so I don‘t think certification can address that. My view is, you need to have certification beforehand and then some other evaluation and testing systems after implementation.


Ross Koppel – University of Pennsylvania – Adjunct Professor of Sociology
You know, these systems are the intellectual soul of the hospital, and to say that you can check one out beforehand and then not examine it in use is simply silly. It doesn‘t work. These things are modified on a daily basis by people, or are in constant update. Each update is a potential catastrophe or benefit.


Alan Morris – University of Utah – Professor
Right, and then one of the crucial issues in iterative refinement is capturing the performance of the product in the context in which it‘s intended to be applied. How do you capture that performance when you have a data set that is acknowledged to be of low quality? Clinical data sets are not of high quality. They‘re of low quality. And the data are acquired by clinicians making decisions in ways that are unknown to you. So you see there are some fundamental missing links in trying to evaluate the performance so that you could feed that back to a body that generally doesn‘t exist to try and examine how to modify the system in order to improve the performance.


David Classen – University of Utah – Associate Professor
These are highly dynamic systems that that approach will never work with. The case in point is that, with the flight simulator, there have been a number of organizations with well established vendor software. It goes through an upgrade and, unfortunately, what happens is we show up … significantly problematic … throughout the system because the upgrade turned things off, and no one knew about it.


Paul Egerman – eScription – CEO
Thank you very much. Unfortunately, we are out of time. I hope you got your question answered, and I ran out of time to ask the people on the phone if they had any questions, but I just wanted to say thank you very much to the panelists. I hope you have….


M

….


Paul Egerman – eScription – CEO
Pardon me?


M

….


Paul Egerman – eScription – CEO
Just terrific information. I hope you have an opportunity to stay for the rest of our presentation, and to chat with us a little bit at lunchtime, so thank you very much. The next panel is the stakeholders panel with my colleague, Joe Heyman will be moderating. If we could ask the people who are members of that panel to step forward, appreciate it.


Joseph Heyman – AMA – Board Chairman
While everybody is sitting down, let me tell you that I‘m Joe Heyman. I‘m a practicing gynecologist in solo practice with an EMR since 2001. I‘m also the immediate past chair of the American Medical Association‘s board of trustees. And we have a very interesting panel.

Now we‘re going to try to address these questions. What are the experiences with EHR associated risks? These are from the stakeholders, the vendors. How have you identified those risks? What steps were taken to prevent harm and mitigate those risks? What approaches were recommended to prevent or mitigate harm? And what are the benefits and risks of making these risks and adverse events public?


We have quite a few panelists, so I‘m going to be pretty stern about the five minutes, so I‘m warning you guys ahead of time. I don‘t mean to be rude, but we‘ll never get done in time if we don‘t do this. And I‘m going to just give very, very brief introductions. Dave deBronkart is cochair of the Society of Participatory Medicine and represents ePatientDave. Justin Starren from the Marshfield Clinic is a physician researcher. Jeanie Scott from the Veterans Health Administration is director of its IT safety office. Michael Stearns is CEO of e-MDs, and will be representing HIMSS. Shelley Looby is from Cerner, and she‘s standing in for Gay Johannes, and I‘m not sure what your titles are, so when you do it, when you do yours, you can introduce yourself. And then Carl Dvorak is the executive vice president of Epic. With that, why don‘t we start with Dave deBronkart?


Dave deBronkart – Society of Participatory Medicine – Cochair
First, let me say thank you. It‘s an honor to meet some of you whose names I‘ve known. It‘s good to see some of you that I‘ve seen before. To some extent, although I don‘t have a Chihuahua in my pocket, I feel kind of like Reese Witherspoon coming out of nowhere. I‘m not quite legally blonde, but three years ago, I went in for a shoulder x-ray, found out that the shoulder was going to be fine, but there was a spot in my lung, which turned out to be stage four, grade four kidney cancer. Long story short, my median survival time was 24 weeks, and I got focused really quickly on what works and doesn‘t work. And I went into this with the perspective of somebody who has seen 30+ years of automation in different industries.


I had a great healthcare experience. I had few, if any, complaints, but I had faced, literally faced, the grave. When I realized I‘d survived, I truly asked myself, what am I going to do with this free replay? And I started a blog called "The New Life of Patient Dave," and then my doctor, Danny Sands, said, "Hey, I belong to this study group called epatients.net. We‘re going on a retreat, do you want to come join us?"

And I read their white paper, and my mind blew open because it was this documentation of what empowered patients have been doing using the Internet. In one regard, medicine is about access to information, and the Internet gives patients access to information and to each other in ways that were not possible 10, 15 years ago.


You could say that this changes everything. Now, it doesn‘t literally change everything, but the mistake that I often see when I attend policy meetings and industry meetings is people thinking, as they think about what we can do about this gigantic challenge, only in terms of what‘s in the industry: people who work for policy groups doing good work, people who work for vendors.


Mind you, I‘m a vendor. I come from the vendor community. Thirty years ago, I worked for companies that made typesetting systems. We automated the newspaper industry. That had some things in common with healthcare, though certainly not the same, lives weren‘t at stake, but there was no such thing in that industry as saying, well, you know, the system didn‘t work today; we‘ll do it tomorrow. And I sat in some really heart-to-heart meetings with people saying, look, I don‘t care what the problem is. Tell us what the problem is, so we can work around it. This paper will not fail to be published tomorrow.


Now, Dr. Blumenthal, you mentioned, is there a market for improvement? Let me tell you one stakeholder that is really interested in improvement, and that is patients who are in the process of dying, or whose mother is in the hospital bed, might be dying, and so on. Charles Safran of Beth Israel Deaconess, testifying before the House Ways and Means Committee a few years ago, said America‘s most underutilized resource is the motivated patient.


As my testimony shows, systems have bugs. And I think it‘s important for everybody, even lawyers, to get over it about whether that‘s a tolerable situation. What I want us to look at is how can we work together to improve things.


Now, you know, it‘s funny because when typesetting turned into desktop publishing, there was the same issue. QuarkXPress took over the newspaper industry. It was really hard to get a straight answer out of Quark about what their bugs were, so people banded together on the Internet, or on CompuServe back in those days, and developed their own bugs list. And, you know, it makes such a difference if you‘re trying to make a certain function work right, and it just doesn‘t work, if you can just look something up and say, oh, that‘s a bug. I won‘t try to do it that way.

The question that really calls to me is, what can we do to manage thoughtfully and not be primarily interested in figuring out who‘s guilty and when‘s the hanging? What I really urge in my testimony is, let‘s get over the idea of perfection. Let‘s give patients visibility. I‘d like to draw attention to Regina Holliday back here, the cancer widow who, one year ago, found errors and omissions in her husband‘s medical record. It would have done no good to do lawsuits over that, but, even though she‘s not even college educated, as she herself said at a meeting a couple of months ago in Washington, she was able to identify areas for improvement. Let‘s harness this resource.

I have something I sometimes say: whatever we do, let‘s not stand in the way of desperate people‘s attempts to save themselves. And maybe even, let‘s take one percent of the stimulus money and set it aside to foster the development of patient communities, because that‘s an audience that doesn‘t need to be encouraged to make things better. Thank you.


Joseph Heyman – AMA – Board Chairman
Thanks very much. Justin Starren from the Marshfield Clinic?


Justin Starren – Marshfield Clinic – Director BIRC
Thank you. We originally heard it was five to seven, so I‘ll get through as much as I can in five.

Joseph Heyman – AMA – Board Chairman
You‘ll do it in five.


Justin Starren – Marshfield Clinic – Director BIRC
The physicians and staff at Marshfield Clinic thank you for conducting this hearing. I‘m Justin Starren. I‘m the director of the biomedical informatics research center. It‘s important to know the chartless electronic workflow of Marshfield Clinic relies primarily on software that was developed by Marshfield Clinic over the last 40 years. Marshfield Clinic also licenses its EHR to other healthcare providers. Thus, we are both a provider and a vendor.


The term EHR associated patient safety risk carries with it an implication that EHRs create, cause, or worsen such risks. The Marshfield Clinic has invested in the development of its own EHR because we believe that a well-implemented EHR dramatically reduces patient safety risks over what can be achieved with paper.


Let me share a personal example. I take the blood thinner Coumadin. Recently, I also needed to take an antibiotic. By the time I got home with my prescription, there was a message waiting for me saying that given my overall health picture, I should get an extra blood draw to see if there was an interaction between the two. I did. There was. My dose was adjusted. That kind of quality of healthcare cannot be achieved without an EHR.


It‘s hard to measure things that don‘t happen. We estimate that, by using the tools in our EHR to improve blood pressure control, we will prevent between 70 and 400 strokes over the next five years. That said, we realize that EHRs are not perfect. They are complex pieces of software embedded in complex socio-technical systems. The fundamental premise of complex systems analysis is that most failures are not due to failure of a single component, but, rather, due to interactions between human and computer, between computer and computer, or between human and human.


In reviewing this issue with our senior staff, we identified three events over the past five years. All three demonstrate this principle of complex interaction. One involved a computer interface between a lab analyzer and our EHR where the interfaces were mismatched and results got garbled. The second involved a drug/drug interaction system where the alerting was set at such a threshold that our doctors were buried in alerts and were suffering alert fatigue. In neither case was harm identified. The third involves an affiliated hospital where a confusing dose selection menu resulted in a physician selecting an incorrect dose for a patient.


How do we identify risks? The vast majority of information on issues comes through our helpdesk system. Our helpdesk staff are trained to alert and elevate any safety issue to the director of software development and the director of software quality. They conduct a rapid assessment. And if there is a plausible risk to patient safety, whether or not one has occurred, that software can be rolled back out of production literally in minutes.


What steps have we taken? The first is to involve clinical users throughout the design and development of systems. The second is don‘t rush. Software modules are revised and redesigned many times before they go into pilot, and many times more before they go into production. Third, keep listening.

What do we recommend? Studies of IT disasters frequently cite the presence of unrealistic goals or deadlines as contributing causes. Our experience is that pushing an implementation forward when the software or the human organization is not ready almost always causes problems. We are concerned that the current goals for meaningful use present a serious risk of pushing too many providers too far too fast.


In spite of the fact that we already have a chartless electronic workflow, the 80% goal for CPOE will be very difficult for us to reach. We find that outpatient order entry is harder than inpatient because of the increased diversity. Our clinicians also work in clinician-MA teams. Forcing the clinician to be the only one who is entering orders breaks the workflow of that team and takes time from the patient/clinician interaction.


As I mentioned, the majority of problems are due to complex interactions. Teasing apart those interactions takes data, often lots of it, not only on actual events, but also on near misses. The aviation safety reporting system is often cited as a model for near miss reporting. That system has three important factors: third party reporting, confidentiality and, most important, limited liability protection.


Dr. Blumenthal called for the creation of a learning community. That can only happen if we have free exchange of both the good and the bad. Any reporting system that takes a punitive approach will stifle exchange and will ultimately lead to self-protective behavior and inferior systems.


Joseph Heyman – AMA – Board Chairman
Thank you very, very much. Next is Jeanie Scott.


Jeanie Scott – VHA – IT Patient Safety Office Director
I'd like to thank you for this invite. I've actually been waiting for this invite since Dr. Blumenthal's predecessor, Dr. Kolodner, was my first supervisor when I took on this job in information technology patient safety. So I've been waiting for the invite. Thank you. And that was my Blackberry that did go off. I actually turned it to vibrate this morning. It's a new system. I don't know where it is, and I can't find the vibrate, so I have some of the same experiences that our physicians get with the systems that they have. It is off now.


VA uses a complex system called VistA, which is composed of over 100 software components supporting VA's healthcare system. Just to give you a sense of the magnitude of the work we do: we have over 2 billion patient orders, have documented the administration of more than 1.1 billion medications, and have provided access to more than one million images. We're catching up to McDonalds.

VA recognizes the need to have a comprehensive information technology patient safety program integrated with the overall patient safety culture. Our patient safety program emphasizes close call reporting. We have multiple avenues for reporting IT safety concerns, the majority of which are recorded through our local and national IT helpdesk systems. Reporting methods include the direct incorporation of data entry fields for patient safety flagging into our IT helpdesk tickets, as well as a Web portal for reporting IT patient safety concerns.


I'd like to present an example of how reporting, notification, and remediation are key. In a widely publicized October 2008 event, VA experienced a software defect with the release of our computerized patient record system, version 27. A clinician identified the defect: when the clinician accessed a different patient's electronic medical record, the prior patient's data would still display. The defect was thought to be rare. We had one physician at one medical center report it during testing.


In several attempts to replicate it prior to release, we were unable to find the cause. Shortly after full release, within three days, reporting via the IT patient safety channel identified an increased occurrence rate among at least four additional medical centers. We have 153 medical centers and hundreds of community-based clinics and nursing homes, so it only required four more….

We issued a patient safety advisory describing the issue and encouraged immediate reporting to the national software team. Our software team had no idea what this problem was. Within three days of the notice, VA had received over 20 additional reports. With the assistance of the IT patient safety staff, the software development team was finally able to isolate the particular sequence of events to replicate the issue, and a remedy was issued within a matter of weeks of the notice. This wide dissemination of the safety advisory was indeed a benefit to timely resolution of the defect.


Some other issues that we continue to find: We find issues with software designed for speed and minimum user keystrokes. Recognizing that designing for default behavior can lead to enhanced adoption, first and foremost systems must emphasize the criticality of accuracy in the associated user task. User-centered design indeed provides the basis for ease of use, but an organization must be cognizant of human-centered design limitations. Critical tasks must be balanced, not only for speed and adoption of the IT tool, but for accuracy of the tasks ultimately completed.


Another example is mental models and the various information systems associated with information derived from shared data elements. Many healthcare entities depend upon the same basic data elements but require different information displays. This is critical for the transfer of information from nurse to provider to pharmacist.


A safety issue may be initially identified in a single information system at the endpoint of the error. However, one must trace back to the origin and determine whether multiple data points contributed to the chance of that error. Did multiple systems contribute to the chance of that error? Did one feed into the other?


Users are subjected to variability in displays, often for the same information. Certain design guidelines may indeed be investigated to standardize the geographic layout of certain key information. And, finally, the occurrence of popup fatigue is prevalent. This was discussed by previous speakers.


These examples describe just a subset of the myriad issues reported with the use of health IT. Both technical and design issues will arise from the use of health IT. Organizations can also expect a direct impact to healthcare delivery when access to the EHR is not available.


I must emphasize incorporating safety into health IT. Safety programs should encourage reporting of close calls. Healthcare organizations must be able to openly report safety concerns.


Usability must be goal oriented to ensure task completion. And, finally, when issues are reported directly to the vendor by an individual facility, other facilities in that healthcare organization may not be aware of those vulnerabilities. This is a challenge that VA has identified. One proposed avenue to explore is to require regular reports to the customer or to provide the customer with open access to every report. I thank you for this time.


Joseph Heyman – AMA – Board Chairman
Thank you. Michael Stearns?


Michael Stearns – e-MDs – President & CEO
Thank you very much. I want to thank the ONC, the HIT Policy Committee, and this workgroup for the opportunity to present information on HIT in the context of patient safety. I'm the president and CEO of e-MDs, Incorporated, an ambulatory EHR vendor. I'm participating on the panel as a representative of the HIMSS Electronic Health Record Association. I will also, however, share the experiences of my company.


There's a great deal of evidence that EHRs have improved patient care and reduced medical errors; in particular, countries with high levels of EHR adoption have demonstrated these benefits. However, less attention has been focused on potential patient safety risks that could be introduced by the use of EHRs. EHR software provider organizations have a long history of addressing patient safety issues, and we have drawn from our collective experiences to provide you with feedback and recommendations for moving forward.


At the outset, I would like to emphasize that the EHR Association's member companies are committed to the highest levels of product quality and patient safety. We take very seriously the concerns that recently have been raised by policymakers, the FDA, and others in this area that have in part … this hearing. We are committed to working cooperatively with key stakeholders to address and resolve questions and issues associated with the safe use of EHRs and other healthcare IT applications. We have initiated a workgroup and process to develop such recommendations, and we will be expanding upon this work as we consider such inputs as the discussion at this HIT Policy Committee hearing being held this week.


In my written testimony, I provide an overview of some published reports on patient safety issues related to HIT. Trained as an academic physician, the first place I turn is the literature. As with the first panel, I'd like to state that I totally agree with the comments that were made earlier. At a high level, the issues related to computerized physician order entry, results management, communication, documentation, information display, and data integrity emerged when I did the literature review.


Dramatic reductions in prescribing and other errors with the introduction of a CPOE system have been reported. In general, the sorts of problems, when they're identified, and this is echoing the panel members earlier, are multifactorial, related to poor implementation, inadequate training, lack of attention to workflow and, in some cases, factors related to human/machine interactions. The management of results from ordered tests is another area that has had less focus, and most of the information we have right now is on inpatient CPOE systems.


However, the information that has emerged from the ambulatory environment has not been quite as thorough. There was one article I did cite in my written testimony, however, that looked at the workflow involved. Results management failures have been described in several articles in the outpatient setting. But once a system was implemented, new problems emerged. There were tremendous benefits from going forward with implementation, but close attention then had to be focused on the workflow.


I also touched on the experience that software providers have had with documentation in EHRs, which has been a big challenge for a lot of physicians, including mechanisms that allow providers … to express and capture nuances of clinical care, which is a very important safety consideration. The … of information in EHRs from prior visits is another area of focus, as it draws attention to findings on prior visits that must be used with caution to avoid documentation errors.


Each reported patient safety issue related to HIT needs to be evaluated in detail … local factors, such as IT configuration, preparation, training, workflow design, etc. are a source of the problem. As for experience with EHR-associated patient safety risks, I can speak for my own company, where I'm employed. We're an EHR vendor, as I mentioned. It has, fortunately, been extremely rare to see errors that could create patient safety risks that are not intercepted and remedied through our design, quality control, and beta testing process, which is obviously very important. Our staff members undergo retraining at all times, and have been informed that they need to elevate any patient safety risk to the highest order.


We heard mention earlier of a vendor dungeon, and I‘d like to say that that‘s where the employees would go who don‘t report patient safety issues. Just kidding.


The EHR Association member companies are committed to the highest levels of product quality and patient safety. We take very seriously the concerns that have been raised by policymakers and others in this area. We are committed to working cooperatively with key stakeholders to address and resolve questions and issues associated with the safe use of EHRs and other healthcare IT.


We have already initiated work within the association to begin to address these issues across a broad set of vendors who serve this market. We recognize that our industry requires a comprehensive set of principles that will help ensure patient safety and manage risks after our solutions go to market. Our members have long had quality management processes in place, and we will be sharing these best practices within our membership to ensure that we are aligned on a strong baseline process.


As an association, we plan to use the knowledge gained from the HIT Policy Committee meeting this week to continue forward and finalize our recommended principles. Thank you very much for your time.


Joseph Heyman – AMA – Board Chairman
Thank you very much. And now, Carl Dvorak.


Carl Dvorak – Epic Systems – EVP
Thank you. I will likely skip around a little bit in an effort to focus on elements of utility beyond what's already been testified to, so I won't read it, but bear with me. I work with Epic Systems. I come from a background of computer science and software development, and have been a developer on our EHR application since the early '90s.

At Epic, we're a little bit different than most. We don't survive by a large marketing effort. We basically depend on word of mouth and have grown in clusters in communities around the country. While we don't directly diagnose or treat patients, we do take very seriously our responsibility to provide information accurately and in a timely manner to those who do take care of patients.


One thing I wanted to point out, I think it is relevant, and that is that computers are not new to healthcare. For 30 years, people have been using computers to manage orders, to enter orders in the pharmacy, in the lab, reporting results out. This is not a new phenomenon, and it‘s something that you could go back through time and study to understand the mechanics of how orders are managed through computer systems.


What is new, however, is that HITECH is now requiring physicians to directly enter the orders themselves. That is the big change that we're talking about here. And I think the effort there is to remove legibility concerns and to remove the ambiguity around verbal orders and other order issues that arise when things are not clear.


In terms of the areas of risk that we were asked to testify to, I think there are five areas that I'll highlight today. I think the first one is the one that people talk about most commonly, and that is the notion that software should function as it's intended to. It should be as close to bug free as you can get it. As ePatientDave remarked, you don't ever really get bug free, but there are industry best practices that include design, domain-specific design review, programming, programming peer review, quality assurance, unit and system testing, and regression testing.

For the most part, our observation is that the real risks related to defects in software code are actually the area that gets the most focus and attention. In the industry, best practices do help ensure that EHR products come off the line with good support for basic use cases, and you have to be careful as you introduce contextually sensitive elements or complicated interfaces, or if you use the product in exotic ways that were not anticipated through the original testing. Those do present risk categories that have to be thought through.


The second element, I think, is worth a little bit of mention, and some of the literature points to this: the lack of availability of health records. When the system goes down, you have to have strong, pre-thought-out backup methods to manage the patients whose lives might depend on the information that is now behind a dark screen. That is something I won't spend time on this morning, but it is an essential element of a safe EHR environment.

The third category is user-friendly design. That gets a lot of popular press these days. In terms of science, there are real benefits in the science of user interface design and human factors engineering. We take advantage of those. It helps us tremendously to produce something that, out of the gate, is pretty darn good. It's not perfect, however, and what you will find is that ten different people will often take ten different interpretations of what one might believe is solid science. So there is still an element of subjectivity in design.


I also want to stress the importance of imagination in art and design. If we had tried to make cell phones consistent, as I commented, if we had done that in 2002, we might never have seen the iPhone. I think there is an imagination and an art to doing this, and it's been interesting to watch over the last 20 years as EHRs, not just ours, but everyone's, emerge with new and more innovative and better ways to deal with things.


We believe there‘s no substitute for software developers working shoulder-to-shoulder with clinicians. One of our philosophies is to get programmers out at sites designing, developing, and then supporting go lives to make sure that they understand how these tools are being used and the importance of getting things right the first time. All that said, I would strongly advocate against regulation of design by a third party. Our observation is that you would get one person‘s opinion or one small group‘s opinion, and you would limit innovation and progress in the EHR world.


The fourth element is configuration and technical implementation of an EHR; that's been discussed quite extensively this morning. That is an incredibly important area. You will find that the EHR, in and of itself, is a small percentage of the risk involved in a full EHR environment. There are many complicated system connections that need to be thought through. Paradigms need to be mapped. Vocabularies need to be mapped, and they all need to be tested carefully. Without that, you will see poor outcomes; with great environmental work and great environmental testing, you'll see much better outcomes.


The fifth and final safety-related area that I want to talk about for a moment is training. It's often overlooked, and it's often undervalued. I think if the committee could recommend one thing, it would be that to become eligible for stimulus, you must at least attend a day's worth of training on your EHR program, and you should take at least a 30-minute proficiency test. That would be a concrete step in the right direction. Although it takes a decade to get a medical education, people often avoid that one day's worth of training and wing it at go live.


The second thing related to training is something that we've coined a term around, chart etiquette. I think one of our sites did. The notion is that when 100, 200, or 1,000 doctors all depend on the same medical chart, you have to make sure everyone knows how to use that chart and puts the right thing in the right place. You can't have people putting orders in notes, hoping the nurse will find them, refusing to use the problem list, and putting problems in a comment box because they didn't feel like trying to match it from that list.


I think we have another area of concern around content. Content development in many cases lags the actual capabilities of the computer systems, and it's been good to witness continual evolution toward more and more friendly content. A good example of that is working with the pharmaceutical file vendors to get less product-based content and more therapeutic-based content.


Joseph Heyman – AMA – Board Chairman
Dr. Dvorak, maybe you can work some of the rest of this into your answers to some of the questions.


Carl Dvorak – Epic Systems – EVP
May I have one final comment?


Joseph Heyman – AMA – Board Chairman
Go ahead.


Carl Dvorak – Epic Systems – EVP
One of the things that we take very seriously is the responsibility that any time somebody reports a safety concern to us, we, as quickly as possible, propagate that out to every user of the system and help them understand how it might affect them. No matter how small, that is a responsibility I think every EHR vendor should have, and it's one that we support strongly.


Joseph Heyman – AMA – Board Chairman
Very good. Shelley Looby?


Shelley Looby – Cerner – Director of Regulatory Affairs Quality Assurance
Good morning. My name is Shelley Looby. I‘m director of regulatory affairs quality assurance at Cerner Corporation, headquartered in Kansas City, Missouri. I‘m here today instead of Ms. Gay Johannes, who is Cerner‘s chief quality officer, as there was a death in her family. I will be delivering her comments today.


Cerner and I would like to thank the HIT Policy Committee for the opportunity to testify on patient safety issues related to the use of electronic health records. Cerner is transforming healthcare by reducing error and variance for providers and consumers around the world. Cerner's solutions optimize processes for healthcare organizations ranging in size from single-doctor practices to health systems to entire countries, for the pharmaceutical and medical device industries, and for the healthcare commerce system.


Cerner began with the development of an information system that optimized processes in the hospital clinical laboratory. Since its founding in 1979, Cerner has expanded the application of health information systems across the healthcare delivery continuum. Today, Cerner's HIT solutions assist clinicians in many areas of care, including surgery, pharmacy, women's health, intensive care, PACS, and blood banks. The solutions are licensed by more than 8,500 facilities around the world, including approximately 2,300 hospitals; 3,400 physician practices covering more than 30,000 physicians; 600 ambulatory facilities such as laboratories, ambulatory clinics, cardiac facilities, radiology clinics, and surgery centers; 700 home health facilities; and 1,500 retail pharmacies. As such, Cerner provides HIT solutions to nearly one-third of the domestic healthcare market.


While HIT is transforming the process of healthcare, it has not replaced the skilled clinicians who deliver care to the American public. HIT systems collect, record, and manage information that is relevant to the diagnosis and treatment of disease. HIT makes that information readily available to the clinician in a form that aids clinical decision-making. Ultimately, however, the clinician diagnoses and treats patients based on his or her assessment of the available information in accordance with the medical standard of care. The HIT industry neither intends nor desires to practice medicine. We simply create tools that help clinicians make better care decisions for patients.


In the early stages of HIT system development, Cerner and other HIT companies designed solutions that collected and stored clinically significant data. These systems made information available upon demand, but they did not provide recommendations on medical assessment or proactively push alerts or decision support to the caregiver. Largely speaking, these systems automated paper processes. As HIT systems expand and become smart technology, a new level of opportunity for inadvertent risk and complexity may be introduced into the care process. As the complexity and attendant risk of these solutions grow, enhanced safety measures may be warranted.


Cerner is uniquely positioned to comment on the safety issues facing the HIT industry and the efforts by government to date to address these issues. The U.S. Food and Drug Administration actively regulates some Cerner solutions as medical devices. As a result, these solutions are subject to the regulations imposed by the FDA under the Food, Drug, and Cosmetic Act. These regulations govern the design and development of medical devices, the premarket clearance of such devices, and the post-market surveillance concerning the safety of such devices.


Other Cerner solutions are not actively regulated by FDA and are not subject to these regulatory obligations. Over the last decade, however, Cerner has voluntarily implemented post-market surveillance processes in accordance with FDA standards and has publicly disclosed issues that might impact public safety by reporting through FDA's MedWatch program. These reports are part of the public record.


Similarly, Cerner voluntarily complies with FDA's good manufacturing practice regulations in the design and development of its unregulated solutions. In addition, Cerner has adopted quality system regulations for its development processes across all Cerner solutions to meet the more stringent FDA requirements for regulated medical devices. Cerner is one of the few HIT suppliers in the United States that voluntarily reports safety-related incidents for non-FDA-regulated solutions to the MedWatch program. Cerner is not required to do this. We believe it is the right thing to do. Such disclosures provide much-needed transparency into the successes and challenges of these systems.


This transparency is especially important at a time when the federal government and the American public are heavily investing in HIT. Cerner believes there will be increasing HIT regulation, both here and overseas. The new regulation is not exclusively tied to safety concerns, but rather to the expectation that government should be involved in making healthcare better and more transparent. Cerner's participation in MedWatch is just one example of how the company is participating proactively rather than reactively in healthcare reform.


Joseph Heyman – AMA – Board Chairman
Thank you very much. We‘re going to have a question and answer session, and let me just make a suggestion because there are so many panelists. If somebody on the panel says something that you agree with, it isn‘t necessary for you to publicly agree with them. But if there‘s something that you disagree with or you want to add some new fact to it, that would be very helpful. And let me just start off with two very brief questions, and you don‘t need to all answer them.

The first one is that the final question was: what are the benefits and risks of making these adverse events and risks public? I didn't hear a lot of discussion of that, so I thought maybe somebody should address that. And then the second question, just to get it out of the way: Dr. Morris earlier discussed this idea of generalizability. I'm not sure that I agree with him completely, because I think physician innovation is also very important, but I would like to know what you think of the ability of different vendors to actually accomplish the idea that physicians presented with the same evidence on the same patient would come up with the same decision from different software programs. Those are my two questions, and why don't we start with the first one, which is the public availability of knowledge about actual risks or bad outcomes. Yes?


Dave deBronkart – Society of Participatory Medicine – Cochair
As my testimony said, I just can't imagine any other worthy focus than an effort to make healthcare better. And while we work on things that will take 10, 20, or 30 years to produce in the way of better systems and workflows, one thing we can do right now for the people who are in hospitals now, or will be next month or next year, is make as visible as possible what we know about things that aren't working right so that everybody can work together to try to produce improvement. Having been on the vendor side myself, I can't see any argument against doing that except that a lot of people might get upset, angry, whatever.


And I will do my best personally. I mean, I‘m so committed to this, I‘ve quit my day job. I am in this full time now. I will do anything I can to help the public get over the idea that people in the healthcare industry are supposed to be perfect, so that we can start working together for improvement.


Joseph Heyman – AMA – Board Chairman
Thanks. Dr. Dvorak?


Carl Dvorak – Epic Systems – EVP
I‘m not actually a doctor.


Joseph Heyman – AMA – Board Chairman
Well, you look like one.


Carl Dvorak – Epic Systems – EVP
All the years of working with them. In answer to your first question, whether public reporting is good: in and of itself, it's a hard question to answer. If done properly and done well, I think it's a strong force for the positive. If done poorly, it could be a strong influence to the negative. So I think, in general, the organizations that have focused on consolidated learnings, like the Institute for Safe Medication Practices, have been very, very valuable resources in our industry for learning from the collective experience of others and funneling that into preventive strategies to try to avoid problems in the first place.


If done properly, I think it could be a strong positive. Improper use of it might basically put a software vendor's entire program into the public domain by accident. I think there needs to be some manner to it so that you could abstract out the issue, abstract out the event, summarize it properly, and share it as a best-practice opportunity or a common failure mode that others could learn from, without the software itself, which is heavily invested in to create an innovative edge, becoming part of the public domain.


Joseph Heyman – AMA – Board Chairman
Anybody else?


Jeanie Scott – VHA – IT Patient Safety Office Director
I‘d like to comment on that from the Department of Veterans Affairs, specifically with the example that I explained earlier during my testimony, the benefits and risks of public safety reporting. That particular notice did go out on a public facing Web site. It received congressional attention afterward. Would I make that same decision again to notify our clinicians? Yes, I would.


It allowed our clinicians to know what was wrong, and they could avoid it, number one. And number two, it allowed our software developers to quickly fix it. So there are definite benefits. As for the risk of a communication of a risk going out to the public, we were willing to take that so that we could benefit our patients.


Joseph Heyman – AMA – Board Chairman
Okay. Yes, go ahead.


Justin Starren – Marshfield Clinic – Director BIRC
With respect to the issue of standardization, if you were to give me infinite money and infinite staff, and I realize this doesn't have an appropriations component to it, I still could not build you the one perfect EHR. One thing we learned, as we moved our EHR out into the market, is that clinicians who have been using an EHR for ten years look for very different features than those who are just moving from paper. So rather than assuming that there is a single solution, I think the committee would be much better served tackling a simpler problem, which is creating a uniform governance model for all academic medical centers.


Michael Stearns – e-MDs – President & CEO
I'd like to comment. One thing that hasn't been mentioned earlier is the driver that pressures vendors to be responsive, and that is competitive influences. So a public reporting system (and here I'm speaking more on behalf of e-MDs, not the vendor association), a public viewing of information, would probably be a significant driver as well. However, there are a lot of concerns about how that information is displayed. The aviation industry has been very successful with making reporting more anonymous, such that the information is carefully reviewed before it's released, before it becomes public knowledge. There are certain mechanisms in place to protect the public safety. Thank you.


Joseph Heyman – AMA – Board Chairman
Okay. All right. Paul?


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Thanks to the panel for the multiple perspectives you offered and a lot of the good advice. Our first panel, as you know, spent a lot of time emphasizing the importance of iteration, because it just has to get better, and public disclosure is part of that. But obviously we need to apply the resources and the commitment to make that happen in a real sense. Ross also pointed out that since these are usually multimillion-dollar acquisitions, you essentially have a locked-in clientele. So we're at your mercy in terms of the software, at least for systems at the base level.


One way to understand the commitment of a particular company is to follow the money or the accountability. My question for you is: as in a healthcare organization, where the chief medical officer is responsible and accountable for the safety and the quality of the products they deliver, is there a single clinician, primarily a doctor or a nurse, who is accountable in your organization for the safety of your products?


Does that person wake up every day wondering whether your customers and the patients they serve are being put at undue risk, and certainly doing the activities needed to make sure they are not? I'm not talking about a chief process follower officer, but really a person who worries about the products and the way to quickly and iteratively make them better and better over time. And, in addition, to whom does that person report? That's to address the fox and the henhouse problem.


I‘m particularly, I guess, interested in the commercial side because you touch so many more people, but if we have the time, I‘d love to hear from the folks who self-developed their software.


Shelley Looby – Cerner – Director of Regulatory Affairs Quality Assurance
Shelley Looby, Cerner Corporation. Dr. Tang, we don't have one specific person. We have clinicians designing and developing systems for clinicians. We have multiple physicians, pharmacists, nurses, and medical technologists on staff. I myself am a medical technologist. I came from the blood-banking world, so while I'm in regulatory affairs, I've worked in a heavily regulated industry throughout my professional career.


I have to say that we, as clinicians, worry about that daily. It is utmost in our mind that patient safety, as we are all patients, is our primary concern. We ultimately end up reporting up to our chief executive officer, but we do so through varied areas. I report up through Ms. Johannes, who is chief quality officer. And the physicians will report up in various ways depending on where they are located within our organization and what solutions and services they are associated with. The overall answer is no, but the overall answer is also yes because we have multiple physicians, clinicians, etc. working on and worrying about the safety and efficacy of our solutions and their use within the public health.


Jeanie Scott – VHA – IT Patient Safety Office Director
I'd like to also respond to your comment. I also am a medical technologist, and I consider myself part of the clinical team. I am the director of information technology patient safety. I'm not the single person; it's a team of people. When we were developing this position and this office, we struggled with those questions in VA, at that time VHA. VHA is both a consumer of the electronic health record and a producer of it. We are homegrown, probably more city-grown, given the size of us. We also use vendor products.


One of the key things that we determined was that it didn't really matter where this fell. We discussed whether it needed to fall within our software development. Did it need to fall within our national center for patient safety? Right now it resides in the Veterans Health Administration, in our office of health information. What is key is that we communicate among our patient safety people, our office of health information, which does our requirements and budgeting and other parts of our health information management …, and our IT …. A key thing is having that open communication, so that our software developers and our customer product support understand that it's not always a technology issue. It is a clinical issue, and they understand how to address it.


Joseph Heyman – AMA – Board Chairman
Okay. Yes, Paul? You‘re going to go through every one of them? I didn‘t see any other hands. I‘m sorry.


Carl Dvorak – Epic Systems – EVP
I was going to add a little hint. I was going to answer.


Joseph Heyman – AMA – Board Chairman
Okay.


Carl Dvorak – Epic Systems – EVP
In our organization, that person is me. I work with a chief quality officer and a chief process officer for patient safety. We meet 90 minutes a week on Friday afternoons and look at what the current trends are, what opportunities do we have to improve on it, and what can we further drive back into the original quality assurance to make sure that items that could create concern don‘t get out the door. We use a physician review committee to look at issues, to ascertain patient safety risk and impact. But that‘s our structure.


Joseph Heyman – AMA – Board Chairman
Go ahead.


Michael Stearns – e-MDs – President & CEO
I have a very brief comment. Patient safety is our greatest fear: that we're going to wake up one morning and find out that something in our software led to a patient injury. Fortunately, we have not experienced that, but it's something we really are very concerned about, so it's really our number one priority. Thank you.


Joseph Heyman – AMA – Board Chairman
Go ahead.


Paul Egerman – eScription – CEO
This is Paul Egerman. Actually, I appreciate your statement, Dr. Stearns, because it leads into my question. You say that, sort of as the EHR Association, patient safety is your number one priority. And we heard from the previous panel about the importance of disclosure, of publishing the problems. Jeanie from VHA talked about a very effective system. Dave, ePatientDave, I guess that's your name now, also spoke about the value of that.


When I read through what the EHR Association's commitments are, and maybe I missed it, I didn't see a commitment to disclosure. And so my question is: what is the EHR Association doing to get all of its members to voluntarily disclose any patient safety defects that may exist in their software?


Michael Stearns – e-MDs – President & CEO
That's a very fair question. We had about a week to prepare for this session; I was asked about a week ago to get information together. We had discussed many mechanisms internally, but we have not yet come to a consensus. So I'm able to speak from the standpoint of e-MDs, but I'm not able to speak from the standpoint of the EHRA as a group.


As you've heard, there are slightly different opinions from two of the other panel members here as to how we should proceed, so all I can really say comfortably is that it is something that we are very much engaged in right now. It's a very high priority within the organization, and we're going to move forward as quickly as possible.


Carl Dvorak – Epic Systems – EVP
One thing I can add to that: I also work on the committee that's been driving a unified EHR patient safety process for the vendor association, and one thing we did observe in interviews with vendors is that they all had a very strong process, or at least the ones we talked to did, of notifying customers of safety risks they found internally or were notified of by other customers. So there was a pretty good consensus around reporting internal findings back out to customers, to avoid multiple people hitting the same patient safety issue.


What we did find, however, is that there were organizations with many thousands of customers that sometimes lost touch with who those customers actually were over time. So there were a variety of different circumstances based on the kind of vendor and the kind of community they serve, and I just wanted to comment that almost all the ones we talked to did in fact have a very strong process to notify in the event of a problem they recognized could affect patient safety.


Paul Egerman – eScription – CEO
Thank you. Actually, I also have a question for you, Shelley, from Cerner. I appreciate your statement that you are voluntarily disclosing everything through the Cerner Med Watch system, and that all of the issues have been very minor. And so, I mean, that‘s my question. Has nobody ever sent your president a letter or ever had any notification of a serious patient safety problem?


Shelley Looby – Cerner – Director of Regulatory Affairs Quality Assurance
Yes, we have had notification of a serious safety issue that was sent to our president. We investigated it the very same way we do all of what we call service requests, where our clients log an issue or call one in, however they choose to do so. It was investigated, and we identified the root cause. It turned out that it was a physician at a client site. The client site itself had not logged an SR, but the physician himself had made a complaint. And, as I said, we investigated thoroughly. We did not find any error or issue with the actual software or the implementation or their database, but it was addressed with the physician as a member of that healthcare community that was also a client.


Paul Egerman – eScription – CEO
But you didn't publish anything about that incident. You didn't think it was necessary to share that information?


Shelley Looby – Cerner – Director of Regulatory Affairs Quality Assurance
No, nothing about that particular incident was published with the Med Watch Report.


Paul Egerman – eScription – CEO
Dave?


Dave deBronkart – Society of Participatory Medicine – Cochair
Hello. This is ePatientDave. I've got a thought. Something the government could do, consistent with Government 2.0, and it might be scary to a lot of people but would be technologically simple, would be to provide some sort of open-to-the-public incident reporting system where patients who encounter an error in medical records can report it. They may not know what system the hospital was using, but they can report it, and patterns could be detected.


This comes to mind because, in my own kidney cancer patient community on acor.org, it's common for patients to discuss with each other: oh yes, well, if you take that particular medication, watch out for this, this, and this. Now that doesn't have to do with EHR errors, but empowered, engaged patients, once they wise up and know that they can look out for things, can also go looking for known glitches, the same way you can shop for reliability reports on cars. It could be simple and useful.


Joseph Heyman – AMA – Board Chairman
Marc? Why don‘t we just move down the line there? Everybody has got their cards up.


Marc Probst – Intermountain Healthcare – CIO
An interesting situation that I was thinking about, as you were speaking, is that we don't live in a homogeneous system environment, and VHA is a good example. I think it was when you were speaking, Jeanie, that we use vended products. Some of us have self-developed products. And I think a big challenge that comes up is the finger-pointing game. We may find a patient safety issue or an issue with our systems, and suddenly it's difficult for anyone to take accountability for that other than us as the users ourselves. Any thoughts on how to at least relieve some of that challenge? Because it's very real, and it's very slow. Even once an issue is reported, it becomes a very slow process to have it resolved.


Justin Starren – Marshfield Clinic – Director BIRC
I think one of the issues that comes up is that most of these errors are due to complex interactions, so assigning blame becomes very difficult, and this is compounded by the presence of many software vendors pushing for hold harmless clauses, so that the provider is very reticent to say our use of this software resulted in harm, because they know that they will be left holding the entire bag. So I think that the issue of liability, and who ends up holding the bag for what, needs to be clarified before people will report.


I was recently looking through W. Edwards Deming's quotes, and he has a wonderful one. He says, "Whenever there is fear, you can't trust the figures."

Dave deBronkart – Society of Participatory Medicine – Cochair
This is Dave again, and that‘s exactly why I recommended amnesty. We‘ve got to get over the idea that the goal is to figure out who is guilty.


Marc Probst – Intermountain Healthcare – CIO
Yes, I think, in the short run, I mean, I don‘t see that happening real fast. Maybe I‘m a cynic, but are there any mechanisms, particularly from the stakeholders of the vendor side that could help facilitate this problem?


Dave deBronkart – Society of Participatory Medicine – Cochair
Pardon me for interrupting, but who is going to stand in the way of that? I mean, that‘s an honest question. I‘m not trying to be obnoxious.


Marc Probst – Intermountain Healthcare – CIO
I don‘t have that answer.


Dave deBronkart – Society of Participatory Medicine – Cochair
Because we simply could legislate or regulate reporting an error. I mean, there are apology laws for medical errors, right? And they have tremendous statistics very quickly on that.


Michael Stearns – e-MDs – President & CEO
Very often, when a patient safety issue is identified at the software level, making the change is not that hard. However, doing the testing to make sure a new problem is not introduced in another area of the application is the biggest challenge: the testing, the beta testing, etc. That, unfortunately, is inherent in the process and leads to a number of delays; a three-month turnaround might be considered fairly quick. There are mechanisms that vendors use to get patches out, etc., to repair these things. It is a challenge right now, but I don't think, from my own experience, there's ever been any delay. It's absolutely pedal to the metal to get these things fixed and out the door as quickly as possible without introducing new problems.


Jeanie Scott – VHA – IT Patient Safety Office Director
And I would like to just make one comment back to your question. I think that when we say certain words, "defect," "error," they have a very negative connotation. So in some of my experience with vendors, once they've understood what the safety risk is, they've been very open to us, and a couple of them said, well, tell us about your patient safety program. And I explain to them about a near miss, a close call, the opportunity for learning.


And it opens up the receptiveness to either start building into their IT reporting system or some other way to even begin to notify us, as a consumer of their product, as it integrates with us. Here, this almost happened. We want to tell you about it. And so I think, in answer to your question, one of those ways is to get away from saying defect and error, and encouraging, as I had mentioned, close call reporting and near miss, which is what the aviation safety system is based upon.


Joseph Heyman – AMA – Board Chairman
Latanya?


Latanya Sweeney – Laboratory for International Data Privacy – Director
Yes. I just have two questions. One of the things that's happening as a part of our work is not only producing more EHR systems, but actually interconnectivity among them. And that obviously introduces the whole question, much more so, of security vulnerabilities. We've certainly seen that a lot in regular computer systems. Do you think that you're equipped to handle these kinds of security vulnerabilities, sort of virus attacks, things like that? Are you seeing any now? Do you think they'll increase?

Jeanie Scott – VHA – IT Patient Safety Office Director
I'm going to say, from the VA, … widely publicized for security, and one of my counterpart directors is our security person. The answer to your question is no. We use the same security, whether it's an in-house or a vendor product, and the security is determined up front. We ask for a compliance part, so the answer is no. We don't see that as a risk.

The risk that I do see for security, though, is when security regulations are so tight that they interfere with the use of the EMR. For example, in surgeries, when a screen saver has to come on. I see some smiles here; I think you get the drift. That's why the security person and I are in the same office. We are balancing between privacy, security, and safety. Those are all terms that raise red flags, but they need to be balanced together.


Carl Dvorak – Epic Systems – EVP
I think I could add that, from a security perspective, I don't think exchange really ups the ante. I think that's manageable. From a privacy perspective, I think exchange has some deeply concerning elements right now with the notion that you might segment a record. I don't believe there's a practical way to safely segment a record such that a patient can really trust it across that line.


I think there is a third category related to exchange, and that is the safety discussion we're having today. With an HIE sitting in the middle of multiple different paradigms of electronic records, each using multiple different content sources, possibly different mapping schemes and different coding schemes, I think the HIE will become the new source of patient safety concern, as information tries to flow successfully through and come out the other side meaning the same thing it meant when it went in.


Latanya Sweeney – Laboratory for International Data Privacy – Director
The second question I had, and I really thank you for bringing up the issue of password problems, because in the Institute of Medicine report, the use of password systems actually created a lot of chaos in various hospital systems, so that is an area the first panel didn't talk about, but we should probably put it on our list.


But my second question has to do with bug reports. The first panelists talked a lot about pushing this idea that we have to have openness of reporting and so forth. So what actually is your current model for bug reports? The other thing that came up in the first panel is not just the idea of opening this up for reporting, but a confusion on my part as to how that could actually be generalized to be efficient unless you take a chunk of the data that generated it, or unless your system keeps the provenance of the sequence of events that led to it, and that becomes a part of the bug report. I know that does happen in current computer systems and current software generally. I don't know if it's happening in your systems, or if that's the way you intend to move so that you can get a much more comprehensive view.


Justin Starren – Marshfield Clinic – Director BIRC
Our systems log roughly 100 million events in our audit logs per day, so we can reconstruct for any clinician what was happening down to the millisecond level, which we then can use to try to debug specific reports. I think the challenge though is what is a bug and what is a feature because we have several thousand requests in our wish list log of things where the clinicians say, I think the system is wrong. It should do it this way. Well, most of them are not actual software bugs.


Jeanie Scott – VHA – IT Patient Safety Office Director
I would like to address that as well. I guess bug also goes to that same thing as defect and error; it implies something is wrong when you have a bug. One of the things that we encourage is reporting not only the defect, but also the concerns about what you were trying to do with the software. If I were to quote something, you know, "functioning as designed": if the design is in error, it functions in error.


What we look at, to give you an example of a context that you could use as a model: my team actually takes in the IT safety reporting and does the analysis of it. The first thing we do, before we start analyzing what the level of risk is, is get the story. Not the story of what the software did, but the story of what the clinician was trying to do. Then we go see how well that software matched with that mental model, and we identify those gaps. The gap could be a software bug, but it could also be a usability type of issue, or an interface exchange issue, or a mental model issue, among other types.


Michael Stearns – e-MDs – President & CEO
One of the challenges we face as software developers and users is how to recognize these errors. Lots of times the people looking at these errors do not have the … domain expertise to see them, so we really have to escalate that. What we've done at our company is, if there's even a hint of it, we actually have three physicians on staff, and they get involved with the processing. So with error handling, it's just not easy to detect all the errors. Now as we go to more complex systems and more interactions between various systems, that's going to escalate. The use of claims data in HIEs alone is associated with a number of challenges, which we're going to be facing over the next few years.


Joseph Heyman – AMA – Board Chairman
George?


George Hripcsak - Dept. of Biomedical Informatics Columbia University – Chair
Thanks. George Hripcsak. If meaningful use incentives are successful (this is mainly for the vendors), then we'll have a lot of people rushing to install these systems, and we then worry about the patient safety issues that occur. But it's not just an individual problem; in aggregate, do we overwhelm our national resources, in effect? Number one, have you seen any effect yet, and have there been any consequences of meaningful use driving things forward? Number two, do you foresee it? And number three, do you foresee having to cap the number of installs you can do so that your trained people can keep up, or how are you planning for this?


Shelley Looby – Cerner – Director of Regulatory Affairs Quality Assurance
This is Shelley Looby. I think we have seen a minor increase. It‘s the wait and see at this point for a lot of our clients. We will have to evaluate, as it becomes more clear what is going to happen and how it is going to happen, as to how we develop our internal systems to support design development, support implementation, etc. And I think one of the important factors there is that we maintain close working relationships with our clients, so we can understand their timelines, their goals, and how quickly or how slowly they want to install and eventually use it. I think, here the big point is, open communication and honest communication about what their goals are and the timeline for those goals.


Carl Dvorak – Epic Systems – EVP
Carl Dvorak, Epic. We‘re seeing a modest uptick as well. I think it‘s mostly those who are excited about doing it before the economy collapsed, now finally being able to feel they could do it, but not much beyond that at this point. I think, historically, we‘ve always moderated our own customer intake so that we wouldn‘t experience a problem where service would be degraded or quality would be degraded. I assume we‘ll continue that, as we always have.


Michael Stearns – e-MDs – President & CEO
Thank you. Mike Stearns, e-MDs. We have seen a lot of interest. We've seen a little bit of uptake, but a tremendous amount of interest: people asking questions, much more knowledgeable users or potential clients, which is interesting. In order to plan for this surge, though, I think everyone, all the companies, are organizing their internal processes so they can more quickly certify their internal staff, and also get their users to the point where they've reached the certification level. So right now is a great time to build up your internal resources so that you can move forward once the rush hits.


Joseph Heyman – AMA – Board Chairman
Joan?


Joan Ash – Oregon Health & Science University – Associate Professor
I have a question about adoption as well, but from the other point of view. It seems like we‘re talking about reporting in two different ways. One is reporting from the vendor side, and the other is reporting from the organization side or the provider side. Where adoption comes in is, my question is, do you think that additional reporting requirements for safety purposes will have an effect on adoption? In other words, will organizations, and maybe the vendors can speak for their customers here, will organizations hold back on adopting if they think there will be additional reporting requirements?


Joseph Heyman – AMA – Board Chairman
You might want to separate your answer into two groups. One group would be the large systems and hospitals that have the feasibility of adopting all of the meaningful use requirements for reporting, and the others would be people like me who are in solo practice and our ability to report honestly. If we have to attest, we‘d still have to be able to show that we could do this.

Justin Starren – Marshfield Clinic – Director BIRC
I do not think, from the discussions I've had with clinicians, that the meaningful use reporting or adverse event reporting by itself is going to significantly impede adoption. I think that there is already concern among a number of clinicians that the … ability of electronic health records may put them at increased risk, and that remains to be sorted out. What I'm also seeing is that the ARRA privacy requirements and the minimum penalties for them are already greater than the entire financial benefit of adopting an EHR, and I think that is causing a number of groups to say, well, with an electronic record, I can have a much larger exposure than if people have to physically steal paper, where I'm limited to getting our compliance attorney involved every time I stuff a piece of paper in the wrong envelope, which is already the case.


Shelley Looby – Cerner – Director of Regulatory Affairs Quality Assurance
Shelley Looby. I've had minimal experience here, but I have had a couple of clients call me personally, old blood bankers talking to another old blood banker, and they are worried about the privacy situation with the patients and the reporting. They're also worried about how much, how little, and in what context they have to add to the report about the issue that occurred, and will it be too time consuming? Will it add any benefit to me? Will it add any benefit to anyone else? I think I echo what Justin just stated: there's a privacy concern on the part of our clients, as well as maybe the cumbersomeness of the actual reporting that may be required.


Joseph Heyman – AMA – Board Chairman
Paul? I‘m sorry. Go ahead.


Adam Clark – Lance Armstrong Foundation – Director for Health Policy
That‘s quite all right. This is going to be more directed at you, Dr. Stearns, but Dave or any others, feel free to comment because I want to bring the focus a little bit back to the patient and the patient advocacy community here. The Lance Armstrong Foundation is actively engaging in this area, and we‘re conducting focus groups right now to talk with some patients, get their impressions of EHR. We‘re asking them to actually bring in their paper records, the stacks and stacks of records, to start to demonstrate really how this can help with the patient.


Now in the previous panel, there were some analogies to designing a car. And when I landed here yesterday and got to the hotel, there were discussions going on on the Hill with Toyota and what happened with their cars. And I see some similar unintended consequences here, where a few bad events, and they are serious events, can really have negative repercussions across the entire system, particularly with patients and patient advocacy communities who don't really understand the depth of all the relationships here.


Is HIMSS actively engaging with patient advocacy communities and other patients out there? Do you see a role for that? And, subsequently, are there things that maybe the policy committee here should be considering to help bring the patient into the conversation?


Michael Stearns – e-MDs – President & CEO
I think that‘s an excellent idea. I can‘t speak directly for what HIMSS is doing with patient advocacy groups. Maybe someone else on the panel is familiar. I just wanted to say though that the work that we‘ve done to reach out to the – I‘m sorry. I lost my train of thought. Give me one second. Let me regroup here. Carl, do you have anything to say about HIMSS patient advocacy?


Carl Dvorak – Epic Systems – EVP
I don‘t. I don‘t actually work with that group…. What I can add, though, in terms of patient involvement, we do an application that helps patients directly connect to their EHR, and not to a copy of it, but actually to the core EHR. Organizations that use that have really seen an uptick in actual patient engagement with making sure that the information accurately reflects their health status, so it‘s opened up a new avenue for patients, like Dave, to communicate with regard to what is actually stored in the health record about them.


Joseph Heyman – AMA – Board Chairman
Go ahead, Paul.


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
I have an observation, and not a criticism, and then I have a question. I‘m feeling a bit of a disconnect between panel number one, which is enumerating a lot of the risks that we suffer, and the responses from a lot of the developers of software. Except for Jeanie, and I‘m trying to decompose that, I think it‘s because the VA is both the developer and the user, and the payer. And so, in a sense, like many other self-developers, they‘re embedded in their customer base and the workflow that causes, as we all talked about, the interactions that lead to potential harm.


I'm wondering if I can push the commercial developers a little bit more on what concrete proposals you have to accommodate the fact that you don't have that same opportunity of being embedded with your customer base and being able to know about these potential risks. One avenue, of course, is reporting, but also, as Carl mentioned, notification. So a free flow of reporting into you about these things, and also your dissemination of both the problem and potentially the solution, because it's not clear to me, and I certainly have anecdotal evidence, that waiting and reacting as a mode for vendors is good enough. Maybe some … proposals on how commercial vendors would like to see it, because we're going to have the third panel talking about potential solutions or approaches. What do you see as the way that you can get closer to your end customer and the complex interactions with your software?


Carl Dvorak – Epic Systems – EVP
Yes, I agree. I see panel one as very different and distinct from panel two, and I could characterize it as: it's not the doctor's fault, and the vendors sit on the other side and say, well, but it's not just us either. I think there is a middle ground, and it needs to be a healthy and vigorous middle ground.


In terms of transparency and reliability and sharing information back and forth with customers, it would be shameful not to do that and to do it well. In terms of the next category you're talking about, what could we do to prevent problems from emerging in the first place, not just handle them after the fact, I do think there are organizational things that we do today, working with groups like the Institute for Safe Medication Practices and the JCAHO organizations. There are many things that are focused through those kinds of organizations that, as EHR developers, we stay very glued to, because we know that they are bodies of knowledge around safe medical practice.


I would suggest supporting those kinds of organizations and using that as a model for the future, and possibly weaving some elements of that into the certification process. I know that the certification process was slightly denigrated this morning. It remains to be seen how the HHS certification turns out. Will it be a superset or a subset? In fact, I know the CCHIT requirements do not generally please most vendors. They're seen as a bar with additional hurdles that may not be required. But I do think focusing on organizations like the Joint Commission, like ISMP, and what they've done to provide a reference point for safe implementation in healthcare would be a solid recommendation.


Shelley Looby – Cerner – Director of Regulatory Affairs Quality Assurance
Shelley Looby. I agree with Carl's comments; all of those are good areas, areas that we currently work with and can strengthen. I would like to go back to something Dr. Classen said earlier, maybe trying to conjoin the panels a little and not be so polarized. The certification upfront is great, but he mentioned the ongoing testing as the system is being utilized, as it becomes more sophisticated, and as end users are realizing more benefits. I think communicating that in-house, end-user experience and testing freely back to the vendor or vendors is going to be critical to our being able to keep up with the pace, so to speak.


Carl Dvorak – Epic Systems – EVP
I will also note that in the self-developed arena versus the commercial arena, you have to be careful in both. I think there's a hybrid vigor when multiple large organizations participate in the development of a standardized EHR that many of them use; you see some advances that you don't see in some of the self-developed categories. That said, the folks who work with self-developed systems are often so energized and so focused on them that they do extraordinary things with what they do.


I don't see one or the other as being inherently bad or inherently good. I think both could be executed well, and both could be executed poorly. Rather than try to draw them as a stark contrast, look at what elements work well across both of them.


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Let me follow up on that a little bit more. I wasn't trying to draw a contrast. I was trying to get at the lessons learned from what Jeanie was saying in terms of how they approach it and their attitude, and how we can give you some of the advantage of that end-user experience so that it can not only inform you of what's happening with your current products but, as you mentioned, proactively get embedded in the design process for your future functionality and products. And I don't know that I would rely on, you know, the Joint Commission, etc., as your conduit. That's where I'm….


Carl Dvorak – Epic Systems – EVP
I think every vendor will have to answer their approach to that for themselves. What we tend to focus on is direct engagement between programmers and clinicians in the field, in the cardiology office, gowned up in surgery, to participate in those events, to gain the depth of understanding that you need to own before you go and write software for it. Software design is one of those fields where people will comment on the importance of a design and a design document. That's good. That's important.

But for every design decision captured in the design document, there are probably 10 to 100 little design decisions that need to be made to round out what really becomes the software, and that's where having your programmers in direct contact routinely with clinicians makes a difference. How we approach it is through that channel, trying to make sure that programmers spend time with doctors. And I assume that folks who do it in the self-developed world focus on a similar kind of thing as well.


Jeanie Scott – VHA – IT Patient Safety Office Director
I would just like to make a comment. Even though the VA is the manufacturer of our software, within the VA organization of VHA, VBA, and NCA, as of several years ago our product development is not within VHA, so we are actually a vendor within ourselves. But we have the same struggles as if we had a private commercial vendor; they just happen to be the VA.


Some of the same struggles, in that certain development groups will understand the impact, and some of that might be because they have clinical people who have come in through the computer science world. There are other groups who really struggle and say, you know, this is a possible patient safety event or close call, and it's: prove it to us. You're the customer. Prove it to us. I just don't want you to get the misconception that it's easy for us. We have, I think, the same struggles as if I were ABC medical facility in any state.


Michael Stearns – e-MDs – President & CEO
I think that the real difference when looking at the so-called internal systems versus the vendor systems, and there was a comment made by the panel that many sites moved from internal to vendor systems and still achieved excellent results, is that these systems have a huge amount of local configuration. Nancy Leveson, the disaster expert at MIT, noted that most disasters related to software can be traced to software that wasn't written originally for that purpose, but was adapted from a previous version or a previous site or a previous product.


Essentially, almost every large EHR installation has so much local configuration that it is that kind of repurposing, so it's very important that the local implementation team be very quick to respond and very quick to adapt, because many of the errors are things like: are the default choices listed in the correct order in the pull-down menu? That's often a local configuration, not a piece of code in the software itself.


Carl Dvorak – Epic Systems – EVP
Also to reinforce what I think Dr. Koppel pointed out this morning: there are combinations that simply do not work well and have been proven to be dangerous. We will refuse to do a CPOE install that interfaces with a third-party pharmacy system. It's not a competitive practice; it's simply a dangerous thing to do. So there are known combinations that should be avoided. And I think what's changing in the industry is that people are looking for combinations of applications that are pre-built to work together. That is a change we're seeing, and although I think there's interest in the hybrid vigor of individual niche pieces and parts, the problem is that the benefit they put forward is hard to actualize when you have to connect all those wires behind the scenes, and those wires become a safety concern that the organization is left to deal with on its own.


Justin Starren – Marshfield Clinic – Director BIRC
The first panel discussed what the problem is, and that's a challenge right now: what really is the problem? I know, in the ambulatory sector, we just don't see a lot of issues related to patient safety errors, and when we do, we're very responsive, very reactive. Sometimes they're very difficult to understand. It takes a lot of in-depth analysis of the process, a lot of thought. Different clinicians have different viewpoints. But it's really a challenge right now to identify, so the FDA mentioned they had some reported errors. We'd be very interested in learning about those, what exactly the circumstances were surrounding them, so we can get a better understanding and move forward with that information. Thank you.


Joseph Heyman – AMA – Board Chairman
Dr. Blumenthal?


David Blumenthal – Department of HHS – National Coordinator for Health IT
Thank you, and thanks to our panelists for a diverse set of very informative views. One of the things the VA experience highlights is the value of information, not just clinical information, but information about systems. I was wondering if any of you have thoughts about how reporting of problems should be done so that the information reported is as accurate and useful as possible. Electronic records are complex, and they are subject to complex ergonomic relationships and human factors issues.


And the question is, what form should reporting take so as to reduce the noise and maximize the value of the information that's available? There are reporting systems right now, MedWatch, for example, which are far from perfect in terms of the actionability of the data they provide. So I would be interested in knowing, and the VA may have the most experience with this, how you try to disaggregate and learn from experience. Having a form of reporting and a mechanism of reporting that works is very desirable.


Jeanie Scott – VHA – IT Patient Safety Office Director
That's a very good question because we‘ve had several, several ways of receiving our reports. And actually, it‘s my office that gets all that noise in there, and then we have to discern what needs to go back out to our physicians, or what needs to go to our development staff.


I can say that we did have an experience with a model of the aviation safety reporting system in which I did receive IT reporting. It was not helpful. It was noise, because it didn't continue with that analysis, so I think it is key that you be able to take that noise in and then have a methodology for determining what it is.


And there will be some. I will stand before you today, sit before you today. There will be those single points of noise that will just be noise until you see the second noise, and the volume gets higher. I think one thing is, I would encourage that noise, but not for that noise to go outward facing.


I guess, as an example of that is, we will see some of that noise and say, well, it‘s a low risk. It‘s a low risk. But then when we start seeing the volume of it go up, and the volume does not have to be 10, 20, just a few because we understand that if it happens one or two times at three facilities, it‘s going to happen probably in all of our facilities. We will aggregate that up and go to our development teams and say, we are detecting a problem.


In that same sense, our developers will also report single occurrences to us, and I have to take those in as a single piece of noise. But when a customer then reports it to me, we can go through and say, well, we're getting it from both our product developers and our customers. There's a problem there.

Latanya Sweeney – Laboratory for International Data Privacy – Director
This is Latanya with a quick follow up to both the comment David made, as well as Paul earlier. And that is, what exactly is the format of the reporting? Do you have standardized formatting? Is it somebody just calls you up or sends you an e-mail screaming?


Jeanie Scott – VHA – IT Patient Safety Office Director
All of the above. Literally, all of the above. In our IT reporting system, we customized it when we were bringing it into the VA. We have a small section of it, three little fields. The customer can go in and flag it as a potential patient safety issue. So when the Blackberry system goes down, somebody might flag that as potential patient safety. We have to go through and discern that. We ask that they give us a justification. We have a set of questions on there.


Then we also give feedback back to them with a tracking number: beyond their IT tracking number, what is their safety tracking number? We also will get feedback from our National Center for Patient Safety, Dr. James Bagian's department. They will go through and analyze the root cause analysis reporting for their healthcare failure mode and will send us an e-mail. I'll get a phone call from a facility or a patient safety manager. I'll get a phone call from a developer. So we encourage every avenue, and we then track it in a standardized database.
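[Editor's note] The pipeline Ms. Scott describes, an ordinary IT ticket carrying a small patient-safety section plus a separate safety tracking number fed back to the reporter, can be sketched roughly as follows. This is a hypothetical illustration; the field names, ID format, and triage logic are assumptions, not the VA's actual schema.

```python
# Hypothetical sketch of an IT ticket with a patient-safety section.
# The "three little fields" are represented illustratively as a flag,
# a justification, and a questionnaire; none of this is the VA's real schema.
from dataclasses import dataclass, field
from typing import Optional
import itertools

_safety_ids = itertools.count(1)  # source of safety tracking numbers

@dataclass
class ITTicket:
    it_tracking_number: str                      # the ordinary IT ticket number
    description: str
    potential_patient_safety: bool = False       # customer-set flag
    safety_justification: str = ""               # why the reporter flagged it
    safety_questions: dict = field(default_factory=dict)
    safety_tracking_number: Optional[str] = None

    def triage(self) -> Optional[str]:
        """If flagged, assign a separate safety tracking number for feedback."""
        if self.potential_patient_safety and self.safety_tracking_number is None:
            self.safety_tracking_number = f"PS-{next(_safety_ids):05d}"
        return self.safety_tracking_number

ticket = ITTicket(
    it_tracking_number="IT-1234",
    description="Blackberry system down at facility A",
    potential_patient_safety=True,
    safety_justification="Clinicians may miss critical alerts",
)
print(ticket.triage())  # PS-00001
```

The point of the dual tracking number is the one she makes: the safety office can follow an issue independently of the IT ticket's lifecycle while still giving the reporter a reference for feedback.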


Latanya Sweeney – Laboratory for International Data Privacy – Director
Yes, so then what do you do? For example, suppose I'm using lots of wireless devices, and I report an error that I'm getting bad values in my system. What would you do? What would any of you do?

Carl Dvorak – Epic Systems – EVP
Basically if someone were to report a concern that to them looked like loss of data, corruption of data, anything out of the ordinary, that escalates immediately. There‘s a team appointed to look into it, and to work with the customer to first try to recreate it.


Latanya Sweeney – Laboratory for International Data Privacy – Director
On one call?


Carl Dvorak – Epic Systems – EVP
Typically on one call, within minutes of that first call, people are mobilized. Those don‘t happen often, thank goodness, but they‘re very serious when they do happen. Typically you need a multi-party taskforce because, in your example, something going wrong with the wireless network is going to need to involve not just the developer, you know, back in the VA shop or an Epic shop. It‘s going to require somebody in the facility who understands wireless networking, someone in the data center. You have to get a multi-stakeholder team put together immediately, and try to see if you can recreate, and try to look at the patient records that might have been involved to see if you can find out what‘s happening, a bit of forensics there, but basically something of that nature would get a near instantaneous response.


Jeanie Scott – VHA – IT Patient Safety Office Director
And I would add to that: as part of that investigation series, we attempt to do an initial safety assessment. We look to see, just because the wireless network went down, what would happen to our patients? What would be the level of severity? Would it be catastrophic, or would it cause no harm? And we have a gradient system.


Then we look to see, and this is based on engineering risk principles I worked on with the National Center for Patient Safety…. We would then look to say, well, what is the likelihood of that level of severity occurring? I think the third dimension we use, which is very important to what we do then, is: how likely is the person at the very end of that chain, who is delivering the medication or processing the lab results, to determine that this is just not right? It is that third dimension that allows us to really see what the potential impact is. Was it a low risk, moderate, or high risk?


Then we can work with the development teams to determine what the actions are. We do that for every case that comes in, and so, as I discussed earlier with Dr. Blumenthal, that first noise that comes in may be a low risk; but then, as that noise level goes up, we can look at our frequency, and we may have more information from our investigation to assess the severity or the detectability of it, and our initial rating may change.
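[Editor's note] The assessment Ms. Scott outlines resembles failure-mode scoring along three dimensions: severity of potential harm, likelihood of that severity occurring, and how detectable the error is to the person at the sharp end. A hedged sketch follows, assuming illustrative 1-4 scales and product-score thresholds; the VA's actual gradient system is not specified in the testimony.

```python
# Illustrative three-dimensional risk rating (FMEA-style).
# Scales and thresholds are assumptions for the sketch, not the VA's.
def risk_rating(severity: int, likelihood: int, detectability: int) -> str:
    """Each dimension scored 1 (best) to 4 (worst); a higher product means higher risk.
    detectability=4 means the end user is very unlikely to notice the error."""
    for dim in (severity, likelihood, detectability):
        if not 1 <= dim <= 4:
            raise ValueError("each dimension must be scored 1-4")
    score = severity * likelihood * detectability
    if score >= 32:
        return "high"
    if score >= 12:
        return "moderate"
    return "low"

# A single early report may rate low...
print(risk_rating(severity=3, likelihood=1, detectability=2))  # low
# ...but as reports accumulate, the likelihood estimate rises and the rating changes.
print(risk_rating(severity=3, likelihood=3, detectability=2))  # moderate
```

The second call mirrors her point to Dr. Blumenthal: the same hazard is re-rated as frequency data accumulates, without changing the severity judgment itself.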


Joseph Heyman – AMA – Board Chairman
I want to thank the panel. You folks were terrific and very interesting. Judy, do you have any announcements or anything?


Judy Sparrow – Office of the National Coordinator – Executive Director
No. I think we‘re just breaking for lunch until 1:00, I believe. Right, chairmen?


Paul Egerman – eScription – CEO
Yes.


Joseph Heyman – AMA – Board Chairman
Okay. Thank you very, very much.


(Participants Break)


Judy Sparrow – Office of the National Coordinator – Executive Director
Hello, everybody. I think we‘re ready to begin if you could take your seats, please. We‘re about ready to begin. Thank you. Operator, could you bring in the public, please?


Operator
The public is on the line.


Marc Probst – Intermountain Healthcare – CIO
Good. I think we had a good morning, and really had some great panelists, and appreciate the effort people have put into that. This afternoon, we want to look at some possible approaches, and Latanya Sweeney has volunteered to help lead this panel and this discussion this afternoon.


Latanya Sweeney – Laboratory for International Data Privacy – Director
The first panel earlier today introduced the issues by focusing on usability factors, workflow conflicts, and software testing for data validity and consistency. A common theme throughout the day was this issue of reporting and how you deal with bug reports and so forth. We're looking for this panel to solve all the problems, so that we can go home feeling quite successful, and that would be great for us. It would at least point us in some interesting directions.


Let me just introduce everyone. Dr. Shuren is the associate commissioner for policy and planning at the Food and Drug Administration, and directs the agency's office of policy and office of planning. Dr. Munier is the director of the Center for Quality Improvement and Patient Safety at the Agency for Healthcare Research & Quality within HHS. Dr. Walker is the chief healthcare information officer at the Geisinger Health System in central Pennsylvania, where he leads Geisinger's development of integrated outpatient, inpatient, and patient EHRs. Dr. Shortliffe is the president and CEO of the American Medical Informatics Association, AMIA, which many of us know and love well, based here in Maryland. Thank you.


We'll again try to keep everyone's comments to five minutes. As you can see, there's a lively number of questions, and I think that's some of the best interaction, so even if you don't cover something in your original testimony, I'm sure you'll be able to get it in during the question and answer. We'll start with Dr. Shuren.


Jeff Shuren – FDA – Director of Center for Devices & Radiological Health
Good afternoon. I‘m Jeff Shuren. Actually, I‘m the director of FDA Center for Devices and Radiological Health. I used to be an associate commissioner. Thank you for the opportunity to participate in this workgroup discussion and share the FDA‘s perspective on potential approaches to address health information technology related safety concerns.


By taking a balanced public health approach, the FDA believes it can enhance the benefits that HIT can bring, while minimizing the risk that this technology can potentially create. The FDA‘s center for devices and radiological health, or CDRH, is responsible for protecting and promoting the public health by assuring the safety, effectiveness, and quality of medical devices, including software devices throughout their total product lifecycle.


Under the federal Food, Drug, and Cosmetic Act, HIT software is a medical device. Currently, the FDA requires that manufacturers of other types of software devices comply with the laws and regulations that apply to more traditional medical device firms. These products include devices that contain one or more software components, parts, or accessories such as ECG systems, as well as devices that are composed solely of software, such as laboratory information management systems.

To date, FDA has largely refrained from enforcing our regulatory requirements with respect to HIT devices. Nevertheless, certain health information technology vendors have voluntarily registered and listed their software devices with the FDA, and some have provided submissions for pre-market review. Additionally, patients, clinicians, and user facilities have voluntarily reported HIT related adverse events to the agency.


The FDA recognizes the tremendous importance of HIT and its potential to improve patient care. However, in light of the safety issues that have been reported to us, we believe that a framework of federal oversight of HIT is needed to assure patient safety. Any such framework would need to take into account the complex and dynamic nature of HIT systems. Given the FDA's regulatory authorities and analytical tools, we could potentially, at a minimum, play a role in preventing and addressing HIT-related safety issues, thereby helping protect patients while fostering confidence in these devices.

The FDA could consider a range of approaches. One possible approach would be to focus on postmarket safety by requiring HIT establishments to electronically register and list their HIT devices, and to submit medical device reports, MDRs, about adverse events or malfunctions to the FDA. Under this approach, HIT device manufacturers would be responsible for correcting identified safety issues.


The FDA could also use our authority to require postmarket surveillance or tracking of selected higher risk devices, which would provide more detailed information about the use and potential safety risks associated with these products. The FDA could share our postmarket information with vendors, certification bodies, and users to help improve the design of currently marketed and future products. The FDA would exercise our enforcement discretion to not enforce other applicable requirements.


A second possible approach would be to focus on postmarket safety and manufacturing quality by requiring HIT device manufacturers to comply with the requirements I already described, and also to comply with the FDA's quality systems regulation, or QSR. QSR requires manufacturers to adhere to specific minimum guidelines to assure the quality and consistency of products on the market. For example, the regulation requires that device manufacturers establish procedures for handling complaints from users, for correcting and preventing recurrence of problems, and for complying with appropriate design controls to reduce the potential for problems. Design controls are an interrelated set of practices and procedures that are incorporated into the design and development process of a device in order to check for problems and make corrections in the design of the device before it's put into production. Manufacturers should already have good quality assurance, so the cost to them of implementing QSR should be low, but QSR at least creates a floor for quality that can be adjusted based on new knowledge.


In addition, based on data collected, the FDA could recommend design controls that would mitigate the risks that are unique to HIT devices such as those that pertain to the device/user interface or device/device interface. Such design controls would help to preserve the ability of user facilities to innovate and tailor the installation and use of these devices to their practical needs, while reducing risk to patients. The FDA would exercise our enforcement discretion to not enforce other applicable requirements.


And, lastly, under a third approach, the FDA would apply our traditional regulatory framework, in which HIT device manufacturers would be required to meet all the same regulatory requirements as other, more traditional devices, including risk-based premarket review. Through premarket review, the FDA could assess the safety and effectiveness of high and medium risk devices before they go into market use. The FDA could also require postmarket studies or specific product labeling for particular HIT devices as conditions of approval. We welcome the committee's insights regarding these three potential approaches, or other possible approaches the FDA might take in collaboration with, or as a complement to, the work of other government partners to mitigate risk to public health while promoting innovation. Thank you.


Latanya Sweeney – Laboratory for International Data Privacy – Director
Dr. Munier?


William Munier – AHRQ/HHS – Director CQIPS
Thank you very much. It‘s a pleasure to be here, and I thank the HIT Policy Committee for inviting me and inviting AHRQ to present today. I want to say, it‘s also a pleasure to be at the other healthcare summit….

AHRQ is an agency that does research into quality and safety. Its current budget is about $397 million. The areas directly germane to what we're talking about today are the information technology research we do, tens of millions of dollars a year, which is, of course, publicly available and also in support of the Office of the National Coordinator. In addition to that, we're implementing the Patient Safety Organization program, which sets up privilege and confidentiality protections for organizations that apply and are listed by the department, and also gives us the authority to standardize patient safety event reporting, about which I'll talk more in just a minute.


We've been talking today, in general, I think, about three different IT areas, which it is conceptually helpful to separate out, although I might say a software vendor developing an EHR has to think about all of them simultaneously. One is IT as a device that allows increased quality and safety in delivering care to patients. In other words, it enhances, or provides the opportunity for providers to enhance, the care that they give to patients, and that's either inherent in the IT itself or proactively designed into the IT, such as decision support. An example of inherent would be increased access. That's separate from the concept of safety of the device: whether it's functioning as intended, and whether it's designed in a safe way. Those are different issues, both of which Dr. Shuren addressed.


A third issue is IT employed to measure quality and safety, which has also been referenced a number of times this morning, and they are really three different things. In point of fact, the third one, I might suggest, is thought about less than the other two, probably understandably. But if one wants to measure adverse events and know what‘s going on, it has to be done thoughtfully. And, in point of fact, with IT, if you want to report using the IT, it has to be built into the IT at the time the code is written.


I would also say that Dr. Classen mentioned in his testimony that there were no baselines, and I would say there are not only no IT baselines, there are no paper baselines either. So when some bad thing happens with an IT device, you can't say, well, it's much worse than paper, because you don't know. The paper situation might have been much worse as well. So measurement is an important thing if we're going to get a handle on whether we're helping patients or hurting them.


That brings me then to the Patient Safety Organization program, which we're implementing, which gives us the authority to issue common definitions and reporting formats to be used by PSOs, but of course by anybody else who wants to as well. And I might just add that, to support that development, we inventoried 67 systems across the country and even the world, and not just as an inventory of what they were. We actually put them into a relational database so we could look at the way they looked at different events. What I can tell you is that, with a few notable exceptions, it's a mess, if I can use that scientific term. Nobody keeps track of anything the same way. Therefore, when you try to add things up or understand what's going on, you can't do so.

We are working. We have an expert committee at the NQF, which gathers and opines on public comment, and which David Classen co-chairs, and we have issued a first version of our formats. We're now in the process of doing technical specification. Essentially, we've designed them from the end-user standpoint because, by the time they would be collected at a national level, it's too late if you didn't collect the right information. So we've put ourselves in the position of the user, which is a hospital, a nursing home, a doctor's office, or whatever. We started in the hospital, so they're designed from the end-user standpoint. They're clinically designed and, we think, in binary terms, very specifically defined. Then we not only issue an English-language description of what needs to be measured; we are also issuing technical specifications, which you have not seen yet but which will be coming out at the end of March, that will allow software coders to know exactly what we mean.


I don't have time; I have 23 seconds left or something like that. But I just want to say that there are all kinds of issues here. There are patient-centered issues that come out of the EHR. There are event-centered issues that come out of a reporting system, where near misses and unsafe conditions never get into the patient record. And there is device- and manufacturer-centered reporting. There are ways to make these all dovetail, but one has to think about them.


Finally, and if I run over just a tiny second here, I want to talk about four quick concerns. One is regulation. It's a double-edged sword. I was involved in the 3.5 years it took to get the PSO regulations out, so that's one to think about in terms of flexibility and the rest of it. Confidentiality is a huge issue in all this: public reporting versus protected reporting. Then, finally, two considerations, and I am just going to read from my written testimony so I get this right.


It‘s important to harmonize the approach to HIT with that of other important areas of patient safety and quality. There‘s a temptation for each area to contemplate itself in great detail and isolation from other important areas with the result that protocols, expert committees, and oversight boards are established and function in silos. The end user suffers by having incompatible, multiple data collection and compliance requirements that are inefficient at best and potentially harmful at worst. The very advantages that IT offers—modularization, harmonization, etc—are lost.


Second, experts in a specific area often assemble very long lists of issues that should, in an ideal world, be subject to scrutiny. Operationalization of such systems, while reflecting laudable thoroughness, may be too labor intensive to be practical. If such systems aren't adopted, or are adopted but not used, or are used other than as intended, the desired improvement does not occur. In sum, if tools and products are to be adopted on a widespread basis and used effectively to improve patient care, all approaches need to be developed with the end user and other relevant areas of quality and safety in mind. Thank you.


Latanya Sweeney – Laboratory for International Data Privacy – Director
Thank you. Dr. Walker?


James Walker – Geisinger Health System – CHIO
Thank you. It‘s a pleasure to be with you today. I‘m going to talk about hazard control. That‘s not quite what I called it in the written testimony. That‘s another title.

Why hazards? Nancy Leveson, the preeminent software safety engineer says hazard analysis is accident analysis before the accident happens. It‘s one of the truisms of safety engineering that hazard control is critical to producing a safe product or service or anything else.


The cartoon there is useful to illustrate this. For one thing, there's been a little bit of a tendency to separate users and vendors and implementers and others. One of the things about this cartoon is it makes it clear that everybody is in the same hazard production and control game together. Another thing it emphasizes, although it isn't quantified in the cartoon, is that most, probably 95+%, of near misses and patient harm began life as a hazard.


I want to tell you a brief true story about Geisinger to illustrate the importance of hazards. In 2005, we were preparing to implement in our first hospital, and the business analysts who create the workflows that we build into the system came to me, or came to the project director, and said, we can't build safe workflows between the pharmacy system and the order entry system. This was dismaying because the pharmacy system was hands down the best-of-breed pharmacy system in the business, and our pharmacists had done a really beautiful job of making it safe and effective. The order entry system wins all of the polls on what's the best order entry system, so it wasn't about somebody building a crummy product. Maybe they're both primitive. Maybe they're not.

When we looked at it carefully, we agreed with them that it wasn‘t safe to proceed, so you can imagine pharmacy‘s response. They were deeply pained, but went through the analysis with us and said, okay. We‘ve got to rip it out. Then we went to executive leadership and said, at the cost of nine months on the timeline and several hundred thousand dollars, we recommend that you rip out the pharmacy system, and we put in the same pharmacy system built by the same vendor as our order entry system because we just don‘t think anything else is safe.


They said yes. We went ahead and did it, not knowing, of course, whether we were right or wrong, and found out a couple of years later when I was presenting at a conference. I mentioned this experience, and Dave Classen said, oh yes. We have a careful study of 62 healthcare organizations, and we concluded it just flat isn‘t safe to use a pharmacy system and an order entry system from two different vendors.


Now what does that illustrate? Well, first of all, no doctor or nurse or anyone else is going to report that as a problem because we eliminated it before it ever got into production, before it ever got anywhere close to production. The second thing is, if you did root cause analyses and other forms of retrospective accident analysis or near miss analysis, that kind of problem would almost never show up as a cause in one of those analyses. It‘s too deep into the system and too, more or less, invisible.

The third thing is that it was found by implementers who had been by that time, for ten years, living in outpatient clinics, inpatient settings, emergency departments, and operating rooms learning how people work and mapping their processes and trying to create safe processes. And, of course, learning from the situations when we hadn‘t created safe processes. So all of this kind of coalesced for me, and I say, you know, every year there are several hundred, literally, of these that happen.


Every time we do an upgrade, we spend a couple hundred person hours going through all the possible things we could turn on or not turn on. And if we do turn them on, what we would have to do to configure them to create a pick list or whatever we would do, and then how would we train that to people, and how would that fit into existing workflows. Much of the time, we say, we‘re not turning that on. It‘s not safe. We‘ll wait until the next upgrade or whenever we think it meets our safety standards.

And so I thought, well, you know, it‘d be smart, having done this now for about seven years and not recorded much of it, to start recording these hazards and start making estimates about how likely we think harm is, and then track and see which ones cause harm. Because one of our experiences is, when you have trainers out there at go-live, they can report back and say, you know that thing we were worried about? It doesn‘t confuse everybody. Everybody gets it right away. There‘s no problem. And we take that off our list. Other times, they identify things that we hadn‘t found yet.


So the idea of the hazard manager is that organizations that are obsessed with the possibility of failure, which is safety-oriented organizations, will have a way to record hazards in a highly structured way. We‘ve created some terminology that‘s in the handout. And we continue to iterate it. They‘ll be able to manage those. They‘ll be able to keep track of this is one we felt was the vendor‘s issue, and the vendor said they‘d do it, and we don‘t expect it for six months. This is one that‘s our problem, and so-and-so is responsible for it. This is one that we can‘t fix, and we‘ve just got to train and monitor carefully and make sure it doesn‘t blow anything up.


At the next level, that same data, because almost all of it is in structured, codified form so that it‘s easy, relatively easy to deidentify, would be available to all of the customers of a particular product, so they could see all of the things that any of their fellow customers have identified as potential hazards. The vendor would also be able to see that, and it would be deidentified by healthcare organization. Obviously everyone in the community would know that it was that vendor‘s product.

At the third level, these same reports would be viewable in completely deidentified form, so that researchers and policymakers and vendors and others would be able to see the whole universe of potential problems that have been identified, and also, over time, be able to see which ones of those actually turn into a care process compromise, a so-called near miss, or actual patient harm.
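The three-tier visibility scheme Dr. Walker describes, internal structured records, customer-level sharing deidentified by healthcare organization, and fully deidentified public release, could be sketched roughly as follows. The record fields and function names here are illustrative assumptions, not the actual hazard-manager design:

```python
from dataclasses import dataclass, asdict

@dataclass
class Hazard:
    """A hypothetical structured hazard record (tier 1: internal use)."""
    org: str          # reporting healthcare organization
    vendor: str       # product vendor
    product: str      # product name
    description: str  # structured/coded description of the hazard
    status: str       # e.g. "vendor-fix-pending", "local-fix", "train-and-monitor"

def tier2_view(h: Hazard) -> dict:
    """Customer/vendor view: deidentified by healthcare organization,
    but the product and vendor remain visible to fellow customers."""
    d = asdict(h)
    d["org"] = "REDACTED"
    return d

def tier3_view(h: Hazard) -> dict:
    """Researcher/policymaker view: completely deidentified."""
    d = tier2_view(h)
    d["vendor"] = "REDACTED"
    d["product"] = "REDACTED"
    return d
```

Because almost all of the record is codified rather than free text, each tier is produced by redacting fields rather than by scrubbing narrative, which is what makes the deidentification "relatively easy" in the sense described above.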


Latanya Sweeney – Laboratory for International Data Privacy – Director
Thank you. Dr. Shortliffe?


Edward Shortliffe – AMIA – President & CIO
Thank you. I‘m pleased to be here representing the American Medical Informatics Association. I won‘t say a lot about our organization since at least half the people in the room, I think, are members, on both sides of the table, so I appreciate the opportunity for AMIA to have a chance to comment on this important issue.


I‘ve written down a fair amount, and I didn‘t feel I‘d go through it all in great detail, given the short time for the presentation today, but I would like to just touch on a few key points. First, something that I don‘t think has been really discussed yet at all by anybody: one of the approaches, I think, to dealing with the patient safety issue in the HIT arena is to consider the workforce issues and the people who are in fact available to do everything from designing systems, to building them in the vendor environment, to implementing them once they are productized in the actual patient care environment, to evaluating them, to monitoring them, and to doing the research that creates safer systems for the future and keeps a pipeline full with new ideas, new technologies, new concepts that will, in 20 years, be what we‘re all sitting around this table talking about.

My concern is that right now we‘re not producing enough people of this sort that I‘ve described. I‘ve been delighted to see ONC‘s emphasis on workforce development. If you imagine a pyramid with a lot of workers at the base, and more and more expert know-how higher up in the pyramid, much of the emphasis of ONC‘s workforce initiatives to date have been in the lower end of that pyramid, and so there are two areas that I think have so far not been addressed that are important to patient safety.


Very briefly, they are in the testimony. One is the issue of graduate education for the people who will do these tasks, masters and Ph.D. type training, often for health professionals who seek this as a subspecialty area of training; there is clearly a greater demand than supply in that arena right now. And the second is recognizing that it‘s the physicians, the nurses, the pharmacists, and other health professionals who actually are being asked to use these systems who are often remarkably ignorant about the issues that we are discussing today. And many of us believe that this is an important part of medical and other health professional education, that safer systems are identified when you have better trained individuals actually using them who understand the risks, the benefits, the tradeoffs. And so some effort to do the work that‘s necessary to bring new curricula and new modes of educating health professionals in the informatics areas that are most pertinent to practice seems to us to be another important patient safety consideration not yet addressed.


Now it is our belief that essentially every affront to patient safety is in fact unintended. Nobody actually sets out to hurt patients in any of the work that we‘ve been describing today. And so, AMIA has been looking at the issue of unintended consequences, not only of the technology, but of policies, which sometimes have Machiavellian side effects not originally anticipated when they were set. So it is a tribute to this committee and to the health policy committee in general that these kinds of careful looks are occurring because many decisions or many proposals that could arise from this environment could have unintended consequences if effort were not made to really try to see the pros and cons of all the potential solutions.


Our meeting last September was totally on the subject for a day and a half of unintended consequences of health IT and health IT policy. I‘ve described it briefly with some of the key recommendations from that meeting that are pertinent to your consideration in my testimony, and I would simply direct you to that for a sense of how unintended consequences in general of … policy and HIT are pertinent to the considerations today of patient safety that we‘re focusing on.


But I realize that one of the biggest issues that the committee is addressing, and I have 45 seconds to address it, is regulatory options that might exist. This is an issue that we decided was so important that we wanted to do an internal, formal study. It was partly in response to the now 10 or 12-year-old last statement that we had on this subject, which was AMIA‘s paper published in the Annals of Internal Medicine and in JAMIA regarding regulatory approaches to clinical decision support software. The emphasis really had been on clinical decision support in those early efforts.

Now I think the issues are much more complicated. We recognize that many of the motivating discussions we‘re having today are really not about decision support software per se, but about impacts on workflow and interacting aspects of human and computer based systems that can lead to totally unanticipated problems, even with a lot of pre-testing of the work. So this taskforce has been constituted with a mix of individuals of varying perspectives on this subject.


They‘ve been working pretty hard since last fall. I think that after the first four months of their deliberations, they have reached the point where they now realize that they‘re going to have to give a balanced analysis of the pros and cons of various options rather than come up with a single consensus, which is, I think, a reflection of just how complicated these problems and these questions are. There are certain things that we can say, however, and I want to close with the few things that I think there‘s clear consensus on, with a promise that the fire has now been lit under this taskforce, and we definitely hope that before you‘re done with your deliberations, we‘ll have a written document to share with you in the next few weeks, hopefully.

The three points then to close with. First, recognition of the importance of scientific transparency in all that we do, and that patient safety thrives on such openness. Second, accountability: healthcare improves when stakeholder successes are rewarded and failures are acknowledged and corrected, but I didn‘t say necessarily punished. And, third, veracity: vendors, governments, clinicians, and patients all share the same duty to communicate sincerely and truthfully with one another. Those concepts are kind of guiding the taskforce in its consideration of the options, and I assure you that they are looking at the regulatory question and will comment on the pros and cons, as they see it, of the options that are available and have been discussed.


Latanya Sweeney – Laboratory for International Data Privacy – Director
I want to thank you for all of your comments. They were really good. I want to sort of open it up to questions, and I‘ll start with myself with a couple of simple questions. In some sense, AMIA is saying, well … a little bit later, but I would be interested in, given the kinds of issues, the very concrete kinds of issues that were talked about earlier today, are there any recommendations of specific actions you think we should take in the policy committee? This is to everyone on the panel.


James Walker – Geisinger Health System – CHIO
I think it would be useful to take a reasoned stand on making reporting of hazards at least, and probably near misses and harm, protected under a patient safety organization. It‘s pretty clear that if you want people to report those sorts of things, they‘re going to need confidentiality. Now one of the advantages of hazards, as opposed to near misses and accidents, is that hazards haven‘t happened yet, and you can kind of brag about hazards that you found and mitigated. One of the virtues is they‘re easier to report, but I still think we‘re going to need a clear and firm framework of confidentiality.

William Munier – AHRQ/HHS – Director CQIPS
I didn‘t know he was going to say that, but since he said it, we would support use of PSOs. That would inherently mean use of common formats, and we would obviously support that. We‘re trying to harmonize the world of patient safety reporting so that people use the same format for how they report events. One detail that I neglected during my general comments was the fact that, in particular at the urging of Dr. Classen, we now have a simple question on our common formats. We have a device specific format, but that doesn‘t get into the IT specific nature of it. And we do have a simple question, in addition to the generic formats and the device specific ones, all of which add together to give rich detail about what happened. We also have a question about whether IT was involved in the incident, but we plan to develop an IT specific format that will get in greater depth into exactly what went wrong involving IT.


It is complex because obviously it can be all IT. It can be no IT, or it can be a mixture of IT and other things like training and use and so on and so forth that propels a given event to unfold the way it does and injure a patient the way it does. And that‘s why we like the idea of integrating it. But we do need to get more specific about IT, and some of the work that Dr. Walker and others have done will be helpful to us as we do that.


Jeff Shuren – FDA – Director of Center for Devices & Radiological Health
I think reporting is very important, but we can‘t lose sight of the fact that reporting is a tool. It‘s not a solution. What kind of reporting we‘re talking about depends on its purpose. I think it‘s really about getting a better understanding of what our experiences are with different technologies, and based upon that experience, how we can actually improve them or address weaknesses, deficiencies, or problems to try to protect patients and improve the benefits from those technologies. And you‘re probably talking about using a variety of different tools, not just a simple report.


I‘ll give it to you from an FDA perspective. We, for reporting, we have requirements that go to manufacturers of what they would tell us. We have requirements that go to user facilities of what they would tell us. Those are simple reports. They‘re good for red flags.

But on top of it, we have voluntary reporting to add to that, where anyone can report. Patients can provide it. Clinicians can provide it. We set up a medical device surveillance network where we work with 350 facilities, where they are trained in actually looking for problems, getting to the root cause, and reporting. It‘s a very collaborative relationship where we and the facilities try to find out the sources, and we drill down even in the institution, not just the high level. We get down to the department level. We‘re getting down to communities. We‘re getting down to physician practices, all to understand the use of the technology and what the problems are.


Then lastly is looking for problems through active surveillance. We are actually doing data mining with more sophisticated analytical tools, and that's an area that we‘re going into as well. And it‘s the combination of all of these that can help you identify problems, help figure out what the causes are, and what the best solutions are for moving forward. So, from our perspective, I would really think about, you need to bring a lot of things to the table. The PSOs have a role to play. I think FDA has a role to play. I think a number of different entities at the table and who have spoken all have a role to play. And it‘s only together that we‘re going to strike that right balance about trying to do the best for patients, for safety, and for benefit, at the same time allowing that environment for constant iteration and innovation….


Edward Shortliffe – AMIA – President & CIO
I‘d like to make a brief comment. As I indicated, I feel I can‘t preempt an internal process that‘s going on in AMIA right now, but I would like to say that I think there would be consensus supporting a point that was made in this morning‘s first session, about which there‘s been a lot of discussion, and that is the importance of recognizing that certification of a product does not address the general issues of the certification or assessment of an implementation of that product. There would be broad AMIA recognition of and support for the notion that assessing actual implementations, and the safety considerations that went into the way in which a system was actually implemented, is an important part of the process.


Latanya Sweeney – Laboratory for International Data Privacy – Director
Sort of a follow-up to that, recognizing, as Dr. Shuren also pointed out, no one thing is a remedy unto itself, but in the space of where we are in our processes, we‘re in this meaningful use comment period that Dr. Blumenthal mentioned earlier. What are your thoughts about whether or not one of the meaningful uses should be basically providing an automated reporting function within the EHR so that it‘s easy? An incident happens, and should the technology make it easy to help provide that report?


James Walker – Geisinger Health System – CHIO
Yes. Our experience is that the feedback from clinicians, which they are fairly happy to give, is seriously limited by the time it takes them. No clinician can stop seeing patients when they come across something, and call up and wait for one of our production support people to come on the line, even if it only takes two minutes. And even if they do, they don‘t feel like they have time to go through the ins and outs of the incident.


And so, it would absolutely be a huge benefit if every health IT product had a button on every screen that just said, I need to report trouble, and that automatically captured, you know, whatever, the last few screens and the last 20 or so keystrokes, and gave the person an opportunity to write a little narrative if they wanted to. But then, that could go to the team that would have the expertise to think it through, and then go back to the clinician if they needed to, to get the extra information to really understand it. That would get us a long way down the road.
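A minimal sketch of the "trouble button" idea described above, assuming the application keeps a small rolling context of recent screens and keystrokes that a single click can snapshot. The class and method names are invented for illustration, not any vendor‘s actual API:

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class TroubleReporter:
    """Rolling capture of recent context, snapshotted on one click."""
    # bounded deques: old entries fall off automatically
    screens: deque = field(default_factory=lambda: deque(maxlen=5))
    keystrokes: deque = field(default_factory=lambda: deque(maxlen=20))

    def record_screen(self, screen_id: str) -> None:
        self.screens.append(screen_id)

    def record_key(self, key: str) -> None:
        self.keystrokes.append(key)

    def report_trouble(self, narrative: str = "") -> dict:
        """The one-click report: recent screens and keystrokes,
        plus an optional free-text narrative from the clinician."""
        return {
            "screens": list(self.screens),
            "keystrokes": list(self.keystrokes),
            "narrative": narrative,
        }
```

The clinician‘s only obligation is the click; the captured context goes to a support team with the expertise to analyze it, which is what keeps the reporting cost near zero at the point of care.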


William Munier – AHRQ/HHS – Director CQIPS
What I would say about that issue is that ideally, what you said would be exactly right: there would be requirements built in for embedding the measures in the electronic health record so that you could do safety and even quality reporting. There would be all kinds of things you could do.


I think the issue right now is what you would require be built in. And if everybody builds in their own measures, then you‘ve just perpetuated the kind of chaos that I documented earlier. Other than what we‘re doing for hospitals right now and a few other examples, there aren‘t really very many widely accepted measures. There are some, but what I would say is I think you would want to deliberate very carefully on what it is that you require be built in, or it could actually make the problem worse.


Edward Shortliffe – AMIA – President & CIO
Our experience too with reporting is that if you want to get reports from a lot of people, you have to make it easy and simple, and so building it into the HIT would be very important. And you do need to have standards. I mean, when we have reporting, we use a Health Level 7 standard, so it can be used across the board by others, and we can collate data.


James Walker – Geisinger Health System – CHIO
Just to follow up, just to show how hard this problem is, we did exactly this, and we had an unintended consequence and a medical error because of it, so we put in a feature that allowed doctors to report HIT medical errors, and the error was that the doctor put a medical order into the medical – the HIT error part, thinking that the nurse was going to pick it up as an order, not understanding what that thing was. So, you know, it‘s just third order errors can occur, so we fixed it.


William Munier – AHRQ/HHS – Director CQIPS
Actually, you jumped all the way to the devastatingly absurd example, but I was going to say, in between that, but in building on what you‘ve said is that, generally, people providing care want to do it as quickly and as efficiently as possible. Productivity is very important. And you can build measures in that are built on the clinical information. It‘s been the failure of administrative measures. They were built for billing, not for measuring quality. So now we have the opportunity through electronic record to actually collect clinical information that you get at the point of care, and it‘s essentially free once you collect it if you‘re using an EHR.


But there are certain parameters that you need to collect in order to be able to segment populations, make sure definitions are uniform, establish denominators, and so on. If you want to have good measures, those actually add a little bit of time and do slow down productivity. So that is another issue, and that‘s always a tradeoff. The better the measure, the more extensive the look, the more productivity is hurt. And so that‘s a tradeoff that the regulators should probably not be driving, but following, I would think.


James Walker – Geisinger Health System – CHIO
Yes. Just quickly, if I were designing it, all the clinician would need to do is click the button. If they wanted to write a narrative, they could, and then the team‘s responsibility would be to get back to them. Often, the issue is obvious. Sometimes they would have to get back to the person and talk with them.


Edward Shortliffe – AMIA – President & CIO
I think you‘d have to be concerned about an approach to this problem that depended solely upon capturing mouse clicks, however. I mean, I really agree with the notion that it would be great to have a single button click that somehow captured the scenario at a moment in time that could then be analyzed by staff and later contact the clinician if necessary. But I think the reality is that lots of things happen in busy environments where people will not take the time to record it that way, and you shouldn‘t rely on that as your sole way of finding out when there are problems.


Latanya Sweeney – Laboratory for International Data Privacy – Director
All right. Let me open the floor up to questions. I see Paul. Actually, I skipped over Paul. Yes, which Paul?


Paul Egerman – eScription – CEO
Thank you. This is Paul Egerman. I have a couple questions. First, for Jim Walker: I was going to say, I‘m very impressed with what you wrote about the hazard analysis. I thought that was all terrific. I was trying to understand one of your comments as we talked about reporting. Fundamentally, when I talk about reporting, I‘m thinking about reporting outside the organization, so reporting some information to somebody, the FDA, a patient safety organization. I thought I heard you say that, in order to make it happen, it should be done confidentially. But that seems a little inconsistent with the idea of transparent, scientific analysis. Shouldn‘t there be two things? Shouldn‘t there be some way to do these things confidentially or privately, but also a separate public approach?


James Walker – Geisinger Health System – CHIO
You know, in the first place, at the highest level, the universe of reports, deidentified, would be fully available to whoever had a right to them. I don‘t think it would do the public much good, but it certainly would help researchers and reporters and policymakers.


The reason we conceived of it as being, in the first instance, a tool for the healthcare organization is because very few organizations have the wherewithal to create a system like this themselves. And so there would be a usable system where it was easy, relatively easy, to record hazards that are identified, in a structured way, along with narrative, and to use it as a way of tracking the history of those hazards. You know, are we expecting something from the vendor in June, and it‘s July now, and we haven‘t gotten it? And couple that, so that there‘s an incentive for the organization to use it.


And remember, internal reporting is a critically important issue. In a healthy system, it is probably the most powerful form of reporting, to the extent that organizations are motivated to improve their internal safety. But then it would be provided in increasingly deidentified ways to other audiences that can benefit from it.


I mean, there are clearly compromises with this. There‘s data lost. There‘s data lost in any reporting system. But it‘s an attempt at least, and we‘re about, we think, to get funding to beta test it. We have 500 hazards built into it right now that we sort of used as the alpha test. But we have about 15 organizations that are interested in using it, again, primarily internally for themselves, but will be testing usability and usefulness and whether we can extract the information in a way that looks to them like it‘s confidential and acceptable.


William Munier – AHRQ/HHS – Director CQIPS
I‘d just like to add that what Dr. Walker said about internal reporting, I think, is as critical an issue as there is, because if you think about care being delivered today, and tomorrow you want it to be safer and better quality, who is going to make the changes? It‘s certainly not the Feds sitting at the table here. Hopefully what we do enhances and facilitates that improvement, but it has to be done by the people delivering the care at the bedside and in the doctor‘s office, etc. So the internal reporting systems are critical, and when we‘re putting out our technical specifications for the common formats, we intend to include local specifications so that, for instance, the specifications will be there for vendors so that there can be local reporting of a single event, as well as aggregate events, at the local level before they ever transport it on up to their PSO or what have you. So I completely agree that that‘s where the change has to happen. That‘s really important.

In the tension between public reporting and confidential reporting, that‘s a bigger issue than we can probably get into today. It‘s a very hot topic these days. Understandably, the public wants to know everything, and probably in an ideal world would, but providers who know that what they report is going to be used to sue them or put on the front page of the newspaper don‘t report. If you look at some of the states that have mandatory public reporting, the number of events per year is in the hundreds, if that. If you look at Pennsylvania, which has a confidential reporting system, they‘re up to about 200,000 events on an annual basis now because they‘re reporting in confidence.

The Institute of Medicine, if you go back to a report they did on patient safety a number of years ago, recommended a mix of voluntary reporting that would be confidential and mandatory reporting that would be public. And we‘ve kind of de facto evolved that way: more than half the states in the country now require state reporting of very serious events. And we have had to work in the PSO program to allow a mechanism for those to go on to the states without getting into the protected space of the PSO, which will be voluntary and include things like near misses that aren‘t serious. As for your question about how you walk this line, one answer is the mandatory state reporting of very serious events and the voluntary reporting of everything else, including incidents, near misses, and unsafe conditions.
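The mixed regime described here, mandatory public state reporting for very serious events layered on top of voluntary, confidential PSO reporting of everything, amounts to a simple routing rule. The severity labels below are illustrative, not the actual common-format taxonomy:

```python
def route_report(severity: str) -> list[str]:
    """Decide where a safety report goes under a mixed reporting regime:
    everything (incidents, near misses, unsafe conditions) flows
    voluntarily and confidentially to the PSO, and very serious events
    are additionally subject to mandatory, public state reporting."""
    destinations = ["PSO (voluntary, confidential)"]
    if severity == "serious-harm":
        destinations.append("state (mandatory, public)")
    return destinations
```

The point of the design is that the confidential channel sees the full volume (the Pennsylvania-scale hundreds of thousands of events), while the public channel carries only the small subset serious enough to warrant mandatory disclosure.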


Paul Egerman – eScription – CEO
Thank you. That‘s very helpful. I have sort of a separate question for Jeff Shuren, which is, a number of people in this morning‘s presentation, when they talked about FDA regulations, said it would hinder or hamper or throttle innovation. In fact, one person said that they have something that‘s currently regulated by the FDA and takes nine months to get a response. And so, I‘m curious if you would like to respond.


Jeff Shuren – FDA – Director of Center for Devices & Radiological Health
Sure. I won‘t go into the particular technology discussed this morning, only to say that what‘s at issue there is a little bit different than some of the other things we‘re talking about today. But as to the claim that we‘re stifling innovation, here‘s where the concern is, and it‘s a legitimate concern, and that is on the premarket review side. So what I sort of laid out for you were three different approaches that build on each other.


You can think about the first one, on the postmarket side: everything is going out on the market. It‘s being used. User facilities are making changes. They‘re working with the vendors, and this is about sort of gathering information and feeding that back. And then, if there‘s a really big problem, being able to use the authorities the agency has to help address it. That‘s allowing that back and forth and that innovation.


There‘s a next level that then gets to the manufacturers, ensuring that as they‘re designing technologies, they have good quality assurance practices in place. Again, they‘re making the technologies. They‘re putting them out there for user facilities. User facilities are making changes, and there‘s the back and forth. Both of those are more like a lighter touch.


The third one, where there‘s premarket review, is what gets most people very concerned, and that‘s where, for medium and higher risk technologies, before they come on the market, they‘re coming to FDA. I will tell you, for us, that would be unbelievably challenging. We talked about unintended consequences. Well, this would be intended consequences, because our models for premarket review do not fit well at all for these technologies, and that‘s why we‘ll put it out there as an approach. But I‘m telling you, that last approach with premarket review gets other people concerned. It actually gets us concerned as well.


Paul Egerman – eScription – CEO
Are you saying that your other two approaches do not hamper innovation?


Jeff Shuren – FDA – Director of Center for Devices & Radiological Health
Done appropriately, no. No. In fact, if anything, we think it enhances it because it gathers information and feeds that back into the system. I 100% agree that you want facilities to gather information, look at it themselves, learn from that experience, and deep dive into root cause analysis. That‘s exactly what we do with our Med Sun hospitals, to have them do that.


By the same token, you want to learn from that experience, and you want that knowledge to get back to not only that one vendor. You want that information shared with the user community. You want that shared with other vendors. This was, I think, a little bit about the dialog of the top down and the bottom up, and they‘ve got to drive each other. That‘s exactly what we need to do.


James Walker – Geisinger Health System – CHIO
I agree entirely that the highest level needs to be available at the national level. It really should have dedicated analytics, which you also said, because it‘s going to be a large enough amount of information that, raw, it will be useless to anyone. It will require something like ISMP, where someone goes through the literature and analyzes it and comes up with big messages that really are important lessons for everyone to learn, and for everyone to build either into their product or their configuration of their software.


Paul Egerman – eScription – CEO
Thank you.


Paul Tang - Palo Alto Medical Foundation - Internist, VP & CMIO
Thank you. I think that‘s a perfect segue into what I want to try to propose as more concrete in terms of approaches. We talked in the first panel about how important iteration was to continuously improving the safety of our systems. In order to do that, we need a conduit of data. We need the analytical tools that Jim just mentioned, and we need the commitment to do something about it.


In the second panel, we talked a little bit about the commitment. I‘m not necessarily sure I‘m happy with … but let‘s move on to the data and analytic side, and particularly the federal role in doing that. So it seems like, one, we need reporting using common formats and structure. Second, we need protection for at least part of it. We talked about the mandatory and the voluntary. Third, we need the learning, and that‘s where the analytics come in, and even, like the NTSB, going out to the field to figure out what really happened. Here‘s my symptom or outcome; let‘s figure out what caused it.

That's a very labor-intensive process, and then we need dissemination. Of all those things that we need, what's required by government or regulation and so on? Question one is: does the PSO construct, as in federal statute, cover the things we need from that point of view? Does it have the protection, etc., and the common formats? Then, on the other side, more on the FDA side or other agencies that would have jurisdiction: we do have to have a mandatory part of it, and we do have to have the protection. What role does the FDA have to play? Is it basically role number one, as you described, the post-marketing surveillance? This is what I want to hear about: who will provide the analytics and the NTSB function? How do we recreate that sort of system of getting the data, analyzing it, and making the system better?


James Walker – Geisinger Health System – CHIO
Since you mentioned the PSOs first, I'll answer that. You actually gave a very elegant summary right there; it really put the different pieces together very well. PSOs do provide the protection for the reporting, and they complement the mandatory state reporting systems where those exist.


By the way, we're convening a meeting with states in the next couple of months to talk about how we can dovetail, because they're all using different formats as well. One thing I did want to add to your paradigm is the fact that, once deidentified, the PSO data does come into a national database, called the Network of Patient Safety Databases, and there are analytics that can be applied there, and then that can be disseminated. What we don't know yet is this: there are very strict deidentification rules, and we don't know whether those conflict at all with the clinical detail, the richness, that we hope to have. On that axis of confidentiality versus clinical detail, how that works out, and also whether the mandatory reporting to FDA and to the states is as smooth as it can be with the program or whether that needs some adjustment over time, we're not sure. We play a piece, but only a piece of course, in this whole puzzle.


Paul Tang – Palo Alto Medical Foundation – Internist, VP & CMIO
My understanding, summarizing what you said, is that if there were EHR safety reports, a PSO could help with that because it has a lot of these attributes. Is that—


James Walker – Geisinger Health System – CHIO
Absolutely, and when we do a common format specifically for IT problems, then everybody will be collecting those problems the same way, and that information, once deidentified, can be shared publicly.
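As a concrete illustration of the deidentification step just described, a structured safety report might have its direct identifiers stripped before it leaves the facility and flows into a shared database. This is only a hedged sketch: the field names are invented for the example and are not the actual AHRQ Common Formats schema.

```python
# Illustrative only: strip direct identifiers from a structured safety-event
# report before sharing it beyond the reporting facility. The field names
# below are hypothetical, not the actual AHRQ Common Formats.

DIRECT_IDENTIFIERS = {"patient_name", "patient_mrn", "physician_name"}

def deidentify(report):
    """Return a copy of the report with direct identifiers removed,
    keeping the clinical detail (event type, software, narrative)."""
    return {k: v for k, v in report.items() if k not in DIRECT_IDENTIFIERS}

event = {
    "patient_name": "Jane Doe",
    "patient_mrn": "12345",
    "event_type": "medication_alert_failure",
    "software": "CPOE module",
    "narrative": "Interaction alert did not fire for two ordered drugs.",
}
shared = deidentify(event)
print(sorted(shared))  # ['event_type', 'narrative', 'software']
```

The open question raised in the testimony is exactly how much of the "narrative" richness survives once the strict deidentification rules are applied.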


Paul Tang – Palo Alto Medical Foundation – Internist, VP & CMIO
Then to make it happen, to get a reliable source of data, would we need FDA to intervene and put in place, say, option one?


M

No, you can speak ….


James Walker – Geisinger Health System – CHIO
Well, we always need the FDA.


M

I knew it would be a good lead-in, so I went ahead—


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
Yes, option one or option two gets you those pieces, and they are complementary in terms of what we would do. The other reason for FDA, what we bring to the table in addition, is this: each user facility has whatever technologies it's dealing with and can learn from them, but even if that experience goes back to the vendor, that's a silo with just that technology. What FDA brings is that we see things across all the manufacturers and technologies; we bring that kind of experience and can connect the dots, so that's another piece.

On the analytics, there are a lot of smart people in government and a lot of smart people outside of government, and what we need to do is think about taking this information and having people both inside and outside government go through it. Obviously, we need protections; we're not going to put out the names of patients or physicians. But the way we would view our analysis of the reports we're getting is now to also leverage expertise that's outside of the FDA as well. That's the model that we would like to pursue.


Paul Tang – Palo Alto Medical Foundation – Internist, VP & CMIO
That's very helpful, but it seems we would need the FDA in order to get the data to flow. Is that correct?

Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
That's right, for certain data. The mandatory part, for us: it's mandatory for a manufacturer when you're dealing with a death, a serious adverse event, or a malfunction that, if it recurred, would reasonably likely result in death or a serious adverse event. We use our surveillance network, where participation is purely voluntary, for getting at the near misses, and they're very good at that, which is why this is a complementary approach. Then the PSOs are another layer on top of that.


Latanya Sweeney – Laboratory for International Data Privacy – Director
All right, thank you. Steve.


Steve
Actually, Paul just started a very concrete and productive line of questioning, and I was going to do something different, so, Jodi, if you wanted to pursue this, I can wait.


Latanya Sweeney – Laboratory for International Data Privacy – Director
I should just tell you, from working with Paul over these last many months, that's the one thing I've learned about him: he takes a very complicated area and comes up with these great summaries, so we'll all take a moment of pause.


Jodi Daniel – ONC – Director Office of Policy & Research



Steve
I was actually going to make the observation that, increasingly, health IT is happening outside of the healthcare system and moving into the home and into people's lives, whether it's services like HealthVault and Google Health, the thousands upon thousands of iPhone apps that focus on health, heart rate monitors, or the computer on the elliptical that you might use. I'd also say that in some cases the line between provider health IT tools and patient health IT tools is starting to blur as well, and I just wanted to open it up to the panel to reflect a little bit on how these trends affect the discussion we're having about safety and how best to ensure it.


M

You're right. We're trying hard to blur them. We have lots of tools where patients and clinicians use fundamentally the same tool, and it feeds the same database. I think it's a matter of locus of responsibility. One of the things you have with healthcare organizations is a fairly stable, fairly adequate acceptance of responsibility for what happens to the patient. One of the challenges, at least, is to create that same kind of locus of responsibility with the newer applications that aren't in a healthcare organization in the traditional sense. I think that's a very big, very hard issue.

M

It's a great question, and it's something we're looking at. Whatever technology you put out there, the deciding factor shouldn't be whether or not it's on an iPhone; it's what that technology is. We've got to think about it from the patient perspective and what we're doing to patients, and that's really the way to approach it.


I'd add something else to be thinking about. We talk about HIT, but as the world is changing, HIT, however people define it, is also interfacing with other technologies, and that is making it much more complex. From an FDA perspective, things that are very traditionally devices are now starting to talk to one another, talking to one another through HIT platforms, and now affecting the performance of what people would consider traditional medical devices, and we ought to think about that as well.


Steve
I can't resist raising the reporting issue again. Reporting isn't everything, I agree very much with that, but it's a start. You can't fix what you don't know, as Jim … always says. When you get into devices that are essentially IT devices around the patient, you raise the issue of patient reporting, and we just led a fairly substantial research study dealing with patient reporting.


Come to find out, there really isn't a whole lot known about it. In terms of continuity of care, there are some things that the patient knows that nobody else knows, and if the patient can't report, that knowledge is lost; so, as good as the provider reporting systems are, they're missing something. Yet very little is known about how to do effective patient reporting, from a whole number of standpoints. It's not the same as provider reporting: questions have to be worded differently, and there's a whole series of issues about compliance and so on. But we are looking into that, and there are a lot of people who are going to be interested to see what we find out.


M

Just to reinforce that point, we recently had a fascinating presentation in a meeting by one member of our community who's very HIT savvy and who was just well enough during an acute care hospitalization to really pay attention to what was happening to him, in particular noting interactions between the nursing system and the HIT environment in that particular institution. He is now writing up these examples, fascinating ones, some of which the typical patient might not notice and some of which they would.


The question is, what can you do with that information once these incidents happen to you? How can you be constructive about it? He wasn't out to sue anybody. He just said, I'm noticing some problems in the way this environment is actually being used; what do I do? He found that just telling the nurse about it didn't necessarily get a good response, and he felt there had to be another way to get these problems addressed. They were real design issues. I could give you an example if you wanted, but the main point is that a knowledgeable observer on the patient side often will notice things that no clinician would, because they're not in the room, no physician would at least.


Latanya Sweeney – Laboratory for International Data Privacy – Director
Okay, Jodi.


Jodi Daniel – ONC – Director Office of Policy & Research
Thank you. My question is for Jeff Shuren. You had laid out three different potential approaches the FDA could take, and you mentioned that the third approach, premarket approval, would be somewhat challenging with respect to health IT. I was wondering if you could talk a little bit about the second option, the quality system regulations: how do they align with health IT, and do you see that as a good fit or a challenging one? My other, related question: we heard this morning about the insufficiency of certifying the safety of a particular product up front, because it's implemented very differently in different environments and different settings and there might be some customization, so I want to know if you had any thoughts about that.


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
The second approach starts to factor in manufacturing quality, and the medical device approach is something we call quality systems. This morning Cerner mentioned that they're following the … and then quality systems. It's really a flexible approach to manufacturing that looks at the processes you put in place, adapted to your individual manufacturing approach: you're going to look at and try to assess where the risks are going to be in your manufacturing.


What do you put in place to reduce those risks? What process do you have in place to monitor whether there are going to be problems in manufacturing because you make changes? Think about software: you are changing things …. What are you doing to actually assess what that impact may be, for example, on performance? Then, when you identify a problem, do you have good procedures in place to take corrective action? And if you get complaints, are you actually getting them, analyzing them, and doing something about them if you find a problem?


That's what quality systems is about. For any manufacturer you would think, well, shouldn't you be doing that anyway? That's our approach and our perspective, and putting quality systems in place just says, all right, let's make sure everyone is in fact doing it, not just the people who say they're doing it.

Jodi Daniel – ONC – Director Office of Policy & Research
Thank you.


Latanya Sweeney – Laboratory for International Data Privacy – Director
I think I skipped you, Carl, sorry.


Carl Dvorak – Epic Systems – EVP
I put my thing down, and then I put it back up. One of the questions, Jeff, that I had for you: in preparation, since we don't do anything that's FDA related ourselves, I went and talked to a few customers, and I was surprised by the emotion behind their comments with regard to the blood bank. There was definite and independent corroboration from a couple of different perspectives here, but they felt that the blood bank software had been very much stifled and very much nonresponsive to even addressing known safety concerns once they had been identified. I just wasn't prepared for the emotion behind those comments when I reached out and talked to those people.


Is there any method, or is there any data set, that you use to decide whether you're actually having a net positive outcome, or even if not a net positive outcome, whether you're actually producing an outcome where there are fewer errors, for example, per thousand lines of code? Is there any method to evaluate or monitor what you do, and did it turn out to be worth it? Was it actually a net positive at the end of the day?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
First, let me say that blood banking is handled by a different part of the FDA than the center for devices; that would be our center on biologics, so I can't speak to the specifics of whatever people were reporting to you. But usually, when there are questions or people are describing that kind of experience, the disagreements they're talking about are often over premarket review of the software. I think what we're discussing here is possibly moving forward without that premarket review component, with a postmarket, or postmarket plus manufacturing quality, approach.


But just generally, in terms of what the impact of premarket review is: in those cases where we have applied it, with what people would consider more traditional medical devices, absolutely we think there's a benefit. That review is really the look to make sure we have a safe and effective medical device. If I told you I had a heart valve and I just put it on the market, and no one bothered to look at whether it was safe and effective, would there be value in taking a look? I don't think anyone would say no; of course you'd want to do that. You would never skip it.


The same would apply to software in a number of cases, too. Think about your ECG, with software built in that's doing the analysis. If it's wrong, and it pops up, and someone who's not otherwise really good at reading an EKG is reading that, do you want to be wrong on the heart attack that patient is having? … apply it appropriately.


Carl Dvorak – Epic Systems – EVP
Yes, I do agree. I guess my question is does the method and manner in which you do those checks actually produce an outcome that can be measured? Is it actually making it better?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
If you were talking about traditional premarket review, you wouldn't know, in the sense of: what are you comparing it to?


Carl Dvorak – Epic Systems – EVP



Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
Your baseline would be: I put it out there, and I didn't check it. Remember, all the checking means we've made sure it is safe and effective; if it's not, we wouldn't let it on the market in the first place, so you wouldn't necessarily have that experience to compare against.


Latanya Sweeney – Laboratory for International Data Privacy – Director
Joan.


Joan Ash – Oregon Health & Science University – Associate Professor
Thank you. I‘d like to explore this idea of certification of implementation or tracking of implementation. I‘d like to know what all of you think about that, and give us some ideas about how that could be done.


M

We took the Leapfrog test, the arc test of order entry, and found it challenging and educational, and we passed, so we think it's an okay test. Seriously, speaking as a little bit of an education theorist, it's a well-designed test. It addresses a lot of the most important things in a reasonable way.


David's still here. David is working on creating a tool that would run all the time in a system like ours, or anybody else's EHR, essentially running that test all the time, unannounced, on test patients, sending unsafe orders and that sort of thing. That would just be reported internally, but we would be able to say: okay, given this new material that David's put into the test, given a larger population of challenges than we had when we took the official test, how does this system stack up? That would give us the opportunity to fix things as we go, not every six months or every year or however often, and it would also make the test less of an ordeal, particularly for smaller organizations. We had a team of 12 people that took the test; a lot of people have three people in their whole IT department.


It would mean that when you came to taking the test, you'd be confident, if you had been doing your homework, that you were going to pass it, because you were passing the automated constant checker. I think that's a beautiful example of formative assessment followed by summative assessment that really should be win-win. The effect of that system should be that patients are in a safer environment all the time and that healthcare organizations have a very fair opportunity to pass the test and go on about their business.
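The always-running checker described in this testimony could, in spirit, look like the following minimal sketch: a harness that submits known-unsafe test orders and flags any order whose alert behavior does not match expectations. The order-entry interface here is a hypothetical stand-in, not any real EHR API or the actual Leapfrog tool.

```python
# Hedged sketch of a continuously running CPOE safety check: submit test
# orders for a designated test patient and record whether the system's
# alert behavior matched expectations. The order_entry callable is a
# hypothetical stand-in for a real EHR interface.

from dataclasses import dataclass

@dataclass
class TestOrder:
    drug: str
    expect_alert: bool  # an unsafe order should trigger an alert

def run_safety_checks(orders, order_entry):
    """order_entry(drug) -> True if the EHR fired an alert for the order.
    Returns the orders whose alert behavior did not match expectations."""
    return [o for o in orders if order_entry(o.drug) != o.expect_alert]

# Toy stand-in: this system only alerts on "warfarin".
def fake_order_entry(drug):
    return drug == "warfarin"

orders = [
    TestOrder("warfarin", expect_alert=True),   # alert fired: passes
    TestOrder("cisapride", expect_alert=True),  # missed alert: flagged
    TestOrder("saline", expect_alert=False),    # no alert expected: passes
]
failures = run_safety_checks(orders, fake_order_entry)
print([o.drug for o in failures])  # ['cisapride']
```

As the testimony notes, a real deployment would run only against test patients and report results internally, so problems can be fixed continuously rather than at each formal test.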


M

I'd point out that the implication of that approach is that best practices have been defined and have become part of the curriculum, the testing process, and the evaluative step. Right there, I think, there's a huge opportunity for deliberation, discussion, and definition of best practices, especially doing so in a way that still leaves the opportunity for innovation open.


Again, you don't want to be so hidebound with your requirements for best practices that you squash all the great new ideas as well. I would just point out this interaction between whether it's a … or a Leapfrog, because … obviously goes in and even now looks a fair amount at HIT implementations as part of the regular review process. A more focused … test or Leapfrog test and so forth would still presumably have to be based upon some accepted definition of what was in fact acceptable. What were best practices? I point out that that's not yet without debate.


M

Just a quick anecdote about where we are now and why I like the idea of that formative test so much. I was reading Medical Letter this week, I guess, and it was talking about this drug, and it said don't give this drug with other drugs; it'll prolong the QT interval. And I don't know what other drugs prolong the QT interval. How would I? Then I thought, oh gosh, I hope that our system knows that, so what did I do? I sent an email to the person who manages the medication part of the EHR, and I said, "Jane, if you try to order these two drugs, does it fire an alert?" She sent me an email back: "Yes, it does." Well, that's obviously not a very good way to manage safety in any systematic way.

The real problem is that you also shouldn't give this medicine to anyone who's at risk for QT prolongation. Well, that's back to Ted's points. What does that mean exactly? Is that in SNOMED, and could we put it in a patient's problem list so that it would actually trigger if we tried to order the medicine for that patient?

Ted's right: there's a whole ocean of work to do on what best practice actually is, and then, once we knew what it was, whether we have the standards built to support actually running it. But at least this test running would capture a lot of that sort of thing in a way that otherwise is just hit or miss.
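The kind of rule Jane confirmed by hand could be captured systematically. As a hedged sketch (the drug set below is a placeholder, not clinical guidance or any real formulary), a check might fire whenever two or more ordered drugs appear on a QT-prolonging list:

```python
# Illustrative only: fire an interaction alert when two or more ordered
# drugs are on a QT-prolonging list. The set below is a placeholder,
# not a clinical reference.

QT_PROLONGING = {"drug_a", "drug_b", "drug_c"}

def qt_interaction_alert(ordered_drugs):
    """True if two or more of the ordered drugs prolong the QT interval."""
    return len(QT_PROLONGING.intersection(ordered_drugs)) >= 2

print(qt_interaction_alert({"drug_a", "drug_b"}))  # True
print(qt_interaction_alert({"drug_a", "saline"}))  # False
```

The harder problem raised in the testimony remains: also alerting when a single QT-prolonging drug is ordered for a patient whose problem list marks them as at risk, which depends on having that risk coded in a standard such as SNOMED.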


M

I would offer a general comment, which may be so bland as to be unhelpful, but I'll take the risk. General advice for standard setting is that standards are difficult to set; they're often controversial. I've been on both sides of it: I've been the standardizer, and I've been the standardee. When you're writing them, you want to be very thorough, and you feel you have a public responsibility, and they tend to get longer and longer. But I have found an interesting principle to go by: keep the standards as short as possible, and keep them around things where there's general consensus. There's less code to write, less stuff to enforce, so fewer reports to read, and then, when people are out of compliance, there's less argument about whether you ought to be doing what you're doing. The simpler and less controversial you keep it, the more successful it's likely to be.


Latanya Sweeney – Laboratory for International Data Privacy – Director
Okay, Paul and then Marc.


Paul Egerman – eScription – CEO
I think Marc was first.


Marc Probst – Intermountain Healthcare – CIO
It doesn‘t matter. I‘ll go ahead and ask because mine is probably an easy and simple question because it‘s coming from an easy and simple person.

Paul Egerman – eScription – CEO



Marc Probst – Intermountain Healthcare – CIO
Yes, right. There you go. It just seems to me there's a fairly large problem looming out there, and you probably anticipated it. Part of it is the tool, the technology that we build, but more and more of that tool is moving out into the hands of the providers themselves to create knowledge, and that knowledge can be very specific to an individual, and that knowledge is something that makes a huge difference.


I'm with Intermountain Healthcare, and we've actually talked about a few of the things we've done today, but those are happening in very real time, and it's that content piece. How are you going to make sure that's safe? I understand looking at the hazards in the software itself, ultimately, but if there's a clinician out there creating knowledge based on their experience and their interaction with the patient, how are we going to control that, or what can be put in place to help assure that the content in there is actually safe?


M

This is a question that was asked way before computers. It has to do with knowledge in general and when you believe that a source is authoritative. We let you publish pretty much anything you want in books, but we don't consider all books authoritative, and there's a kind of presumption, I would believe, that in the organized dissemination of knowledge that would go on through the HIT world, you would try to make sure that the knowledge being codified is made available only after it has been suitably approved by authoritative entities of one sort or another. It doesn't guarantee anything.


Now we get back to the … intermediary issue of the old FDA regulatory questions on software regulation in clinical decision support. At what point is it the person who decides how to treat the patient who is ultimately responsible for properly interpreting the knowledge in the context of that specific patient at that specific time? There's a lot of training that goes on to try to make sure that clinicians are in a position to do that well, but it seems to me that it's a notion of authoritative knowledge as distinguished from all knowledge. It's the reason for doing Medline searches instead of Google searches for your medical questions; … you get anything with Google.

Marc Probst – Intermountain Healthcare – CIO
Thanks.


Paul Egerman – eScription – CEO
Paul Egerman. I have another question for you, Jeff: thinking about your second approach, which apparently relates to manufacturing quality, how does that concept relate to open source software?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
Well, for open source software, once it's out there, anyone can take it on their own; they're using it for themselves, for their own purposes. In a number of cases, that may not even be something we'd regulate; it depends on the circumstances, even under our purview. It would simply be out there. People would do their own tinkering, and within the facility itself, when they're tinkering, we would have a completely hands-off approach.


As they took the software and made changes to it, we would not get involved. If we're dealing with a commercial vendor who's now putting it out there for lots of different people, then we would look at it in terms of that particular manufacturing; your risk is much greater in that case if there's a problem in manufacturing. By contrast, you should bring it to a more granular level when you deal with open source: something is out there, but then people are taking it for the whole purpose of … making changes to it.


We do that ourselves, by the way, with software. We'll put software out there as open source, really with the intent that it be used for the best purposes, but we're not imposing a QSR requirement on that itself.


Paul Egerman – eScription – CEO
Does that same answer apply to self-developed systems?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
Well, now we get legalistic about where the line is drawn. I think what we're putting on the table is not the legalistic approach, because if I were going to be legalistic, I'd say we're at approach number three. We're really thinking outside of the box, focusing more on what makes sense, and then applying the law accordingly.


The approach FDA would take, even in these cases where there was a requirement to do something, would be to say, and we'd put this out in a more codified fashion, that we're not exercising …, which means you don't have to comply with this requirement; that's different from imposing a new requirement. What we're suggesting is that, even though these requirements are out there, depending on the circumstance we would say, no, we're going to pull back, even though we've not been active here. And if we're going to be active, let's not go all the way. We shouldn't run a sprint; we should think about a marathon, where you pace yourself, steady and slow, because we're going to be in this for the long term.


Paul Egerman – eScription – CEO
I guess I don‘t understand. In terms of manufacturing quality standards, this level two, would a self-developed system have to respond to whatever reporting requirements you put forward?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
The answer is we would not then treat the individual facility as a manufacturer—


Paul Egerman – eScription – CEO
You would not.


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
We would not, and we would not expect that for a QSR requirement. A facility has certain reporting requirements to FDA: the only mandatory reporting requirement a user facility has to FDA is reporting deaths related to the technology, and to the manufacturer, under the law, it's deaths and serious injuries, those two. For all the others, there is no mandatory reporting requirement in that circumstance. It's a much scaled-back reporting requirement.


Paul Egerman – eScription – CEO
Doesn't that approach create a couple of different problems? One: if we want to create a learning environment, we're taking a good chunk of the community out of the data. Does it also create an unequal playing field, in that there are some burdens on vendors that aren't on these self-developed or open source systems? It gets particularly complicated because a lot of these self-developed systems become commercialized; in fact, both of the self-developed vendors who presented here, Intermountain Healthcare and … Clinic, have in different ways commercialized their products.


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
Right, and we deal with this already. That's the federal landscape right now that we apply to all medical devices, and to compensate for the lost data, we have different kinds of tools for getting information. We have voluntary reporting, where the reporting can actually go beyond death or serious injury. We use our surveillance network to get information beyond death and serious injury; that's voluntary, but it's where we've established very close linkages with healthcare facilities.


We also have other tools when there really is a serious problem. We could then (and we haven't talked about this) go back to the vendors and say, look, there's been a serious problem here; maybe you need to take a closer assessment of it, collect this kind of information on the technology, or do a more formalized assessment. We exercise that now, our postmarket surveillance authority, in selected situations where appropriate, completely tailored. What we're saying is that that authority is available to be used wisely if we choose to exercise it.


Paul Egerman – eScription – CEO
But the entire process is based upon vendor reporting, if I'm hearing it right. One question I have is: is there any alternative where the healthcare organization is the one that does the reporting of the incidents?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
I apologize for the terminology. A user facility for us … that's the healthcare facility, so there are some mandatory reporting requirements from the healthcare facility to FDA. There is then voluntary reporting that can occur, and then there is the network we have established, where we either have them send reports in or we put out queries.


In fact, we do that now with different technologies. We can either have them put it in, or say specifically that we're interested in the following: will you go ahead and take a look? They'll run through and look within their systems for issues, we'll have a dialogue, and it will come back. That's why I said reporting has several different layers to it: mandatory reporting for manufacturers, some for healthcare facilities, a voluntary reporting system, a more middle-ground active reporting system, and then active surveillance tools that we're working with.


Paul Egerman – eScription – CEO
Thank you.


M

If I heard correctly, I think you said that for open source FDA would take a completely hands-off approach and that the risk is greater with vendor-provided software than with open source. I wanted to make sure that's what you said. Secondly, if that is what you said, can you share a deeper understanding of why you believe that to be true?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
For open source in general, I think it's one of those areas where we're going to need further discussion, in terms of the circumstances of what's being put out there.

M

By whom?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
Again, what we‘re talking about here, though, is on reporting, talking about reporting.


M

I guess I understand the reporting aspects. My question is does the agency believe that vendor-provided software is inherently higher risk than open source software?


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
It's a fair question. The risk depends on the circumstances, not necessarily on whether it's open source. It really depends on what you're putting out there.

Latanya Sweeney – Laboratory for International Data Privacy – Director
All right, well, thank you very much for those comments, and there was a lot of great discussion there, and I‘m going to turn the mic back over to the chair.


Paul Egerman – eScription – CEO
Thank you very much, Latanya, and we thank the panel. This has been extremely helpful, so I really appreciate it. I think we're running right on schedule, exactly on schedule. … in the software industry are not actually used to doing things on time, so this is a totally new concept for me. Now we are happy to move on to the public comment section, which we are looking forward to. Judy, if you could start that process.


Judy Sparrow – Office of the National Coordinator – Executive Director
—State your name and your organization; the time limit on public comment is three minutes. If you're in the room, please go to the microphone in the middle aisle. Anybody listening on the phone, if you push *1, the operator will queue you up, and if you're listening on the Web, please phone in to 1-877-705-2976. We'll begin with the first lady at the microphone.


: Thank you very much. My name is Trisha Kurtz, and I work for The Joint Commission. We appreciate the opportunity to offer these comments. For those of you who may not know us, The Joint Commission has its roots in hospital accreditation, and today we evaluate and accredit more than 17,000 healthcare organizations and programs in the United States. While The Joint Commission appreciates the benefits of health information technology, we are concerned about potential gaps in the delivery of safe, quality healthcare that occur as a result of poorly selected, planned, or implemented technology.


To mitigate any unintended patient safety consequences, we offer the following recommendations. First, define a standard for minimum requirements for safe adoption of technology, with input from healthcare organizations, vendors, and other stakeholders. Second, create a best practice clearinghouse on patient safety innovations that occur with the use of technology. Third, foster research to identify failure modes for technology, best practices, and outcomes … of the work. Fourth, convene an expert panel of thought leaders to share and collect the experiences of those who have implemented safe solutions. Finally, and I think this is a really important one, design and execute an effective dissemination strategy and vehicles so that all future users will reap the benefits of the collective experience and wisdom of those who have deployed and will deploy safe technology.


We look forward to working with the committee and other stakeholders to ensure safe adoption of health information technology. We provided a written statement too, and I hope the committee members have that statement. Thank you.


Judy Sparrow – Office of the National Coordinator – Executive Director
Yes, they do. Thank you and next in line.


: Yes, my name is Lindsey Hoggle, and I‘m a registered dietician and a health IT consultant to the American Dietetic Association. ADA is the largest organization of food and nutrition professionals in the world with over 70,000 members. The ADA appreciates the … work of ONC in creating an infrastructure for successful adoption and especially this EHR patient safety risk discussion.


We have two areas, addressed in both the NPRM and the IFR, that from our standpoint provide examples open to risk mitigation. The first concern addresses consistent documentation of patient allergies and assuring that these allergies are communicated effectively through all transitions of care. In spite of the general practice of grouping a patient's individual allergies together within EHRs, both medication and nonmedication, both the NPRM and the IFR consistently address only medication allergies. We strongly endorse grouping all allergies together in one location in the EHR, using a standard vocabulary such as UNII, to guard against any unintentional separation down the road.


Of note, patient food allergies are becoming more problematic. Severe food allergies affect between 6,000,000 and 7,000,000 adults and 4% to 8% of children, which is 2,000,000 children, in the United States. According to the CDC, each year there are approximately 30,000 episodes of food-induced anaphylaxis, 2,000 hospitalizations, and 150 deaths due to food allergies.


In a typical clinical setting, the workflow is to inquire about any patient allergy or intolerance and document both the allergen and the severity together in one location. Particularly in the hospital or any inpatient setting, the responsibility for assuring avoidance of patient-identified food allergies lies with the food and nutrition department and the registered dietician. While many patients identify food intolerances, which may affect food intake and/or nutritional status, we consider these to be two distinct categories.


Identification and avoidance of food allergens is a patient safety issue and one that we take seriously. We embrace standardization in allergy vocabulary, coding, and transfer mechanisms. We hope that this will mitigate the unintended negative consequences associated with the potential separation of these allergies in the EHR.


The second issue addresses CPOE in hospitals. Certification criteria to support achievement of stage one meaningful use by an eligible hospital require that the EHR store, retrieve, and manage, at a minimum, 11 different order types. Diet orders are not included in this category. As you might imagine, omitting diet orders in a hospital setting almost always guarantees a unique version of the engaged patient. People like to eat.


In the U.S., less than 10% of hospitals have a departmental nutrition information system. In those that do, HL7 ADT and diet order interfaces are standard practice and help assure efficiency and patient safety and reduce administrative workload. We ask that diet orders be added to the 11 other CPOE minimum order types. Both diet order and food allergy data are utilized by hospital nutrition departments to deliver quality care and are considered part of the treatment regimen.


In order to keep nutritional care within the realm of treatment and quality intervention, diet orders, food allergies, and nutrition intervention should be a part of clinical decision support as well. We thank the ONC Policy Committee for thoughtful consideration of our request and the opportunity to comment.


Judy Sparrow – Office of the National Coordinator – Executive Director
Thank you very much, and Dr. Morris.


: Is it okay for me to make a comment? Alan Morris. I wanted to bring up some issues that I thought might be worth considering as the committee pulls together the summary, and I will try to submit a diagram, perhaps tonight or tomorrow, to help. It occurred to me that the vendors are not likely to be interested in providing any very detailed decision support, by which I mean adequately explicit decision support that deals with complex clinical problems like managing asthma, heart failure, mechanical ventilation, and so forth. I wondered about the vendors instead providing what would function as an infrastructure to interface over the Web with decision support protocols that have been validated, vetted, and demonstrated to be safe, and that might be archived somewhere. I don't know where that would be, but perhaps someplace like the National Library of Medicine might be a place where protocols could be archived.


Then we have a number of other considerations that are quite important. A protocol that works and manages patients well has to be maintained and kept up to date, and the people who would oversee that maintenance are not currently present. It seems to me that the work is most likely to fall to young faculty members who might have an investment in T3 translation and in the modification of protocols. So there are a number of elements that might be viewed as component parts of an integrated system, with a vendor system providing what I would call relatively straightforward or low-level, but not unimportant, decision support: for example, warnings about potassium in somebody who's receiving digoxin or who's about to receive a diuretic. It would not provide complex protocols for management of issues that involve extensive rule-based systems. I'll try to send a diagram about that and hope that it might be helpful to you.

Judy Sparrow – Office of the National Coordinator – Executive Director
Thank you. Dr. Koppel.


: Thank you. If Alan gets to speak, I get to speak. Start the clock. I'm going to be short. Thank you. If doctors, nurses, or pharmacists are about to make a mistake, they stop it. They don't intentionally make mistakes. If patients take a turn for the worse in hospitals, it's not extraordinary. The patients in hospitals are very sick, and bad things continue to happen to them. We don't know 99% of the medication ordering errors that are made. We simply don't know them. If 100% of the known errors were reported, that would still be only 1% of all errors, but the data suggest that the maximum on voluntary reporting is about 5%, so we're talking about 5% of 1%. That is what we know is reported. Voluntary reporting is a wonderful idea, but what does it get us in terms of the true universe of errors? Thank you.
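[Editor's note: Dr. Koppel's back-of-envelope "5% of 1%" figure can be made concrete. If only 1% of medication ordering errors are ever known, and voluntary reporting captures at most about 5% of the known errors, then reported errors amount to roughly 0.05% of the true total. A minimal sketch of that arithmetic, using the witness's illustrative fractions rather than measured rates:]

```python
# Back-of-envelope check of the "5% of 1%" figure from the testimony.
# Both fractions are the witness's illustrative numbers, not measured data.
known_fraction = 0.01      # share of all medication ordering errors ever known
reported_fraction = 0.05   # share of known errors captured by voluntary reporting

visible_share = known_fraction * reported_fraction
print(f"Reported share of all errors: {visible_share:.2%}")  # 0.05%
```

[That is, for every 10,000 errors actually made, about 5 would appear in a voluntary reporting system under these assumptions.]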


Judy Sparrow – Office of the National Coordinator – Executive Director
Thank you, and we do have one caller on the phone, if you would please identify yourself.


Operator
Our question comes from Dr. Scott Silverstein.


: Can you hear me?


Judy Sparrow – Office of the National Coordinator – Executive Director
Yes.


: My questions are very brief. I just wonder, in terms of terminology for error reporting, whether the MedDRA terminology used in the pharmaceutical and medical device industries might be considered. The other comment is that there seems not to have been much attention to the early 2009 report from the United States National Research Council entitled "Computational Technology for Effective Healthcare: Immediate Steps and Strategic Directions." Rather than reinvent the wheel, a lot of wisdom might be reviewed in that report, and that's all I have to say. Thank you.

Judy Sparrow – Office of the National Coordinator – Executive Director
Thank you very much, and I‘ll turn it back over to Marc Probst and Paul Egerman.


Paul Egerman – eScription – CEO
Great. Thanks very much, Judy. Let me ask, do any members of the workgroup want to make any comments or have anything they want to say about today‘s hearing? I don‘t know if there are any workgroup members still on the phone. That‘s great. I‘m sorry. We have one, yes—


Jeffrey Shuren – FDA – Associate Commissioner for Policy and Planning
Sorry, Jeff Shuren, to clarify, because it sounded like I was being evasive, and I wasn't trying to be. On the issue of open source software, I was referring to a very simple system versus something very complex coming from a vendor. If I'm dealing with the same two things, one open source and one vendor-provided, it would likely be safer coming from the vendor because you have a whole different kind of oversight for it, so I want to clarify that point.


Paul Egerman – eScription – CEO
Very helpful, I appreciate that. I also want to thank everybody, thank the public for participating. As I mentioned we have a conference call on March 12 at 11:00. If people want to follow up, that‘s when we will be doing our deliberations, and then our policy committee meeting is on March 17, so thank you very much.
