IT in health care has produced modest changes, so far


It has never been hard to imagine how information technology (IT) might improve health care services. Fast messaging replacing faxes. Electronic health records that can be accessed more easily. Software that can inform doctors’ decisions. Telemedicine that makes care more flexible. The possibilities seem endless.

But as a new review paper from an MIT economist finds, the overall impact of information technology on health care has been evolutionary, not revolutionary. Technology has lowered costs and improved patient care—but to a modest extent that varies across the health care landscape, while only improving productivity slightly. High-tech tools have also not replaced many health care workers.

“What we found is that even though there’s been this explosion in IT adoption, there hasn’t been a dramatic change in health care productivity,” says Joseph Doyle, an economist at the MIT Sloan School of Management and co-author of the new paper. “We’ve seen in other industries that it takes time to learn how to use [IT] best. Health care seems to be marching along that path.”

Relatedly, when it comes to health care jobs, Doyle says, “We don’t see dramatic changes in employment or wages across different levels of health care. We’re seeing case evidence of less hiring of people who transcribe orders, while for people who work in IT, we’re seeing more hiring of workers with those skills. But nothing dramatic in terms of nurse employment or doctor employment.”

Still, Doyle notes that health care “could be on the cusp of major changes” as organizations get more comfortable deploying technology efficiently.

The paper, “The Impact of Health Information and Communication Technology on Clinical Quality, Productivity, and Workers,” has been published online by the Annual Review of Economics as part of its August issue.

The authors are Ari Bronsoler Ph.D. ’22, a recent doctoral graduate in economics at MIT; Doyle, who is the Erwin H. Schell Professor of Management and Applied Economics at the MIT Sloan School of Management; and John Van Reenen, a digital fellow in MIT’s Initiative for the Digital Economy and the Ronald Coase School Professor at the London School of Economics.

Safety first

The paper itself is a broad-ranging review of 975 academic research papers on technology and health care services; Doyle is a leading health care economist whose own quasi-experimental studies have quantified, among other things, the returns to increased health care spending. This literature review was developed as part of MIT’s Work of the Future project, which aims to better understand the effects of innovation on jobs. Given that health care spending accounted for 18 percent of U.S. GDP in 2020, grasping the effects of high-tech tools on the sector is an important component of this effort.

One facet of health care that has seen massive IT-based change is the use of electronic health records. In 2009, fewer than 10 percent of hospitals were using such records; by 2014, about 97 percent of hospitals had them. In turn, these records allow for an easier flow of information among providers and help with the use of clinical decision-support tools—software that helps inform doctors’ decisions.

However, a review of the evidence shows the health care industry has not followed up to the same extent regarding other kinds of applications, like decision-support tools. One reason for that may be patient-safety concerns.

“There is risk aversion when it comes to people’s health,” Doyle observes. “You [medical providers] don’t want to make a mistake. As you go to a new system, you have to make sure you’re doing it very, very well, in order to not let anything fall through the cracks as you make that transition. So, I can see why IT adoption would take longer in health care, as organizations make that transition.”

Multiple studies do show a boost in overall productivity stemming from IT applications in health care, but not by an eye-catching amount—the total effect appears to be roughly 1 to 3 percent.

Complements to the job, not substitutes, so far

Patient outcomes also seem to be helped by IT, but the effects vary. Drawing on earlier reviews of individual studies, the authors note that a 2011 survey found that 60 percent of studies showed better patient outcomes associated with greater IT use, 30 percent found no effect, and 10 percent found a negative association. A 2018 review of 37 studies found positive effects from IT in 30 of them, no clear effect in seven, and negative effects in none.

The more positive effects in more recent studies “may reflect a learning curve” by the industry, Bronsoler, Doyle, and Van Reenen write in their paper.

Their analysis also suggests that despite periodic claims that technology will wipe out health care jobs—through imaging, robots, and more—IT tools themselves have not reduced the medical labor force. In 1990, there were 8 million health care workers in the U.S., accounting for 7 percent of jobs; today there are 16 million health care workers in the U.S., accounting for 11 percent of jobs. Over that period there has been a slight reduction in medical clerical workers, who dropped from 16 percent to 13 percent of the health care workforce, likely due to automation of some routine tasks. But hands-on jobs have proven durable: the share of nurses among health care jobs has increased slightly since 1990, for example, from 15.5 percent to 17.1 percent.

“We don’t see a major shock to the labor markets yet,” Doyle says. “These digital tools are mostly supportive [for workers], as opposed to replacements. We say in economics that they’re complements and not substitutes, at least so far.”

Will tech lower our bills, or not?

As the authors note in the paper, past trends are no guarantee of future outcomes. In some industries, IT adoption in recent decades was halting at first and far more consequential later. And in the history of technology, many important inventions, electricity among them, produced their greatest effects decades after their introduction.

It is thus possible that the U.S. health care industry could be headed toward some more substantial IT-based shifts in the future.

“We can see the pandemic speeding up telemedicine, for example,” Doyle says. To be sure, he notes, that trend depends in part on what patients want outside of the acute stages of a pandemic: “People have started to get used to interacting with their physicians [on video] for routine things. Other things, you need to go in and be seen … But this adoption-diffusion curve has had a discontinuity [a sudden increase] during the pandemic.”

Still, the adoption of telemedicine also depends on its costs, Doyle notes.

“Every phone call now becomes a [virtual] visit,” he says. “Figuring out how we pay for that in a way that still encourages the adoption, but doesn’t break the bank, is something payers [insurers] and providers are negotiating as we speak.”

Regarding all IT changes in medicine, Doyle adds, “Even though already we spend one in every five dollars that we have on health care, having more access to health care could increase the amount we spend. It could also improve health in ways that subsequently prevent escalation of major health care expenses.” In this sense, he adds, IT could “add to our health care bills or moderate our health care bills.”

For their part, Bronsoler, Doyle, and Van Reenen are working on a study that tracks variation in U.S. state privacy laws to see how those policies affect information sharing and the use of electronic health records. In all areas of health care, Doyle adds, continued study of technology’s impact is welcome.
