Contemporary Issues in Medical Informatics: Good Health IT, Bad Health IT, and Common Examples of Healthcare IT Difficulties

Operation Aurora And a Widespread Reluctance to Discuss IT Flaws:  Is Universal Healthcare IT Really a Good Idea in 2010?

Some time ago, I wrote that:

... The administration has promised an atmosphere of national accountability and responsibility. Why, then, has it simultaneously employed the coercive force of government (payment penalties for HIT non-adopters after the absurdly short period of five years from now, i.e., by 2014) to push an exploratory medical device of unproven ROI, from an unaccountable industry, onto the medical profession at a cost of tens of billions of dollars?

This reality raises another question as I suggested in my WSJ Letter to the Editor of February 18, 2009. I wrote:

Dear WSJ - You observe that the true political goal is socialized medicine facilitated by health care information technology. You note that the public is being deceived, as the rules behind this takeover were stealthily inserted in the stimulus bill. I have a different view on who is deceiving whom. In fact, it is the government that has been deceived by the HIT industry and its pundits. Stated directly, the administration is deluded about the true difficulty of making large-scale health IT work. The beneficiaries will largely be the IT industry and IT management consultants.

HIT adoption, once a voluntary process, is now being pushed along by government at a frenetic pace. What was voluntary has become a government-driven gold rush.

It was reported in 2009 by the Washington Post that the huge vendor trade organization HIMSS was behind the lobbying and stealth insertion of HIT provisions in ARRA. It would not be a surprise if the HIT provisions in ARRA themselves were written by HIMSS members with conflicts of interest.

Former ONC Director David Brailer is also critical of Congress rushing to put health IT in the American Recovery and Reinvestment Act (ARRA). "It puts more risks on healthcare IT adoption than are necessary," Brailer said.

I agree.

One of these risks - perhaps the most significant one - is the experimental, unpredictable and unreliable nature of IT itself.

Let's look at the very latest IT flaws exposé: Operation Aurora.

Operation Aurora was a cyber attack, conducted in mid-December 2009 and apparently originating in China, against Google and more than 20 other companies, including Adobe Systems, Juniper Networks, Rackspace, Yahoo, Symantec, Northrop Grumman and Dow Chemical.

The attack used 0-day vulnerabilities in Microsoft's Internet Explorer. One target was Google's email service, Gmail. It is not unrealistic to suspect that successful break-ins to that service could have gotten dissidents jailed or killed.

Now, the news gets worse:

Jan. 10, 2010
17-year-old Windows Flaw Affects All Since NT
Tom's [one of the very best IT technical websites - ed.]

We often hear of Windows security bugs that plague a recent version of the operating system that many are still using today, but rarely do we hear of a bug that reaches all the way back – 17 years – to Windows NT.

Tavis Ormandy, a security researcher at Google, discovered a security flaw in the Virtual DOS Machine that can allow a nefarious user to inject code into the kernel and possibly install malware.

Given that all modern versions of Windows still feature the Virtual DOS Machine, this is a vulnerability that still exists today.

Ormandy wrote:

"All 32-bit x86 versions of Windows NT released since 27-Jul-1993 are believed to be affected, including but not limited to the following actively supported versions:

- Windows 2000, Windows XP, Windows Server 2003, Windows Vista, Windows Server 2008, Windows 7"

Microsoft has yet to respond to the flaw, and until it does with a patch, Ormandy recommends the following as a way to mitigate the hole:

[Technical workaround follows - ed.]

Not only do new IT flaws appear daily, but we also need to worry about unknown flaws lurking from operating systems past. C'mon guys ... you're trying to sell universal, interoperable, nationally networked health records under these conditions?

It gets worse still. Not only do these vulnerabilities exist and attacks occur, but (like health IT defects themselves) a shroud of secrecy surrounds these attacks. We as a society do not know how bad the security situation truly is...

Back to the Wall Street Journal:

Jan. 19, 2010
Private Sector Keeps Mum on Cyber Attacks
Companies Are Loath to Disclose or Share Information on Breaches for Fear of Bad Publicity and Loss of Business to Rivals


The biggest surprise to computer-security experts isn't that Google Inc. was targeted by attackers from China. It's that the Internet giant chose to disclose the incident.

Despite repeated efforts by the U.S. government to get the private sector to share information about threats, many companies have long kept such incidents confidential.

"There's a culture of secrecy around any bad news, and data breaches are always bad news," said Larry Ponemon, a security and privacy consultant with the Ponemon Institute. "Organizations don't like to reveal it."

The IT culture appears shrouded in secrecy. The reasons really don't matter all that much to individuals whose information may be at risk. The effect is that nobody really knows just how insecure computer networks are in 2010. One simply cannot assess a pandemic or epidemic if information on the affected is kept secret.

The reticence can apply both to public disclosure of attacks as well as information-sharing among companies and government agencies—exchanges that can help organizations prevent future break-ins.

Google said last week its systems had been breached in a December attack, and the criminals made off with some of its intellectual property and accessed email accounts belonging to Chinese human-rights activists.

People familiar with the attack on Google say as many as 34 companies were targeted, but only two others—Adobe Systems Inc. and Juniper Networks Inc.—have so far publicly acknowledged they were targets.

Other companies that were targeted include Yahoo Inc., Symantec Corp., Northrop Grumman Corp. and Dow Chemical Co., according to Internet-security experts, but the companies have declined to say whether they were targets in this attack.

This means there may be more victims, and the true extent of the damage remains unknown.

... Google said, in a blog post disclosing the attack, that it was taking the "unusual step" of sharing the information "because of the security and human rights implications of what we have unearthed." A spokesman declined to comment further.

... But Warren Axelrod, a research director at the U.S. Cyber Consequences Unit, a nonprofit organization that studies the effects of cyber attacks, said efforts to get companies to share information about incidents in an organized fashion "haven't really advanced" since the late 1990s. Instead, he said, most exchanges take place privately between people who trust one another.

The gaps where close relationships don't exist can create vulnerabilities.

For instance, hackers in late 2008 broke into the systems of Heartland Payment Systems Inc., a credit card processing company, and stole account information belonging to millions of people.

The company disclosed the breach in January 2009, but only because it was required to by law, said Bob Carr, Heartland's chief executive, in a June interview. He added that as many as 300 other companies were targeted by similar attacks but that most have never come forward.


This implies nobody has a truly robust idea of just how severe the IT security issue truly is. Why, then, are we rushing to universal HIT as if these problems are minor?

... Robert Rodriguez, a consultant to Heartland and chairman of the Security Innovation Network—which tries to foster information-sharing between the public and private sectors—said law enforcement officials knew about the attack method the perpetrators used but didn't inform Heartland.

Does a "Keystone Kops" analogy apply?

... Though security experts praise Google for coming forward, some wish it had done so sooner. "In their announcement they said that they've known about this for a month and they are just now getting around to telling people about it," said Daniel Castro, an analyst at the Information Technology and Innovation Foundation.

Perhaps someone at Google has a conscience with regard to previously-mentioned endangered dissidents.

Can it get worse? Of course it can.

I recently purchased a brand-new Intel Core i5-750 CPU-based computer with a motherboard (main circuit board) also made by Intel.

The machine had to be returned. Cooling fan control, a function of embedded software known as the "BIOS," was erratic, and the machine sometimes went into "vacuum cleaner" mode. This was despite - uh - upgrading the system BIOS three times through the multiple bug-fix releases issued in only a few months. I exchanged the computer for a new one.

A replacement exhibited the same problem and was also returned. I did not intend to find out whether other bugs were lurking that could compromise my data or my security. (In fact, for good measure, the replacement also had a nonfunctional USB port, which I traced to a defect on the Intel mainboard by swapping cables to an unused but working internal USB connector header.)

Any time a new IT technology appears, it seems bugs are the order of the day. So, not only do we need to worry about bugs from 17 years ago, we need to worry about new ones, constantly.

Can it get any worse?

Unfortunately, it can...

Being an inquisitive computer professional, I find computer hardware of interest (dating back to the early 1970s, when I pulled flip-chip circuit modules from a transistorized Digital Equipment Corp. PDP-8/S computer to see which machine-level instructions were affected).

I looked up the technical data on the Intel Core i5-750 processor, the one advertised in full-page color ads in the WSJ and elsewhere as "adjusting to the user's needs for speed" or similar puffery. (Yes, one of the i5's cores ramps up from 2.66 to 3.2 GHz when necessary, so I can complete my Word documents 1.2 times faster. Wow.)

The latest Intel technical document is entitled "Intel® Core™ i7-800 and i5-700 Desktop Processor Series Specification Update" and is available as a PDF here.

This "specification update" contains specifications of the known hardware bugs of the new microprocessors themselves! The list is extensive. Some examples (they are technical):

AAN3. Code Segment Limit/Canonical Faults on RSM May be Serviced before Higher Priority Interrupts/Exceptions and May Push the Wrong Address Onto the Stack

Problem: [long technical explanation, see PDF if interested in the convoluted details- ed.]

Implication: Operating systems may observe a #GP fault being serviced before higher priority interrupts and exceptions.

AAN4. Performance Monitor SSE Retired Instructions May Return Incorrect Values

Problem: Performance Monitoring counter SIMD_INST_RETIRED (Event: C7H) is used to track retired SSE instructions. Due to this erratum, the processor may also count other types of instructions resulting in higher than expected values.

Implication: Performance Monitoring counter SIMD_INST_RETIRED may report count higher than expected.

Workaround: None identified.

AAN6. MOV To/From Debug Registers Causes Debug Exception

Problem: [long technical explanation, see PDF if interested in the convoluted details- ed.]

Implication: With debug-register protection enabled (i.e., the GD bit set), when attempting to execute a MOV on debug registers in V86 mode, a debug exception will be generated instead of the expected general-protection fault.

Workaround: In general, operating systems do not set the GD bit when they are in V86 mode. The GD bit is generally set and used by debuggers. The debug exception handler should check that the exception did not occur in V86 mode before continuing. If the exception did occur in V86 mode, the exception may be directed to the general-protection exception handler.

There are a total of 109 such "errata" documenting bugs and flaws at the level of the microprocessor itself.

In summary, then, HIT in 2010 at the applications level is often poorly designed to the point of being outright mission hostile, and is found to be of little value, or an actual hindrance, by many clinicians and researchers; yet it is being pushed as a panacea that will revolutionize healthcare.

Networked IT in 2010 is subject to flaws right down to the level of the microprocessor that executes the instructions provided by programmers, such that I am not sure how programmers can stay entirely ahead of the bug curve. Communications infrastructure is subject to vulnerabilities and exploits, some dating back two decades. And when such exploits do occur, mum is the word, so nobody really knows how severe the problem truly is.

Now, IT industrialists and their government benefactors feel our citizens should entrust their most private information to, and make their care providers dependent on, this experimental and not-quite-reliable technology -- a technology whose flaws remain unknown in extent due to corporate secrecy?

This current HIT movement, this HIT culture, is as alien as can possibly be to the culture, the caution, and the science of modern medicine.

Even as a medical informaticist and IT enthusiast, I am disturbed by this state of affairs.

These are the reasons I believe universal HIT is not a good idea in 2010.

I believe we need to s-l-o-w  d-o-w-n.

We also need to open up: transparency about flaws, and transparency about conflicts of interest among those who have been pushing this experimental technology at such a rapid pace in recent years.