How Google Glass may have saved a life in a local ER

Since December, four emergency-room doctors at Beth Israel Deaconess Medical Center have been experimenting with Google Glass - colored bright orange so patients would immediately know the glass would see them now.

Dr. John Halamka, the hospital's CIO, describes the pilot project and lessons learned, and quotes Dr. Steven Horng on one particular incident:

I was paged emergently to one of our resuscitation bays to take care of a patient who was having a massive brain bleed. One of the management priorities for brain bleeds is to quickly control blood pressure to slow progression of the bleed. All he could tell us was that he had severe allergic reactions to blood pressure medications, but he couldn't remember their names; he said it was all in the computer. Unfortunately, this scenario is not unusual. Patients in extremis are often overwhelmed and unable to provide information as they normally would. We must often assess and mitigate life threats before having fully reviewed a patient's previous history. Google Glass enabled me to view this patient's allergy information and current medication regimen without having to excuse myself to log in to a computer, or even lose eye contact. It turned out that he was also on blood thinners that needed to be emergently reversed. By having this information readily available at the bedside, we were able to quickly start both antihypertensive therapy and reversal medications for his blood thinners, treatments that, if delayed, could lead to permanent disability or even death.





Google Glass totally over

By on

Google Glass is totally over-hyped, and people in the real world look ridiculous in them. But

When a clinician walks into an emergency department room, he or she looks at a bar code (a QR, or Quick Response, code) placed on the wall. Google Glass immediately recognizes the room, and the ED Dashboard then sends information about the patient in that room to the glasses, appearing in the clinician's field of vision. The clinician can speak with the patient, examine the patient, and perform procedures while seeing problems, vital signs, lab results and other data.

is very cool.

A nit: it looks like one or more people did some non-trivial software engineering to make this happen. Unless it was Dr. Halamka himself, that's a missed opportunity for some attribution.
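For the curious, the room-lookup flow that quoted passage describes can be sketched roughly as follows. Everything here is invented for illustration (the QR payloads, the dashboard data, the function names); the real ED Dashboard integration is not public.

```python
# Hypothetical sketch of the QR-code -> room -> patient flow described above.
# All codes and records below are made up; nothing here reflects the actual
# Beth Israel Deaconess system.

# Map QR payloads to ED rooms (invented codes).
ROOM_CODES = {
    "QR-ED-101": "Resus Bay 1",
    "QR-ED-102": "Resus Bay 2",
}

# Stand-in for the ED Dashboard: room -> current patient summary.
DASHBOARD = {
    "Resus Bay 1": {
        "name": "J. Doe",
        "allergies": ["lisinopril"],
        "medications": ["warfarin"],
        "vitals": {"bp": "210/115", "hr": 96},
    },
}

def heads_up_summary(qr_payload: str) -> str:
    """Resolve a scanned QR payload to a one-line clinician summary."""
    room = ROOM_CODES.get(qr_payload)
    if room is None:
        return "Unknown room code"
    patient = DASHBOARD.get(room)
    if patient is None:
        return f"{room}: no patient assigned"
    return (f"{room} | {patient['name']} | "
            f"allergies: {', '.join(patient['allergies'])} | "
            f"meds: {', '.join(patient['medications'])} | "
            f"BP {patient['vitals']['bp']}")

print(heads_up_summary("QR-ED-101"))
```

The interesting design point is that the glasses never query the record system directly; recognizing the room is the only input, and the dashboard pushes everything else.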

Funded by Creepy Nerdwads

By on

We all knew these things had real, grown-up, critical info applications. Google simply charges the vain big money to be "glassholes" and be seen in them, so that expansion of their use is possible where it counts.

For serious, though, how many

By on

For serious, though, how many people use their cell phones in the bathroom? Do those not have cameras/camcorders? Audio recorders? The only difference is that the Glass is located on one's face even when it's not in use.

It's just another new technology that people need to absorb for a while before it becomes no big deal.


By on

Did you read the story above about the ER doctor and what he was able to do to properly care for his patient in an emergency situation? Daaah, it's not lookin' like it's gonna be "no big deal." Hopefully one day you'll experience the benefit and 'think' otherwise.


By on

But this is worded so poorly: "colored bright orange so patients would immediately know the glass would see them now." Is this supposed to be a pun on "the doctor will see you now," or what? Reading this is making my head hurt.

Google is looking for these vertical markets...

... and seeding Google Glass all over the place to find them.

This technology would also be incredibly useful for any kind of inspector; my dad was an airline overhaul base inspector, and would have killed for Google Glass and a tablet rather than having 30 sheets on a clipboard and a stack of manuals on the base design and all replacements. Note: stack of manuals for each plane, not just each plane model.

I don't need one (I'd qualify as nerdy but too old for a hipster nerdwad); but I can see all the places we need to have instant access to targeted, localized and relevant information.

Biggest jump to adoption

By on

The biggest jump to adoption that I expect will need tackling is context-specific information. If I'm an inspector, then when I'm looking at a 24" flange adjuster, the notes for it should jump forward. I shouldn't have to say, "OK, Glass, 24-inch flange adjuster notes." I know some of that is being worked on (like the automatic bar-code reading it sounds like they're using in the ER), but a seamless "tell me more" mode, where the right app is open and keeps feeding you information on your current task, is important for wider adoption.
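The "tell me more" idea above can be sketched in a few lines: once an object is recognized (simulated here by a string label, since the recognition step itself is the hard part), the relevant notes surface automatically, with no voice command needed. The part names and notes are invented for illustration.

```python
# Hedged sketch of context-triggered notes: when the inspection app is active
# and an object is recognized, its notes are pushed without a voice command.
# Part labels and note text are invented examples.

NOTES = {
    '24" flange adjuster': [
        "Check seal wear before reinstallation",
        "Record serial number against the work card",
    ],
}

def on_object_recognized(label: str, app_active: bool) -> list[str]:
    """Return notes to display for a recognized object, if the app is active."""
    if not app_active:
        return []  # app closed: stay quiet, don't interrupt
    return NOTES.get(label, [])

for note in on_object_recognized('24" flange adjuster', app_active=True):
    print("-", note)
```

The key behavior is the `app_active` gate: context push only happens inside an explicitly opened task, which keeps the device from volunteering information all day.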

But are the glasses really necessary?

By on

As one of the commenters suggests, the glasses could be used to detect the current location and bring up the medical record. Seems like we could do that with a monitor on the wall, no glasses needed. If the issue is voice control of the medical record views, there are other technologies that could be employed to change the view. The example is really cool, but are glasses the answer?