In testing, the system greatly reduced disruption and improved the likelihood of patient survival.
Researchers from the University of California San Diego have developed a navigation system that can move robots safely through hectic emergency departments. Called SafeDQN, it uses computer vision to avoid medical workers who are treating patients in serious or critical condition.
Why it matters
Medical professionals are often overworked, making emergency departments an ideal application for support robots. These autonomous machines could reduce the strain on clinical teams. They could deliver materials, restock supplies, and perform other relatively simple tasks.
The big problem is that emergency departments are not ideal for robots. They are often overcrowded, making navigation challenging. The rooms are also filled with workers operating under heavy cognitive load, making collisions even more likely.
To navigate an emergency department successfully, a robot would need to understand when medical professionals are treating a patient in serious or critical condition. This is what SafeDQN does.
How it works
SafeDQN uses a computer vision algorithm that reads contextual cues in a live video feed. It looks for two indicators that a patient is in serious condition: first, a larger number of workers around the patient, and second, workers moving more quickly.
When the robot observes a team, it counts the number of people present and tracks how quickly they're moving. This gives it enough information to model the patient's condition. If the condition appears serious, the robot stays out of the way to avoid interrupting care.
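The two cues described above can be sketched as a simple scoring rule. This is a minimal illustration, not the paper's actual method (SafeDQN's real decision-making uses a learned deep Q-network); the function names, the linear score, and the threshold are all assumptions made for clarity.

```python
# Hypothetical sketch of SafeDQN's two contextual cues: team size and
# movement speed. Names and thresholds are illustrative, not from the paper.
from dataclasses import dataclass
from math import hypot


@dataclass
class Detection:
    """A worker's position (in metres) extracted from one video frame."""
    x: float
    y: float


def team_speed(prev: list[Detection], curr: list[Detection], dt: float) -> float:
    """Average speed (m/s) of tracked workers between two frames, dt seconds apart."""
    speeds = [hypot(c.x - p.x, c.y - p.y) / dt for p, c in zip(prev, curr)]
    return sum(speeds) / len(speeds) if speeds else 0.0


def acuity_score(num_workers: int, avg_speed: float) -> float:
    """Combine the two cues: more workers moving faster suggests higher acuity."""
    return num_workers * avg_speed


def should_yield(num_workers: int, avg_speed: float, threshold: float = 2.0) -> bool:
    """Stay out of the way when the estimated patient acuity is high."""
    return acuity_score(num_workers, avg_speed) > threshold
```

For example, four workers each moving 0.75 m between frames captured one second apart give an average speed of 0.75 m/s and a score of 3.0, so the robot would yield; a single slow-moving worker would not trigger avoidance.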
The paper’s authors note that the robot has not been tested in real-world environments yet. Still, they are bullish on its potential in the emergency room, as well as more complex applications like disaster response or search and rescue.
From our point of view, it seems that a similar system could be useful for robots in even more settings—like a hectic, crowded, and dangerous construction site.