Emergency departments (EDs) can be crowded, busy, fast-paced spaces staffed by healthcare workers (HCWs) who are overworked and often under stress. Robots have been used in some clinical settings to help lessen the burden on HCWs by taking on non-value-added tasks, such as material delivery and patient triage, as well as providing support to patients at the bedside when staff are not available.


To perform these tasks, robots must be able to understand the context of complex hospital environments and the people working in and around them. Because a robot will encounter HCWs performing many different kinds of work, computer scientists must design robots that can navigate these safety-critical environments while incorporating key clinical context, ensuring that they assist rather than disrupt the delivery of patient care.


A team of computer scientists at the University of California San Diego has developed a more accurate navigation system that will allow robots to better navigate emergency departments and other busy clinical environments. The researchers detailed their findings in a paper for the International Conference on Robotics and Automation and also developed an open-source dataset of videos to help train robotic navigation systems in the future. 


“To perform these tasks, robots must understand the context of complex hospital environments and the people working around them,” said Laurel Riek, who led the project and is an associate professor of computer science with an appointment in emergency medicine at UC San Diego. 

 



The navigation system, called the Safety Critical Deep Q-Network (SafeDQN), is built around an algorithm based on observations of clinicians’ behaviour in the emergency department. The algorithm assesses the number of people in a space and how quickly and abruptly those people are moving. For example, when a patient’s condition worsens, a team immediately gathers around them to provide aid. In this scenario, clinicians’ movements are quick and precise, so the navigation system directs the robot to move around these clustered groups of HCWs, staying out of the way.
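The cues described above — how many people occupy a space and how quickly they move — can be sketched as a simple scoring function that a planner could use to route a robot around busy areas. The snippet below is an illustrative approximation, not the published SafeDQN model; the function names, the linear combination, and the `speed_weight` parameter are assumptions made for the example.

```python
import numpy as np

def acuity_score(positions, velocities, speed_weight=0.5):
    """Score a region by how crowded it is and how quickly people move.

    Illustrative only (not the published SafeDQN formulation):
    positions  -- (N, 2) array of people's x/y coordinates in the region
    velocities -- (N, 2) array of their velocity vectors
    """
    n_people = len(positions)
    if n_people == 0:
        return 0.0
    mean_speed = float(np.mean(np.linalg.norm(velocities, axis=1)))
    # More people moving quickly and abruptly => higher acuity, so a
    # planner should steer the robot around this region.
    return n_people + speed_weight * mean_speed

def safer_region(region_a, region_b):
    """Pick the (positions, velocities) region with the lower acuity score."""
    return min((region_a, region_b), key=lambda r: acuity_score(*r))
```

In a full system, scores like this would feed into the reward of a Q-learning agent, so that paths through high-acuity regions are penalised during training.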


“Our system was designed to deal with the worst-case scenarios that can happen in the ED,” said Angelique Taylor, who is part of Riek’s Healthcare Robotics Lab at the UC San Diego Department of Computer Science and Engineering. Recorded documentaries and reality shows, such as “Trauma: Life in the ER” and “Boston EMS,” were used to build the algorithm and train the robots. 


The team tested their algorithm in a simulation environment, compared its performance to other robotic navigation systems currently in use, and found that the SafeDQN system guided the robot along the most efficient and safest paths in all scenarios. 


The researchers plan to test the system on a physical robot in a realistic environment in partnership with UC San Diego Health’s healthcare training and simulation centre. The set of more than 700 videos the team has developed is available for other research teams to train their own algorithms and robots. 


In the future, these algorithms could also be used outside of the emergency department, such as by first responders during search and rescue missions. 


Source: UC San Diego 

Photo: iPhoto



