When emergencies happen in the backcountry, wilderness search and rescue (WSAR) personnel—often volunteers—put themselves at risk to help others. Coordination and communication over large areas in dynamic conditions are key to a successful rescue and help ensure the safety of the WSAR field team.
Simon Fraser University (SFU) School of Interactive Arts and Technology (SIAT) Professor Carman Neustaedter recently collaborated on RescueCASTR—or Search and Rescue Contextual Awareness Streaming Platform—a new digital technology that streams live video and photos from the rescuer in the field to the central command post.
Neustaedter is Dean of the Faculty of Communication, Art and Technology (FCAT) and director of the Connections Lab (cLab) research group in SIAT. His work spans many disciplines, including computer science, social psychology and design. He researches human-computer interaction, interaction design, computer-supported cooperative work, social culture and group collaboration. He has studied, designed, and evaluated a range of technologies involving communication and coordination.
The RescueCASTR project was led by SIAT Visiting Scholar and University of Calgary PhD student Brennan Jones, who now works with Meta, and co-supervised by Neustaedter and University of Toronto Professor Anthony Tang, now a visiting professor at Singapore Management University.
Jones designed RescueCASTR with input from Canadian WSAR teams, and the WSAR members themselves evaluated its performance during simulated search and rescue scenarios. The results are outlined in the paper RescueCASTR: Exploring Photos and Live Streaming to Support Contextual Awareness in the Wilderness Search and Rescue Command Post.
The WSAR managers in the command centre found that the ability to see video and pictures from wearable cameras could be useful in providing contextual awareness of a team’s progress and status. Command managers also pointed out that the camera footage could be useful for planning and reviewing activities, both during and after a response.
“Search and rescue operations happen year-round and are often life-critical. It is highly important that WSAR team members have ways to easily share information in order to productively search for and find missing people in the wilderness,” says Neustaedter. “Our work continues to explore new and innovative ways of utilizing wearable cameras and drone technologies.”
We spoke with Professor Neustaedter about his research.
What motivated you and your collaborators to develop this technology and to tackle the problem of communications in the backcountry? What technologies are WSAR teams currently using?
This research stemmed out of ongoing work we have been doing related to emergency situations and emergency response. Jones is an active outdoor enthusiast who enjoys hiking and cycling and was interested in emergency rescues in the wilderness due to his personal passions. We felt this was a great opportunity to explore the research space more and help support local and provincial WSAR workers by advancing the field.
Currently, WSAR teams use radios, cell phones for text and picture messaging, drones, and laptops.
Tell us about some of the problems and challenges you were hoping to overcome by designing a wearable image technology for rescuers. How does the device communicate off grid?
The overarching challenge we were tackling through the design of RescueCASTR was building and maintaining a shared mental model between the command centre and the rescue teams. To tackle this challenge, we hoped to bridge the perspectives of the field teams and command by bringing more of the field perspective to the command post. We also wanted to introduce additional communication modalities and information channels beyond just audio and text, and provide additional opportunities for asynchronous communication and information sharing between the field teams and command.
Can you tell us about what worked well with RescueCASTR and what were your conclusions? What did not work well?
From our findings, we concluded that an interface design like RescueCASTR can provide rich and actionable contextual information about a field team’s activities, status and surroundings, all while requiring little effort from the field. Body camera footage can be a bridge between the focus and context of other data channels. For example, it can add context to radio updates, text messages, and clue photos, while providing more focused detail and depth to information sources such as maps and satellite imagery.
Command managers commented that being able to see conditions helped them make better decisions. However, camera streams should not be thought of as a tool to replace more direct and explicit communications, or even as a means of providing highly detailed video. Rather, this implicit information source should be treated as a tool to augment existing explicit communications, helping command build and expand their understanding of events in the field and narrow down what to focus on next.
Have you completed any further work with RescueCASTR or considered any other applications for this technology? For example, firefighters, paramedics, divers, etc.
We have yet to explore other emergency scenarios with RescueCASTR, though the types of camera streaming that the system provides could be broadly applicable to many types of emergency situations involving firefighters, police, paramedics, etc.
For more: Visit the Connections Lab (cLab) at SFU, and read the SFU News story “Body cameras, live streaming bring search and rescue into the next generation.”