The idea of smart cities—those that use connectivity to derive data that improves quality of life for their residents—isn’t new, but as more cities adopt these technologies, the opportunities for putting this critical data to work are growing exponentially, especially when it comes to optimizing how we respond to and prepare for emergencies.
This conversation, moderated by Ian Pitts, senior vice president and general manager at Intrado, and featuring leaders in connectivity, emergency response and smart cities, explores some of the most exciting opportunities to drive the best possible outcomes in public safety—namely, faster response times, better use of resources and healthier, safer communities.
Ian Pitts, Intrado (moderator): We’re thinking holistically about how we can use smart city communication technologies to get data that’s truly relevant and valuable into the hands of our first responders so they can make more informed decisions. There are many technologies that are going to make the lives of our first responders much more streamlined and give us the ability to impact the communities around us and ultimately drive a better outcome for emergency situations.
One example is the work we did in Mexico City, a municipality of 8.5 million people, to improve mass notifications. We integrated our software into IoT [Internet of Things] sensors for seismic detection. We then take the data from those sensors and drive it into our notification platform, which can notify the city’s 8.5 million residents of an imminent earthquake through 25,000 Internet Protocol (IP)-enabled speakers mounted on poles. For city leaders who are considering deploying IP audio like Mexico City, what should they be thinking about?
Kevin Taylor, Axis Communications: We can look at it from two perspectives: One, what are the use cases for audio in cities? And then, what are the advantages of IP audio versus analog audio? At Axis, we come from a physical security background, and security devices and systems can be a deterrent simply through their visual presence. But that’s somewhat of a passive deterrent. We can couple that with audio, whether pre-recorded messages or, potentially, even live, two-way communication. This can become an active deterrent where we can thwart or prevent unwanted behavior.
Another potential use case would be the mass notification application you talked about. In the event of a known, broad emergency situation, how are you pushing information out to the community? And another possible use would be a more targeted emergency response. Let’s say there was an incident and the PSAP [public safety answering point] is being flooded by calls regarding that incident. How do we push out communications so that people are informed and know how to respond, either to evacuate or shelter in place, and know that emergency personnel are already informed of the situation?
If you look at the advantages of IP audio over analog, it is much more dynamic and elastic. You can scale it up and down to be as targeted or as broad-scale as you need via the software. So it’s very, very efficient. IP audio also gives you the opportunity to supervise and test the system on a periodic basis. One of the challenges of analog audio systems is that you never know it’s not working until the moment you need it. With IP, you can continually ping those devices and run health checks.
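The device supervision Kevin describes—continually pinging IP speakers and running health checks—can be sketched as a simple reachability test. This is an editorial illustration, not an Axis API; the device addresses and port are hypothetical, and a real deployment would pull its inventory from a device-management system.

```python
import socket

# Hypothetical inventory of IP speaker endpoints; a real system would
# load these from a device-management database.
SPEAKERS = [("10.0.1.10", 80), ("10.0.1.11", 80)]

def check_speaker(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the device succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def run_health_check(speakers):
    """Map each endpoint to its current reachability status."""
    return {f"{host}:{port}": check_speaker(host, port) for host, port in speakers}
```

Run on a schedule, a check like this surfaces a dead speaker before the moment it is needed—the failure mode Kevin attributes to analog systems.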
Ian Pitts: Very interesting use cases. We’re no longer limited to just, for example, a tsunami siren that is a nondescript tone that doesn’t give someone intelligible audio instruction. Yoni, I know you have been working on gunshot detection systems that play a part in smart city emergency notification. How do you think that applies to what we’re talking about here?
Yoni Sherizen, Gabriel: Everything from gunshot detection to violence detection and sensing that Gabriel devices can do using a number of different technologies really changes the way that managing a smart city takes place today. To Kevin’s point, number one, it’s about smart mass notifications going out. But it’s not only that. By gathering that rich information—whether it’s through IoT devices or some other things, and whether it’s in schools, offices or public spaces—when something goes wrong or is about to go wrong, you can create a really clear picture of what’s happening. That completely changes the situation for everyone. Obviously, for a first responder, going in with visibility, or managing a situation remotely with visibility, is a complete game-changer for them as well as for the occupants of that building or space.
So pulling in that rich data—whether it’s from a Gabriel device, microphones or other sensors—and then providing that rich data through a Gabriel dashboard or Intrado’s Emergency Data Broker or some other system at PSAPs—that reduces the risk dramatically for anyone who’s trying to address a situation like this. When we talk about smart cities, it’s not only about efficiency and faster response, it’s also got to be a safer response.
Ian Pitts: Oftentimes, the systems that organizations put in place never found their way to 911 operators or first responders. In the next few years, I see data coming that is more consolidated, with descriptive data points that can really be leveraged to respond to these situations. Now we’re seeing that data and the ability to interact with devices can be tremendously valuable to first responders, whether they’re in the PSAP or actually on-site. I think of telematics in vehicles, which gives us a tremendous amount of data. Niclas, do you want to start?
Niclas Gyllenram, Aiden Automotive Technologies: What we’re building at Aiden is a bidirectional communication pipeline that connects vehicles with anything. The first use case we typically think about is collecting data from cars and from crashes. But you can use the same pipeline to push data and tailored notifications to vehicles and tell people where they should go and what they should do in different situations.
What would be really exciting for the future is the ability to connect 911 responders to push real-time messages to people on their devices, including in the vehicle. There’s huge potential here. Cities can only be smart if they are connected. If things are not connected, they are fairly stupid islands.
Ian Pitts: I’ve always wondered when we were going to evolve from the siren on top of first responder vehicles. Sound-deadening is now a big focus in cars and we even have noise cancellation in them. That’s a great application for smart data: taking information from 911 and getting it on the car’s display.
Niclas Gyllenram: Instead of playing a siren, you could play different audio for different vehicles depending on where they are. You could give people different instructions, like “Stop at the side of the road here,” so that 911 responders can get to wherever they need to go.
Kevin Taylor: The concept of preemption also plays a big part in this. So when that fire truck or ambulance is going along a corridor and speaking to roadside units and calling three or four intersections down the line, saying “Hey, I’m coming through and I’m in an emergency situation,” the signals are adaptively changing. Perpendicular traffic is red-lighted and the emergency vehicle has green lights all the way through.
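The preemption logic Kevin outlines—an emergency vehicle calling ahead so its corridor turns green while cross traffic is held—can be sketched as a toy model. The class, phase names and lookahead value here are illustrative assumptions; real deployments use signal-controller standards and roadside-unit messaging rather than anything this simple.

```python
from dataclasses import dataclass, field

@dataclass
class Intersection:
    """A toy signal with one phase for the corridor and one for cross traffic."""
    name: str
    phases: dict = field(default_factory=lambda: {"main": "red", "cross": "green"})

    def preempt(self):
        # Give the emergency corridor a green and hold perpendicular traffic.
        self.phases["main"] = "green"
        self.phases["cross"] = "red"

def preempt_corridor(route, lookahead=3):
    """Preempt the next `lookahead` intersections along the vehicle's route."""
    for intersection in route[:lookahead]:
        intersection.preempt()
    return [i.phases["main"] for i in route]
```

The lookahead mirrors Kevin’s “three or four intersections down the line”: signals ahead of the vehicle flip early, while those beyond the horizon stay in normal operation.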
Yoni Sherizen: We’re bringing connection not just to the security side, and specifically around active shooters, but for all sorts of emergencies and mass casualty threats—what we’ve called a community approach to security. We created a network effect where, when one location is in lockdown or addressing a major issue, we blast out a notification—what we call a “yellow alert”—across the geographic area. We’ve done this on a very small scale, but the vision is that the world should be moving to that place. I think what we’re all discussing on this panel is, how do you take a smart city and create a community approach to sharing information? So if one car picks up on a threat, how can it alert another? Or if a responding officer is going into an environment, how could that information or warning get out to the people nearby? So that instead of people walking right into the line of fire, heaven forbid, we get them away from trouble.
The challenge is bringing all that information together and turning it into smart, actionable information. I would love to see that grow at the city level.
Ian Pitts: You mentioned Intrado’s Emergency Data Broker. This is an area where Intrado is really trying to change the paradigm to make sure the community approach to sharing information is open for all to participate in. This is a really exciting time in the future of 911 because we have organizations that know we have to work collectively for the better of our communities. That requires breaking some of these silos, working on things like industry standards to make sure we can all communicate properly.
We’ve seen a huge uptick in organizations focusing on in-building detection, like shot detection and aggression detection. Yoni, what do you think are the biggest driving factors for these types of technologies?
Yoni Sherizen: Unfortunately, we’ve seen a dramatic spike in violent crime. Just in the first quarter of this year alone, there were 147 mass shootings in the U.S., so what you’re seeing is a response to that.
You have some people at the front of this uptick who were very early adopters and forward-facing leaders in the world of public safety and security. Unfortunately, though, it often takes a real tragedy to get things to start shifting and moving much faster. We’re living in an era where we’re seeing those violent acts happening more frequently with greater damage. And that’s driving a spike in interest in gunshot detection technology, weapon detection and aggression detection.
On the other side, we’ve seen the ability to reduce false alarms, too. By being able to sense and create an instant picture of what’s going on on the other side of a sensor or a call from someone to 911, instead of launching a huge response and creating a certain level of chaos, you’re able to very quickly verify that this is a real emergency, or cancel if it is not. Then you’re saving public resources so they can be used where really needed. That’s another value we’re seeing coming out of the sensing and that smart information from the environment.
Ian Pitts: So let’s go beyond the role of the building in emergency notification. Niclas, how do you see telematics in vehicles playing a part in smart cities and emergency notifications?
Niclas Gyllenram: Cars are huge producers of relevant data. You’ll know about the movement of people, traffic incidents and, if there has been a crash, you will get a lot of data about the severity of the crash, G-forces the car has been subjected to, number of occupants and whether it spun over or not. Having all of that information available to pull out of the vehicle will make it possible for 911 responders to know what the appropriate response should be. And when it comes to pushing data back to vehicles, you could use that to direct people away from a scene or to a certain place.
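The crash fields Niclas lists—G-forces, rollover, occupant count—lend themselves to a triage heuristic that suggests an appropriate response before any call comes in. The function below is an editorial sketch, not Aiden’s product; the thresholds and staffing rule are illustrative assumptions, not industry values.

```python
def triage_crash(peak_g: float, rolled_over: bool, occupants: int) -> dict:
    """Classify crash severity from telematics and suggest an ambulance count.

    Thresholds are illustrative placeholders, not validated injury criteria.
    """
    if rolled_over or peak_g >= 40:
        severity = "severe"
    elif peak_g >= 15:
        severity = "moderate"
    else:
        severity = "minor"
    # Illustrative staffing rule: roughly one ambulance per two occupants
    # for anything beyond a minor crash.
    ambulances = 1 if severity == "minor" else max(1, occupants // 2)
    return {"severity": severity, "ambulances": ambulances}
```

Even a crude mapping like this shows the point Niclas makes: the vehicle’s own data can tell 911 what the appropriate response should be before anyone dials.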
The automotive industry has always been very fragmented. Even the largest car companies are less than 10% of the global car market, and everyone has their own proprietary solutions. Android Automotive is basically taking over the infotainment domain in the car industry; almost all car manufacturers are moving to Android. So we’re building an Android-based solution that will run the same way with all vehicles, whether it’s a Volvo, GM or Ford. That solution provides not only a way to read data, but to actually engage with drivers and occupants in real time. Vehicles and mobility are essential parts of smart cities.
Ian Pitts: The promise of autonomous driving is always just around the corner. When it gets here, it’s going to change the way our first responders work. Maybe there’s nobody in the car, or the person is in the backseat. How do you think autonomous driving will impact emergency response in a smart city?
Niclas Gyllenram: Autonomous driving is currently being developed in standalone islands where no car is talking to any other car. There’s no communication between vehicles and infrastructure, either. But being able to connect with other vehicles and with infrastructure will be one of the key technologies to even enable autonomous driving at all.
Most of the car industry has realized this is a more difficult problem to solve than anyone thought five years ago. And it’s going to take a bit longer before we truly have autonomous vehicles. One key factor we need to solve is definitely connecting all the vehicles.
Ian Pitts: We have a number of solutions to inform people widely, such as wireless emergency alerts like an Amber Alert or an earthquake warning. The challenge is the inability to specifically address a subpopulation. Do you see these technologies evolving to more IP-based technologies where we can target notifications? This also makes me think about video; we have cameras everywhere on poles. How do we bring all this together?
Kevin Taylor: Amber Alerts are a different level of notification than a push notification that’s directly related to the personal safety of the person receiving it. So we run into legislative questions. If it’s optional for someone to opt in for the alerts, what do you do about people who don’t opt in? People who don’t have broadband internet or cellular connectivity? As we look to institute these technologies, communities have a responsibility to deliver these services with equity across all parts of the community. And that’s where IoT and network (IP) audio can be a strong complement to things like a cellular-based Amber Alert.
Ian Pitts: One of the things we learned early on with Syn-Apps’ Revolution is that the more devices and modalities you can engage people on for a mass notification, the more quickly we could inform people and ultimately the more positive the outcome was. As we look to the future and our ability to deliver to the public more rapid notification about things happening around them, or share data from our cars, or drive greater awareness in the 911 operations center, these are all things that will significantly transform how we interact as a society.
Artificial intelligence (AI), machine learning, deep learning, edge computing and 5G are part of this future, of course. How do you think these are going to impact what we do, both in our organizations and as a society, especially when it comes to 911?
Yoni Sherizen: AI and machine learning are allowing us to create faster, smarter automated responses. With legacy technology, you have multiple channels and you need someone handling each of those channels to make sense of it and turn it into actionable information. We can already start automating these processes, cutting down on precious time and creating flows that put things in motion or stop things that shouldn’t be in motion. And, again, saving resources and saving lives.
Then there’s greater connectivity, whether that’s 5G or IoT, that is faster and cheaper. And more and more devices—cameras, microphones, speakers, sensors, dashboards. We’re getting smarter and smarter inside of buildings and these faster, smarter responses are not only saving lives, they’re saving money. And that’s going to drive even greater proliferation of this.
Kevin Taylor: When we throw out new buzzwords—whether it’s 5G, AI, machine learning or deep learning—these words tend to be adopted into the marketplace as if they’re fairy dust. Like they are the new, magical thing that will solve all issues and we’ll live in a utopia with this new technology. The safer thing is to take a more pragmatic approach to how we embed these new technologies into our infrastructure and into our platforms so they add real, long-term value.
Take video. Most of the improvements in video in the last 10 or 12 years have assisted the 911 operator from a visual standpoint, delivering better-quality video that’s easier to stream on the network. But there are other developments happening behind the scenes now that will eventually allow us to embed AI and machine learning. So we wouldn’t just push raw video to a telecommunicator; the device and platform would cue up the most important data for the call-taker. So now that person can perform their task based on the most relevant and most important information, instead of having to go through all possible data, which they don’t have time to sift through. We can get a lot more granular by way of these new technologies.
Ian Pitts: As Kevin notes, AI and machine learning can help us distill the overwhelming number of data points into something that’s meaningful for a given situation. A 911 operator doesn’t have minutes, she has seconds to make good choices. Niclas, when you think about AI and what it means for cars, what comes to mind around these technologies and what you’re trying to build?
Niclas Gyllenram: We’re trying to build anomaly detection algorithms and networks running in the vehicle and on the cloud side so that we can share information about any kind of anomaly. Typically, the first thing that indicates a car has experienced some kind of event is data from the accelerometer or gyroscope. But it could be for other things, because we are pretty good at building AI to consume vast amounts of data and find patterns.
You will instantly know that something has happened, even before anyone has picked up a phone and made a 911 call. Then, within 10, 20, 30 seconds, you’ll start to receive 911 calls about this incident. But well before that, you could have AI that’s able to predict that on this road, in these conditions, with these traffic patterns, the risk of incidents is increasing. And there could be a way to push messages to a car to actually lower the risk of something happening based on what AI sees in the driving behavior. These are applications where we can use AI heavily to make roads safer and even predict where accidents will happen and lower the risk.
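The anomaly detection Niclas describes over accelerometer data can be sketched as a rolling z-score check: flag a sample that deviates sharply from its trailing window. This is a minimal editorial illustration, not Aiden’s algorithm; the window size and threshold are arbitrary assumptions.

```python
import statistics

def detect_anomalies(samples, window=10, threshold=3.0):
    """Return indices where a reading deviates sharply from its trailing window.

    `samples` is a sequence of accelerometer magnitudes; window and threshold
    are illustrative defaults, not tuned values.
    """
    anomalies = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mean = statistics.fmean(history)
        stdev = statistics.pstdev(history)
        # Skip perfectly flat windows to avoid division by zero.
        if stdev > 0 and abs(samples[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies
```

On a stream of normal driving readings, a crash-scale spike stands out immediately—the “instantly know something has happened” moment, before any 911 call is placed.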
Kevin Taylor: Anomaly detection is really interesting. “Near-miss detection” is the way it’s commonly referred to in the Department of Transportation (DOT) and ITS space. Once we determine where there are a large number of near-misses, it could be as simple as changing the striping on the roads or adding new signage so that the approach to that intersection is more clearly defined for all motorists.
Ian Pitts: Lastly, I want to talk about data integrity and security. How do we as a community and as manufacturers of this type of technology help alleviate concerns about this?
Niclas Gyllenram: We need to build trust around how we handle data and the way to do that is to make sure we are very clear about what data we collect and how we handle that data, how long we store it for, and so on. When we connect more and more devices, we need to be upfront about what we’re doing, what the purpose is, and how people’s data will be shared, and also put them in control of it so they can opt in or out.
Yoni Sherizen: With Gabriel, we really wrestled with that. This is a really important question. If there’s a bad actor and they’re feeding the wrong information or taking information from someone, it could be the difference between life and death. So we have to be exceptionally careful with the quality and validity of the data we’re getting. What we’ve done is set up verification for where we’re getting information from.
Ian Pitts: This is something that we as a group need to be aware of as we build these technologies to make sure that we are being transparent in the way we collect and share data. As we think about life safety and emergency notification, it’s more critical for us to consider that and to make sure we build trust with our first responders, our communities and everyone else.
This session has been edited for clarity and brevity.
Watch the full “Leveraging Smart City Data to Deliver Safer Outcomes” session and sign up to learn about future “911 Live” events.