The rapid decision-making and real-time data requirements of firefighting will see emergency services increasingly turn to fog computing, according to a US university.
Wayne State University has explored this use-case as part of a collection of new perspectives on fog computing produced by members of The OpenFog Consortium and leading network and technology companies. Titled Bridging the Cloud-to-Things Continuum, the report has been published by Tech Idea Research.
While cloud-only networking works well in some scenarios, it leaves a gap in others, according to the report. That’s where fog computing comes in.
Firefighters increasingly depend on up-to-the-minute data, including locations, physiological conditions, building floor plans, hazard information, and the number of trapped occupants. Together, these data streams help ensure the safety of both firefighters and those they are attempting to rescue, and keep fire damage to a minimum.
This data is collected by a growing range of technologies – drones, communications systems, and sensors carried by firefighters – and relayed to machine learning-based data- and video-analytics platforms, which generate the insights that inform crews on the ground.
Such machine learning algorithms require powerful computing resources and large amounts of storage. This means a local, centralised data centre – on a fire vehicle itself, for example – could prove invaluable.
Meanwhile, a laptop or tablet operated by the incident commander could provide the interface for monitoring the fireground, while a mobile broadband router converts mobile internet signals into a local Wi-Fi hotspot.
Beyond this, edge nodes including routers, base stations, and switches could offer the fog computing infrastructure that processes and stores data at speed, relaying data back to the fireground – or onto the cloud when needed.
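As a sketch of how such a fireground fog node might triage incoming data – handling time-critical hazard alerts locally and deferring routine telemetry to the cloud – consider the following. The thresholds, field names, and routing labels here are illustrative assumptions, not details from the report:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    firefighter_id: str
    co_ppm: float   # carbon monoxide concentration, parts per million
    temp_c: float   # ambient temperature, degrees Celsius

# Illustrative alert thresholds; a real system would use calibrated values.
CO_ALERT_PPM = 1200.0
TEMP_ALERT_C = 260.0

def triage(reading: SensorReading) -> str:
    """Decide where a reading is handled.

    Hazard conditions are processed on the local fog node for low
    latency and broadcast to all crews; routine telemetry is queued
    for upload to the cloud for archival and offline analytics.
    """
    if reading.co_ppm >= CO_ALERT_PPM or reading.temp_c >= TEMP_ALERT_C:
        return "fog:broadcast-alert"  # handled locally on the fog node
    return "cloud:archive"            # deferred, sent when bandwidth allows

print(triage(SensorReading("ff-07", co_ppm=1500.0, temp_c=80.0)))
print(triage(SensorReading("ff-12", co_ppm=9.0, temp_c=35.0)))
```

The point of the sketch is the split itself: the latency-sensitive path never leaves the fireground network, while everything else tolerates the round trip to the cloud.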
The argument for fog computing
Several solutions – such as the localisation and tracking offered by TRX Systems’ NEON Personnel Tracker and the Precision Location and Mapping System – have relied on a cloud-only approach to provide the storage and processing capabilities that firefighters need in these situations.
However, the time-critical nature of firefighting makes the latency introduced by cloud computing problematic, which is why the researchers have explored the potential of fog computing, identifying three distinct use-cases.
The first is hazard detection and occupant counting, as the report describes:
“While the rescue team is on a search mission, correctly detecting flashover and toxic gas and then quickly broadcasting its location to all firefighters is extremely important to allow them to avoid a risky area. Hazards such as a fallen ceiling or a wall collapse, or even chemical gas emission, could also be identified.
“In addition, the images and videos from the surveillance cameras are good sources for discovering how many occupants are trapped and even their location,” continues the paper. “However, these video and image analytics require large compute power, and extensive video data transmission over the internet will adversely affect the real-time performance.”
Second, fog computing could enable an intelligent, automated safety-decision system that helps sift through the mountains of data that could be useful in the event of a fire.
The rapid machine learning and AI capabilities of fog computing would allow fire services to analyse information automatically, including building blueprints, fireground data, social media posts, and demographic reports, and turn them into real-time recommendations.
“Imagine the next generation of firefighters: Various wearable sensors and devices as part of the firefighters’ uniform can sense their position, health condition, the presentation of dangerous chemical gases, environmental heat, and much more; drones can see the fireground’s aerial imagery; robotics with cameras and sensors can enter dangerous areas to see how much debris is in the way and gauge other important environmental parameters such as heat and smoke density.
“With this data as its input, the automated intelligent safety-decision system can help the firefighter find safety exits and a nearby propane tank, warn when the temperatures around them are rising, and even estimate the probability of an explosion.”
NASA JPL is already carrying out AI research for firefighters, with a project known as AUDREY.
Third, fireground 3D modelling has the potential to equip incident commanders with situational information in a 3D space, says the report.
When integrated with a localisation system, it could display meaningful points of interest, such as firefighter positions. Parameters such as the building’s height, shape, number of floors, and interior floor plan could be obtained from drones or from data banks on city construction.
Internet of Business says
With net neutrality already under the spotlight in the US, the importance of reliable network infrastructure was laid bare recently when Verizon faced strong criticism for apparently throttling the Santa Clara County fire department’s data access during the recent wildfires.
The case for fog computing in firefighting brings together a whole host of IoT technologies to obtain and process data, including from drones.
The value of drones in disaster and emergency response has already been established. Recently, they have been used to identify chemical attacks and leaks, for example, while over 160 lives have been saved with the aid of unmanned aerial vehicles this year.
In a sense, fog computing represents a halfway house between cloud and edge computing. Where cloud typically sees data storage and analysis handled in third-party data centres, edge puts these processes as close to the data’s origin as possible – on the controllers of robots and manufacturing machinery, for example. Fog computing hosts these intelligent processes at the local network level, in servers known as fog nodes or IoT gateways.
The distributed, horizontal architecture of fog computing places critical functions such as communication, processing, storage, control, and decision-making closer to the data’s origin. This helps to tackle latency, network accessibility, rising costs, and data-security concerns.
Which processes operators decide to host in different environments will depend on the sensitivity of that data, how quickly it needs to be analysed, the impact of any downtime or latency, and budgetary constraints.
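One way to picture that placement decision is as a simple rule over those criteria. This is a hypothetical sketch – the tier boundaries and latency figures are our own illustrative assumptions, not guidance from the report:

```python
def placement(latency_budget_ms: float, data_sensitive: bool,
              needs_heavy_compute: bool) -> str:
    """Pick a hosting tier for a workload under simple, illustrative rules:
    hard real-time work stays at the edge, sensitive or moderately
    time-critical work sits on a fog node, and heavy, non-urgent
    analytics go to the cloud."""
    if latency_budget_ms < 50:
        return "edge"   # e.g. on-device control loops
    if data_sensitive or latency_budget_ms < 500:
        return "fog"    # local fog node / IoT gateway
    if needs_heavy_compute:
        return "cloud"  # bulk storage and training-scale compute
    return "fog"

print(placement(10, False, False))   # hard real-time -> edge
print(placement(200, True, True))    # sensitive, time-critical -> fog
print(placement(5000, False, True))  # heavy, non-urgent -> cloud
```

In practice the decision is rarely this clean – workloads are often split across tiers – but the ordering of the checks mirrors the trade-offs the report describes: latency first, then sensitivity, then compute demand.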
Many companies opt for edge ahead of fog computing (though the lines between these approaches are blurred), as it offers a less complex networking infrastructure and reduced latency.
Our guide to understanding the shift from cloud to edge computing explores the potential of placing processes as close to the metal as possible.
For more on cloud and edge computing, IoTBuild is taking place on 13-14 November 2018, Olympia Conference Centre, London