A system to detect potential fires using a thermographic camera

Nat Hazards https://doi.org/10.1007/s11069-018-3224-0 ORIGINAL PAPER

Chijoo Lee1 · Hyungjun Yang2

Received: 18 February 2017 / Accepted: 5 February 2018
© Springer Science+Business Media B.V., part of Springer Nature 2018

Abstract This paper describes a fire monitoring system, based on a thermographic camera, for electrical appliances in interior spaces. These appliances are at particular risk because they are vulnerable to the carelessness of users (46% of electrical appliance fires are caused this way). The system comprises a thermographic camera, rotating on a two-axis robotic arm, controlled by a fire monitoring algorithm that detects each appliance's status. Once the system's accuracy and ability to identify the status of each appliance had been tested, the camera's rotation sequence was planned. To achieve the best efficiency, bearing in mind that fires can break out very quickly, the sequence was based on the distance between monitored appliances. Over a nine-hour period, monitoring six appliances, the proposed method produced about 295 (about 7%) more rotations than a method of arbitrary ordering. This advantage grows as more appliances are monitored over longer periods. The system's main contribution to fire safety is the application and full utilization of the thermal camera, detecting the beginnings of a fire before it can break out.

Keywords Fire monitoring · Thermographic camera · Robotic arm · Rotation planning

Corresponding author: Hyungjun Yang, [email protected]

1 Department of Architectural Engineering, Yonsei University, Seoul 03722, South Korea
2 Division of Maintenance Bureau of Facility Management, Seoul National University, Seoul 08826, South Korea

1 Introduction

Many devices can be used to detect fires, including smoke detectors, photosensitive sensors, temperature sensors, optical sensors, and video cameras (Çetin et al. 2013; Cheng et al. 2011; Grapinet et al. 2013; Gutmacher et al. 2012; Ko et al. 2010; Leblon 2005; Wang et al. 2009). However, systems based on sensors and video cameras can detect fires only after they start, and each device has its own drawbacks, as the following examples show. Smoke sensors require time for smoke to move from the seat of the fire to the sensor (Çetin et al. 2013; Hackner et al. 2016; Kanwal et al. 2016; Krüger et al. 2017). Photosensitive sensors are affected by sunlight and lighting as well as fire (Krüger et al. 2017). A temperature sensor can detect temperature change before a fire breaks out, but its ability to do so can be affected by the sensor's location (Krüger et al. 2017). Existing sensor-based fire detection systems cannot be readily applied to large enclosed spaces such as shopping centers. Finally, none of the devices offer detailed fire information, such as its shape in three dimensions (Çetin et al. 2013; Kanwal et al. 2016). Video-based systems work over longer distances and in larger enclosed spaces than sensor-based systems (Çetin et al. 2013; Emmy Prema et al. 2016; Hackner et al. 2016) and can yield information such as fire location and size (Beji et al. 2014). However, they cannot detect fire in a dark space until the fire is established (Gade and Moeslund 2014). This study offers a system that can monitor a large enclosed space and detect possible fires. Note that 46% of electrical appliance fires are caused by careless use (National Emergency Management Agency 2012). The system is based on a thermographic camera that can measure the temperatures of objects (San-Miguel-Ayanz and Ravail 2005; Saraf et al. 2008; Sleights 2011). These cameras, which have recently become cheaper to buy (Gade and Moeslund 2014), are now used in several different areas such as nondestructive testing (Khodayar et al. 2016) and the detection of physical and chemical changes in concrete structures (Du et al. 2015).
The thermographic camera can provide fire monitoring over long distances and in large spaces and can visualize the 3D shapes of measured objects (Vidas and Moghadam 2013). Furthermore, its use decreases the complexity of the system configuration, eliminates the cost of installing additional sensors, reduces uncertainty about compatibility with other sensors, and is applicable in dark spaces because it detects the infrared radiation emitted from objects. However, the system needs more measurement time than other temperature- and smoke-sensing devices, because it uses a single camera to examine and determine the status of many appliances sequentially, rather than simultaneously. This is a weakness of the developed system because a fire can break out and spread in seconds. It is therefore necessary to minimize measurement time, and a method is proposed to plan the rotation sequence of the thermographic camera based on the distance between appliances. It is based on the traveling salesperson problem, an optimal analysis method for passing through each and every device over the shortest distance; it finds the shortest route that examines all appliances (Fischer et al. 2014). The method is useful for planning the motion of autonomous vehicles (Isaiah and Shima 2015) and for developing the system described in this study. After a literature review in the following section, the paper continues by explaining the characteristics of the developed monitoring system from both hardware and software perspectives. It then reports the accuracy of the fire monitoring tests before explaining the rotation planning method for the thermographic camera and its application, based on the location of the monitored appliances.

2 Literature review

Previous methods of fire detection were either sensor-based or video-based.



In sensor-based studies, Cheng et al. (2011) proposed a method to improve the accuracy of fire detection by combining various sensors (including thermal, smoke, and gas concentration) into a neural network fire alarm system. Gutmacher et al. (2012) proposed a fast, accurate fire detection sensor using metal oxide sensors to overcome the drawbacks of gas sensors, which do not detect non-smoking fires and are affected by such factors as dust and water vapor, and of temperature sensors, which do not detect fires until their heat reaches the sensor. Hackner et al. (2016) developed a fire alarm system using a gas sensor and a low-cost charge-coupled device (CCD) camera. Krüger et al. (2017) developed an early warning system that used a hydrogen sensor to detect smoldering in polymeric materials; hydrogen is emitted before carbon monoxide and smoke are generated. Wang et al. (2011) proposed a method to improve the signal strengths of temperature and smoke sensors, thus overcoming the limitation of low fire signals in conditions of low pressure. Wang and Wang (2014) developed a photoacoustic gas sensor using a near-infrared tunable fiber laser and wavelength modulation spectroscopy for early fire detection. However, all sensor-based fire detection systems have the following limitations:

• They are generally better suited for the detection of fire in confined spaces (Hackner et al. 2016) and are difficult to apply effectively in large spaces such as shopping centers.
• Until the particles or gases reach the sensors designed to detect them, no information is available on fire location, scale, and degree of burning. The sensor must therefore be close to the fire source (Kanwal et al. 2016). For example, smoke sensors require time for the smoke to move from the fire seat to the sensor, while gas concentration sensors need a relatively large amount of CO to be released (Çetin et al. 2013; Hackner et al. 2016; Kanwal et al. 2016; Krüger et al. 2017).
• They do not provide any details of the nature of the fire, nor of its 3D shape (Çetin et al. 2013; Hackner et al. 2016; Kanwal et al. 2016).
• Larger numbers of protected potential sources need larger numbers of installed sensors. In addition, accuracy and speed of fire detection require combinations of various sensors, for example, smoke and temperature sensors (Çetin et al. 2013; Hackner et al. 2016; Kanwal et al. 2016; Krüger et al. 2017). The use of many sensors is expensive, and the complex system configuration can complicate the exchange of information among sensors (Jelicic et al. 2013; Kanwal et al. 2016; Karakus et al. 2013; Vidas and Moghadam 2013).
• False alarms occur often. For example, photosensitive sensors can be misled by sunlight and artificial lighting; smoke sensors can be affected by various gases; and temperature sensors can be affected by their own locations and need a minimum amount of heat to trigger them (Krüger et al. 2017).
• The installed sensors require wireless power (Mahdipour and Dadkhah 2014).

For video-based systems, Çelik and Demirel (2009) developed a rule-based generic color model for flame pixel classification, aiming to improve accuracy. Çetin et al. (2013) proposed a system that detects smoke faster than sensor-based systems do, working accurately even in a large space. Han et al. (2017) developed a video detection system based on the fire's motion features and color information. Jia et al. (2016a) proposed an adaptive flame segmentation algorithm working on the excessive incoming light from the fire, as well as a smoke detection model, based on smoke pixel classification, to locate the smoke region (2016b). Kanwal et al. (2016) developed a low-cost wireless system based on image processing using a machine vision technique. Kuo et al. (2015) proposed a processing method for various flame images such as vague flame, a detection method for



quasi-periodic behavior of flame boundaries, and an evaluation method based on compactness, fill rate, corner flicker rate, and growth rate. Qureshi et al. (2016) developed a system for early detection that uses vision sensors and calculates smoke motion and flame color cues. Rong et al. (2013) proposed an analysis algorithm to detect fire color, motion, and pattern. Truong and Kim (2012) proposed a method to predict fire propagation on a video screen. Wang et al. (2014) proposed a temporal and spatial algorithm to detect flames. Ye et al. (2017) proposed a method to detect smoke and flame simultaneously, using color and wavelet analysis. Several studies have used statistical methods to improve the accuracy of video-based fire detection systems. Ko et al. (2010) proposed a method using CCD cameras and statistical methods to detect fire in its early stages, distinguishing fire-caused motions from similar motions. Kong et al. (2016) proposed a visual analysis technique using logistic regression and temporal smoothing to reduce false alarm rates. These studies determined the superiority of video-based fire detection systems over sensor-based systems for prompt fire detection, showing that they have the following advantages:

• The video-based system is applicable to a large space with a guaranteed clear view (Çetin et al. 2013; Emmy Prema et al. 2016; Hackner et al. 2016).
• There is no need to increase the number of cameras when the number of monitored objects increases.
• The system can detect the change of spatial–temporal features, such as change of color and of object shape, size, location, growth, and direction of the fire (Beji et al. 2014; Hackner et al. 2016; Qureshi et al. 2016).
• The system can detect fires quickly and has a lower rate of false alarms than sensor-based systems because it detects visually (Beji et al. 2014; Çetin et al. 2013; Emmy Prema et al. 2016; Kanwal et al. 2016; Qureshi et al. 2016).

However, a video-based system cannot detect fire in a dark space until the fire starts, and the color and visibility of objects can appear different when affected by an external energy source such as sunlight or artificial lighting (Gade and Moeslund 2014). Both sensor and video systems share the limitation of being able to detect a fire only after it starts. Objects that are likely to catch fire can exhibit an anomalously high temperature before catching alight, and a rapid temperature change is a useful indicator of objects in this state. However, in a sensor-based system, sensors must be installed on every single object; this entails high cost and much effort, and effectiveness can be degraded by the use of incompatible sensors. For this study, therefore, thermographic cameras were used to identify electrical appliances at imminent risk of catching fire. Thermographic cameras capture the infrared radiation emitted by objects and are therefore useful for early fire detection; they have also recently become cheaper (Gade and Moeslund 2014) while retaining all their advantages for fire monitoring.

2.1 Fire monitoring system

The proposed fire detection system, based on a thermographic camera, includes hardware and software components (Figs. 1, 2). The hardware (Fig. 2) includes a thermographic camera, two servomotors with a central processing unit (CPU), and a control computer. The thermographic camera has a 25° × 18.8° (width × height) field of view, and images are 320 × 240 pixels (width × height). The device measures temperatures from −20 to



Fig. 1 Conceptual diagram of the fire monitoring system

Fig. 2 Developed fire monitoring system

120 °C with a resolution of ±0.05 °C. The servomotor has a minimum control angle of 0.29°, and its rotation angle ranges from 0° to 300°. The maximum rotation speed is 64 revolutions per minute (rpm). The control computer runs Windows Vista and has an Intel Processor A110 CPU with 2 GB of RAM.


Fig. 3 A method to calculate the temperature from a thermographic image (e.g., monitor)

Fig. 4 The monitoring algorithm for detecting fire

The software includes a thermographic data reader and a situation identification algorithm. The thermographic camera measures the heat emitted from each object and captures its thermal image. The measured temperature and thermal images are saved as temperature values per pixel, and the temperature data are then converted into a text file by the thermographic software for use in the fire detection process (Fig. 3). The thermographic camera rotates on a user-defined path to measure the temperature of each electrical appliance in sequence. The exact angle of rotation is controlled by a two-axis robotic arm, composed of movable servomotors that control horizontal and vertical movement. Each servomotor is a robotic actuator with a reduction gear, position sensor, direct current (DC) motor, and central processing unit (CPU). Temperature data are exported from the thermographic camera to the control computer as a text file. The text file is analyzed by the fire monitoring algorithm developed for this study, written in C#; the current status of the electrical appliance is then transmitted to the user. The algorithm follows the procedure shown below (Fig. 4). First, the serial number, location, and size of each electrical appliance are specified. Then, the threshold temperature Tf is determined for each object. Tf values are criteria used



by the fire monitoring system to identify the potential for fire outbreak; for each appliance, Tf is the maximum temperature among the per-pixel temperature values of the several thermographic images measured while the appliance is powered on. For this experiment, the threshold for incipient fire is defined as the minimum temperature among 30 measurements of Tf in normal operation. Second, the current temperature Tn of each device is measured during each pass of the thermal camera, and the value is saved. The device status is decided by checking whether or not Tn > Tf. Third, the average temperature of each electrical appliance is measured and saved sequentially until the status of the electrical appliance changes. The new Tn is repeatedly compared to Tf.
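The data reader and threshold test described above can be sketched as follows. This is an illustrative sketch in Python rather than the authors' C# implementation; the export format (comma-separated per-pixel temperatures in °C, one image row per line) and the rectangular region parameters are assumptions, since the paper does not specify the camera software's file layout.

```python
# Sketch of the thermographic data reader and the T_f / T_n comparison.
# Assumed: each 320x240 frame is exported as comma-separated per-pixel
# temperatures (deg C), one image row per line; appliance regions are
# axis-aligned pixel rectangles.

def read_frame(path):
    """Load one exported frame as a 2D list of per-pixel temperatures."""
    with open(path) as f:
        return [[float(v) for v in line.split(",")] for line in f if line.strip()]

def region_max(frame, left, top, width, height):
    """Maximum temperature inside one appliance's bounding box."""
    return max(
        frame[y][x]
        for y in range(top, top + height)
        for x in range(left, left + width)
    )

def threshold_tf(normal_operation_maxima):
    """T_f: the minimum over repeated power-on maxima (30 in the paper)."""
    return min(normal_operation_maxima)

def check_status(t_n, t_f):
    """Potential fire when the current temperature T_n exceeds T_f."""
    return "fire risk" if t_n > t_f else "normal"
```

In a full implementation, `region_max` would be evaluated once per appliance on every camera pass and fed to `check_status`.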

2.2 Accuracy test

An experiment was conducted at Yonsei University, South Korea, to evaluate the accuracy of the fire monitoring system for potential fire outbreaks in electrical appliances. The subjects monitored, all in the interior of a building, included a computer mainframe, monitor, lamp adaptor, and fluorescent light. The lamp adaptor was chosen as a common source of fire, because these devices are often connected to many electrical appliances simultaneously. Each subject was defined by its corner points on a thermographic image (Fig. 5). The measured temperatures of all pixels were saved. The developed system was tested for (1) accuracy of temperature measurement and (2) accuracy of fire status monitoring. The fire status of an electrical appliance changes to positive at the point at which the minimum rate of temperature change is exceeded.
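The status-change rule in (2) can be sketched as a simple rate test. The rate threshold value below is illustrative only; the paper does not report the minimum rate it used.

```python
# Hedged sketch of the status-change rule: fire status turns positive
# when the rate of temperature change between consecutive measurements
# exceeds a minimum rate. MIN_RATE is an assumed illustrative value.
MIN_RATE = 0.5  # deg C per second, assumed for illustration

def status_changed(prev_temp, curr_temp, elapsed_s, min_rate=MIN_RATE):
    """True when temperature rose faster than the minimum rate."""
    return (curr_temp - prev_temp) / elapsed_s > min_rate
```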

2.3 Temperature accuracy of the thermographic camera

Before starting the experiment, the thermographic camera was calibrated to account for the emissivity of the appliances and for the effects of ambient temperature and humidity. The separate parts of each electrical appliance were measured 10 times, and the results are set out in Table 1. First, the temperatures of each appliance were measured using both a contact thermometer and the thermographic camera; an algorithm installed in the camera then

Fig. 5 Example of thermographic image for fire monitoring: a computer mainframe, b lamp adaptor


Table 1 Accuracy test of thermographic camera

Object              Status   Accuracy (°C)   Emissivity   Ambient temperature (°C)   Humidity (%)
Computer mainframe  Off      0.16            0.716        19.19                      31.07
Computer mainframe  On       0.20            1            19.93                      31.3
Monitor             Off      0.14            1            19.79                      90.93
Monitor             On       0.43            1            17.58                      33.57
Fluorescent light   Off      0.70            0.633        17.24                      34.26
Fluorescent light   On       0.64            0.956        18.07                      32.56
Lamp adaptor        Off      0.46            1            15.92                      31.35
Lamp adaptor        On       1.03            0.788        19.14                      31.39

corrected for emissivity, using the differences in measured temperatures. Second, the accuracy of the measured temperature was analyzed according to changes in ambient temperature and humidity. When the lamp adaptor was powered on, the difference between the thermographic camera and contact thermometer readings was at its largest (about 1.03 °C). When the computer monitor was powered off, the difference was at its smallest (about 0.14 °C). That is, in the range of temperature and humidity in this experiment, the error of measurement by the thermographic camera was less than ±2 °C.
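The accuracy figures quoted here can be checked directly against the "Accuracy" column of Table 1, which lists the per-appliance differences (in °C) between the thermographic camera and the contact thermometer:

```python
# Differences between camera and contact thermometer readings,
# transcribed from the Accuracy column of Table 1 (deg C).
DIFFERENCES = {
    ("computer mainframe", "off"): 0.16, ("computer mainframe", "on"): 0.20,
    ("monitor", "off"): 0.14, ("monitor", "on"): 0.43,
    ("fluorescent light", "off"): 0.70, ("fluorescent light", "on"): 0.64,
    ("lamp adaptor", "off"): 0.46, ("lamp adaptor", "on"): 1.03,
}

largest = max(DIFFERENCES, key=DIFFERENCES.get)   # lamp adaptor, powered on
smallest = min(DIFFERENCES, key=DIFFERENCES.get)  # monitor, powered off
within_two_degrees = all(d < 2.0 for d in DIFFERENCES.values())
```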

2.4 The results of monitoring for potential fire outbreaks

One objective of this study is to inform users of the probability of fire for each object monitored, by triggering an alert if an individually set threshold temperature (Tf in Fig. 4) is reached. If the temperature of an object exceeds its threshold at any time, for example because the device suffers overload or breakdown, a fire is likely. In all cases, the monitoring of potential fire outbreak was successful (Table 2).

2.5 Rotation planning method

Users of this fire monitoring system also need to know how to optimize its scanning procedure. This section proposes a method to minimize the camera's rotation and to observe all electrical appliances in the shortest possible time, because a fire can break out and spread in seconds. It also describes the application of the method to the experiment. Before starting, a 3D model of the experimental area at Yonsei University (Fig. 6) was created. In the present study, the criteria for analysis of rotation planning were revolutions per minute, rotation distance of the thermographic camera, and time required to monitor each appliance. The maximum speed of the servomotors used here (64 rpm) was applied equally in all cases because a single thermographic camera was used throughout. The maximum rotation angle of the servomotors was 300°. The required time for monitoring was


Table 2 The results of monitoring for potential fire outbreaks

Electrical appliance   Threshold temperature: fire (°C)
Mainframe              25.76 (↑)
Monitor                42.14 (↑)
Fluorescent light      33.48 (↑)
Lamp adaptor           35.97 (↑)

programmed as 1 s for each appliance, and the shortest distance between each pair of appliances was measured. For the optimization method, the traveling salesperson approach was applied; this involves selecting the minimum path that the salesperson (here, the camera) must take from the office to visit each customer (appliance) before returning to the office (Fischer et al. 2014). Six appliances were selected: the air conditioner, ventilator, computer mainframe, computer monitor, lamp adaptor, and window (Table 3), and applied to the optimization model of Eq. (1):

\[
\begin{aligned}
\text{Minimize}\quad & c_{00}x_{00} + c_{01}x_{01} + \cdots + c_{05}x_{05} + c_{10}x_{10} + \cdots + c_{55}x_{55} \\
\text{subject to}\quad & x_{i0} + x_{i1} + \cdots + x_{i5} = 1, \quad i = 0, \ldots, 5 \\
& x_{0j} + x_{1j} + \cdots + x_{5j} = 1, \quad j = 0, \ldots, 5 \\
& x = (x_{ij}) \text{ is a tour assignment} \\
& x_{ij} \in \{0, 1\}, \quad i = 0, \ldots, 5, \; j = 0, \ldots, 5
\end{aligned}
\tag{1}
\]

First, the objective function (minimize c00x00 + c01x01 + ⋯ + c05x05 + c10x10 + ⋯ + c55x55) minimizes the rotation distance in Eq. (1), where the rotation distance from

Fig. 6 Rotation planning method for the thermographic camera


Table 3 Time (s) required for the camera to change its aim from one appliance to the next

                 Air conditioner  Ventilator  Computer  Monitor  Lamp adaptor  Window
Air conditioner  –                0.03        0.22      0.29     0.12          0.10
Ventilator                        –           0.25      0.32     0.15          0.08
Computer                                      –         0.06     0.10          0.31
Monitor                                                 –        0.16          0.26
Lamp adaptor                                                     –             0.23
Window                                                                         –

appliance i to j is cij. Next, the decision variable is xij (i = 0, …, 5; j = 0, …, 5), which is "1" if the movement from appliance i to appliance j is included in the rotation plan of the thermographic camera, and "0" otherwise. Assignment and rotation are constrained. In the assignment constraint, the camera can move from appliance i to only one other appliance, so the sum of the variables for movement from appliance i to all other appliances is "1" (xi0 + xi1 + ⋯ + xi5 = 1, i = 0, …, 5); likewise, the sum of the variables for movement to appliance j is "1", because movement to appliance j is made from exactly one preceding appliance (x0j + x1j + ⋯ + x5j = 1, j = 0, …, 5). The rotation constraint (x = (xij) is a tour assignment) requires that the status of all six appliances be measured, so sub-rotations that do not measure all six appliances are removed. When the number of monitored appliances is six, there are six sub-rotations each composed of only one appliance; fifteen composed of two appliances; twenty composed of three; fifteen composed of four; and six composed of five. For example, where the appliances are a, b, c, d, e, and f, the constraint preventing a sub-rotation of one appliance is a ≤ 0, and the constraints preventing sub-rotations of two, three, four, and five appliances are a–b–a ≤ 1, a–b–c–a ≤ 2, a–b–c–d–a ≤ 3, and a–b–c–d–e–a ≤ 4, respectively. A sub-rotation such as a–b–a would otherwise satisfy the assignment constraint with a count of 2, so the bound a–b–a ≤ 1 excludes it; in this way, all sub-rotations are removed by the rotation constraints. The required time for the measurement of each appliance with the application of Eq. (1) is shown in Table 3.
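With only six appliances, the formulation in Eq. (1) can also be solved directly by brute force, which sidesteps the sub-rotation elimination constraints entirely. The sketch below is a minimal illustration in Python; the symmetric rotation times follow Table 3 as reconstructed here, so totals computed from it are illustrative and need not reproduce the paper's reported 1.17 s.

```python
# Brute-force solution of the closed-tour problem in Eq. (1):
# enumerate all (n-1)! tours starting and ending at appliance 0
# and keep the one with the smallest total rotation time.
from itertools import permutations

NAMES = ["air conditioner", "ventilator", "computer", "monitor",
         "lamp adaptor", "window"]
TIMES = {  # seconds, symmetric, as reconstructed from Table 3
    (0, 1): 0.03, (0, 2): 0.22, (0, 3): 0.29, (0, 4): 0.12, (0, 5): 0.10,
    (1, 2): 0.25, (1, 3): 0.32, (1, 4): 0.15, (1, 5): 0.08,
    (2, 3): 0.06, (2, 4): 0.10, (2, 5): 0.31,
    (3, 4): 0.16, (3, 5): 0.23,
    (4, 5): 0.23,
}
TIMES[(3, 5)] = 0.26  # monitor-window entry from Table 3

def cost(a, b):
    """Symmetric lookup of the rotation time between two appliances."""
    return TIMES[(a, b)] if (a, b) in TIMES else TIMES[(b, a)]

def best_tour(n=6):
    """Shortest closed tour visiting every appliance once."""
    best, best_cost = None, float("inf")
    for order in permutations(range(1, n)):
        tour = (0,) + order + (0,)
        c = sum(cost(tour[i], tour[i + 1]) for i in range(n))
        if c < best_cost:
            best, best_cost = tour, c
    return best, best_cost
```

For n = 6 this enumerates only 120 tours, so exhaustive search is instantaneous; the integer-programming formulation of Eq. (1) becomes necessary only for much larger appliance counts.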
The measuring sequence for the proposed rotation planning method was as follows: air conditioner → lamp adaptor → computer mainframe → computer monitor → window → ventilator, with a total rotation time estimated at about 1.17 s. The measuring sequence of the arbitrary ordering was: air conditioner → ventilator → window → computer monitor → computer mainframe → lamp adaptor, with a total rotation time estimated at about 1.67 s. The difference in rotation time between the proposed method and the arbitrary ordering was therefore about 0.5 s per cycle. The required time for rotation of the camera among the six appliances based on the proposed method was 1.17 s, and the required time for monitoring the six appliances was 6 s, as each appliance was programmed to take 1 s to process. A total of 7.17 s was therefore required for each cycle of rotation and monitoring of the six appliances. As 9 h is 32,400 s, the number of rotations was about 4519, while the number of rotations by arbitrary ordering was about 4224. The difference in the number of rotations between the proposed method and the arbitrary order,



Fig. 7 The difference in the number of rotations between the proposed and the existing method

then, was about 295 over 9 h; this difference increases with the number of appliances to be measured, as does the effectiveness of the proposed optimal method (Fig. 7).
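The rotation-count arithmetic above can be reproduced directly. Each full cycle takes the total rotation time plus 1 s of programmed monitoring per appliance, and the 9 h experiment lasts 32,400 s; counts are rounded to the nearest whole cycle, matching the paper's approximate figures.

```python
# Reproducing the cycle counts reported for the 9 h experiment.
APPLIANCES = 6
MONITORING_TIME = 1.0    # seconds per appliance, set in software
PERIOD_S = 9 * 60 * 60   # 9 hours = 32,400 s

def cycles(rotation_time_s):
    """Approximate number of monitoring cycles completed in 9 h."""
    return round(PERIOD_S / (rotation_time_s + APPLIANCES * MONITORING_TIME))

proposed = cycles(1.17)   # optimized rotation sequence
arbitrary = cycles(1.67)  # arbitrary ordering
```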

3 Conclusion

The monitoring system was developed to detect the possibility of fires before they break out. It monitors electrical appliances in a large enclosed space and is particularly aimed at reducing the number of fires caused by users' carelessness; at 46%, this is the most common cause of fires (National Emergency Management Agency 2012). The developed system comprises an accurate thermographic camera and a two-axis robotic arm to rotate it. A monitoring algorithm detects the possibility of incipient fire by analyzing the temperature change of each object monitored and transmitting the result to the user. A method to optimize the monitoring rotation is proposed, and the algorithm improves utilization of the thermographic camera by reducing measurement time; this reduction is valuable because a fire can break out and spread very quickly. The main contribution of this study is the development of a system that can use temperature change to detect potential fires before they start. It uses a single thermographic camera, so compatibility of devices is not a problem, and the device recognizes the 3D shape of measured objects. The starting point and size of a fire can be identified, and the system is applicable to large enclosed spaces. The system is limited in that it cannot measure accurately when walls, columns, or other obstacles obtrude into the monitored space. To overcome this and improve monitoring accuracy, a future study should incorporate a rail on which to move the camera horizontally and thus enable full coverage of the space.

Acknowledgements The authors thank Professor Ghang Lee (Department of Architectural Engineering, Yonsei University, South Korea) for his advice and generous sharing of information.

References

Beji T, Verstockt S, Van de Walle R, Merci B (2014) On the use of real-time video to forecast fire growth in enclosures. Fire Technol 50:1021–1040
Çelik T, Demirel H (2009) Fire detection in video sequences using a generic color model. Fire Saf J 44:147–158
Çetin AE, Dimitropoulos K, Gouverneur B, Grammalidis N, Günay O, Habiboğlu YH, Töreyin BU, Verstockt S (2013) Video fire detection—review. Digit Signal Proc 23:1827–1843
Cheng C, Sun F, Zhou X (2011) One fire detection method using neural networks. Tsinghua Sci Technol 16:31–35
Du HX, Wu HP, Wang FJ, Yan RZ (2015) The detection of high-strength concrete exposed to high temperatures using infrared thermal imaging technique. Mater Res Innov 19:S1-162–S1-167
Emmy Prema C, Vinsley SS, Suresh S (2016) Multi feature analysis of smoke in YUV color space for early forest fire detection. Fire Technol 52:1319–1342
Fischer A, Fischer F, Jäger G, Keilwagen J, Molitor P, Grosse I (2014) Exact algorithms and heuristics for the quadratic traveling salesman problem with an application in bioinformatics. Discrete Appl Math 166:97–114
Gade R, Moeslund TB (2014) Thermal cameras and applications: a survey. Mach Vis Appl 25:245–262
Grapinet M, De Souza P, Smal JC, Blosseville JM (2013) Characterization and simulation of optical sensors. Accid Anal Prev 60:344–352
Gutmacher D, Hoefer U, Wöllenstein J (2012) Gas sensor technologies for fire detection. Sens Actuat B Chem 175:40–45
Hackner A, Oberpriller H, Ohnesorge A, Hechtenberg V, Müller G (2016) Heterogeneous sensor arrays: merging cameras and gas sensors into innovative fire detection systems. Sens Actuat B Chem 231:497–505
Han X-F, Jin JS, Wang M-J, Jiang W, Gao L, Xiao L-P (2017) Video fire detection based on Gaussian mixture model and multi-color features. Signal Image Video Process 1:1419
Isaiah P, Shima T (2015) Motion planning algorithms for the Dubins travelling salesperson problem. Automatica 53:247–255
Jelicic V, Magno M, Brunelli D, Paci G, Benini L (2013) Context-adaptive multimodal wireless sensor network for energy-efficient gas monitoring. Sens J IEEE 13:328–338
Jia Y, Lin G, Wang J, Fang J, Zhang Y (2016a) Light condition estimation based on video fire detection in spacious buildings. Arab J Sci Eng 41:1031–1041
Jia Y, Yuan J, Wang J, Fang J, Zhang Q, Zhang Y (2016b) A saliency-based method for early smoke detection in video sequences. Fire Technol 52:1271–1292
Kanwal K, Liaquat A, Mughal M, Abbasi AR, Aamir M (2016) Towards development of a low cost early fire detection system using wireless sensor network and machine vision. Wirel Pers Commun 95:475
Karakus C, Gurbuz AC, Tavli B (2013) Analysis of energy efficiency of compressive sensing in wireless sensor networks. Sens J IEEE 13:1999–2008
Khodayar F, Sojasi S, Maldague X (2016) Infrared thermography and NDT: 2050 horizon. Quant InfraRed Thermogr J 13:210–231
Ko B, Cheong K-H, Nam J-Y (2010) Early fire detection algorithm based on irregular patterns of flames and hierarchical Bayesian networks. Fire Saf J 45:262–270
Kong SG, Jin D, Li S, Kim H (2016) Fast fire flame detection in surveillance video using logistic regression and temporal smoothing. Fire Saf J 79:37–43
Krüger S, Despinasse M-C, Raspe T, Nörthemann K, Moritz W (2017) Early fire detection: are hydrogen sensors able to detect pyrolysis of household materials? Fire Saf J 91:1059–1067
Kuo J-Y, Lai T-Y, Fanjiang Y-Y, Huang F-C, Liao Y-H (2015) A behavior-based flame detection method for a real-time video surveillance system. J Chin Inst Eng 38:947–958
Leblon B (2005) Monitoring forest fire danger with remote sensing. Nat Hazards 35:343–359
Mahdipour E, Dadkhah C (2014) Automatic fire detection based on soft computing techniques: review from 2000 to 2010. Artif Intell Rev 42:895–934
National Emergency Management Agency (2012) The major statistics of emergency management. National Emergency Management Agency, Seoul
Qureshi WS, Ekpanyapong M, Dailey MN, Rinsurongkawong S, Malenichev A, Krasotkina O (2016) QuickBlaze: early fire detection using a combined video processing approach. Fire Technol 52:1293–1317
Rong J, Zhou D, Yao W, Gao W, Chen J, Wang J (2013) Fire flame detection based on GICA and target tracking. Opt Laser Technol 47:283–291
San-Miguel-Ayanz J, Ravail N (2005) Active fire detection for fire emergency management: potential and limitations for the operational use of remote sensing. Nat Hazards 35:361–376
Saraf AK, Rawat V, Banerjee P, Choudhury S, Panda SK, Dasgupta S, Das JD (2008) Satellite detection of earthquake thermal infrared precursors in Iran. Nat Hazards 47:119–135
Sleights JE (2011) An evaluation of old armored cables in building wiring systems. Fire Technol 47:107–147
Truong TX, Kim J-M (2012) Fire flame detection in video sequences using multi-stage pattern recognition techniques. Eng Appl Artif Intell 25:1365–1372
Vidas S, Moghadam P (2013) HeatWave: a handheld 3D thermography system for energy auditing. Energy Build 66:445–460
Wang J, Wang H (2014) Tunable fiber laser based photoacoustic gas sensor for early fire detection. Infrared Phys Technol 65:1–4
Wang S-J, Jeng D-L, Tsai M-T (2009) Early fire detection method in video for vessels. J Syst Softw 82:656–667
Wang Y, Yu C, Tu R, Zhang Y (2011) Fire detection model in Tibet based on grey-fuzzy neural network algorithm. Expert Syst Appl 38:9580–9586
Wang S, He Y, Zou J, Duan B, Wang J (2014) A flame detection synthesis algorithm. Fire Technol 50:959–975
Ye S, Bai Z, Chen H, Bohush R, Ablameyko S (2017) An effective algorithm to detect both smoke and flame using color and wavelet analysis. Pattern Recogn Image Anal 27:131–138

