Title:
A SYSTEM AND METHOD FOR FIRE DETECTION
Document Type and Number:
WIPO Patent Application WO/2024/005701
Kind Code:
A1
Abstract:
The present disclosure relates to a system (100) and method for fire detection. The system comprises a first image capturing device (101) comprising a sensor arranged to detect infrared radiation and a second image capturing device (102) comprising a sensor arranged to detect visible radiation. The first and second image capturing devices are arranged to cover an overlapping area. The relation between the first and second image capturing devices is known. Thereby, a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa. The system further comprises a controller (103) comprising a processor (104), said controller (103) being operatively connected to said first and second sensors and being arranged to continuously obtain first images captured by the sensor (101) of the first image capturing device and analyse each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criterion in a part of said first image, said occurrence indicating a potential fire in said part of the first image, and to, when a potential fire has been identified, obtain a second image from the sensor of the second image capturing device and analyse the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.

Inventors:
AHLBERG JÖRGEN (SE)
HELLSTEN JONAS (SE)
NELSSON CLAES (SE)
Application Number:
PCT/SE2023/050671
Publication Date:
January 04, 2024
Filing Date:
June 28, 2023
Assignee:
TERMISK SYSTEMTEKNIK I SVERIGE AB (SE)
International Classes:
G08B17/12; G08B29/18
Foreign References:
US20020109096A12002-08-15
KR101462247B12014-11-21
US20140028803A12014-01-30
EP3474250B12020-02-26
Attorney, Agent or Firm:
ZACCO SWEDEN AB (SE)
Claims:
CLAIMS

1. A system (100) for fire detection, comprising: a first image capturing device (101) comprising a sensor arranged to detect infrared radiation, a second image capturing device (102) comprising a sensor arranged to detect visible radiation, wherein the first and second image capturing devices are arranged to cover an overlapping area and the relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa, the system further comprising a controller (103) comprising a processor (104), said controller (103) being operatively connected to said first and second sensors and being arranged to continuously obtain first images captured by the sensor (101) of the first image capturing device and analyse each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criterion in a part of said first image, said occurrence indicating a potential fire in said part of the first image, and, when a potential fire has been identified, obtain a second image from the sensor of the second image capturing device and analyse the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.

2. The system according to any of the preceding claims, wherein the controller (103) is arranged to determine whether the potential fire corresponds to a sun reflection based on the analysis of the second image.

3. The system according to claim 2, wherein the controller is arranged to form an intensity value based on the exposure time used at image capture and pixel value(s) of the second image and to determine whether the potential fire corresponds to a sun reflection based on the formed intensity value.

4. The system according to claim 3, wherein the second imaging device (102) is arranged to automatically adjust exposure time, or the exposure time is manually adjustable.

5. The system according to claim 4, wherein the exposure time of the second sensor (102) is manually or automatically adjustable such that light sources such as sun reflections result in saturated pixels of the second imaging device or in pixel values of the second imaging device within the dynamic range, while parts without light sources, such as a fire, get pixel values close to zero.

6. The system according to any of the preceding claims, wherein the second image capturing device (102) comprises a polarizing filter in the beam path before the sensor.

7. The system according to any of the preceding claims, wherein the second imaging device (102) comprises an optical band pass filter, wherein the band pass filter optionally is arranged to transmit blue and/or ultraviolet wavelengths while blocking red and/or infrared wavelengths.

8. The system according to any of the preceding claims, wherein the second imaging device (102) is colour sensitive and wherein the controller is arranged to determine whether the potential fire corresponds to a sun reflection based on a relation between pixel values for the different wavelength bands.

9. The system according to any of the preceding claims, wherein the controller (103) is arranged to determine whether the potential fire corresponds to a vehicle based on analysis of the identified first image and/or the second image to detect an object in the form of a vehicle.

10. The system according to claim 9, wherein the detection of a vehicle is made based on analysis of a plurality of subsequent images captured by the first and/or second imaging device, wherein a vehicle is detected when an object in the first and/or second images has been determined to be moving.

11. The system according to any of the preceding claims, further comprising an alarm unit (106) arranged to generate an alarm upon fire detection.

12. The system according to any of the preceding claims, wherein the controller (103) is arranged to transmit an initiation signal for initiation of automatic extinguishing of the fire upon fire detection.

13. The system according to any of the claims 1-11, wherein the controller (103) is arranged to evaluate the identified first image and/or the second image to detect persons in the identified first image and/or the second image and when no person has been detected, to transmit an initiation signal for initiation of automatic extinguishing of the fire upon fire detection, and when a person has been detected, transmit an initiation signal for initiation of an alternative action upon fire detection.

14. A method (200) for fire detection performed by a controller comprising a processor, said controller being operatively connected to a first image capturing device (101) comprising a sensor arranged to detect infrared radiation and to a second image capturing device (102) comprising a sensor arranged to detect visible radiation, wherein the first and second image capturing devices are arranged to cover an overlapping area and the relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa, said method comprising continuously obtaining (S1) first images captured by the sensor (101) of the first image capturing device, analysing (S2) each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criterion in a part of said first image, said occurrence indicating a potential fire in said part of the first image, and when a potential fire has been identified, obtaining (S3) a second image from the sensor of the second image capturing device and analysing (S4) the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.

Description:
A system and method for fire detection

TECHNICAL FIELD

The present disclosure relates to a camera-based system and method for fire detection.

BACKGROUND

When storing for example waste and biofuels outdoors, fires in such piles are a recurring problem. Fires can occur in several ways. For example, fires can occur from self-ignition in piles of wood chips, batteries in piles of waste, or by sabotage. Regardless of the cause, such a fire can cause major problems, with regard to safety, health, economy and the environment.

When a fire does occur, it is important that the fire is detected as early as possible, and different types of fire detectors have therefore been developed. Radiation-based fire detectors are common, such as video cameras with flame or smoke detection software or thermal cameras that sound alarms at high temperatures. A major problem, however, is various sources of false alarms, which reduces confidence in the systems and thus the usability of the systems. To get around this, surveillance cameras are often used to verify alarms, i.e., an operator receives an alarm and looks at the location of the alarm (using a surveillance camera) and determines if the alarm is a fire or a false alarm. This does not really solve the problem, as it requires that there is someone at hand who can manually check each alarm, i.e., exactly what you want to avoid.

SUMMARY

An object of the invention is to improve systems available today for fire detection. The aim is to decrease the number of false alarms in systems for fire detection.

False alarms can be caused by many different things, but the most common are vehicles, such as wheel loaders, and sun reflections. For example, the exhaust pipe of a wheel loader characteristically reaches a temperature of around 200°C.

The object has been achieved by means of a system for fire detection. The system comprises a first image capturing device comprising a sensor arranged to detect infrared radiation and a second image capturing device comprising a sensor arranged to detect visible radiation. The first and second image capturing devices are arranged to cover an overlapping area. The relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa. The system further comprises a controller comprising a processor. The controller is operatively connected to said first and second sensors. The controller is arranged to continuously obtain first images captured by the sensor of the first image capturing device and to analyse each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criterion in a part of said first image, said occurrence indicating a potential fire in said part of the first image, and, when a potential fire has been identified, to obtain a second image from the sensor of the second image capturing device and analyse the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.

Accordingly, a system is provided wherein an alarming radiation-based fire detector (thermal image sensor) is combined with a visual sensor that detects sources of false alarms. Such a system can then automatically give an alarm and/or start automatic extinguishing, for example using water cannons, without an operator having to verify.

In an option, the controller is arranged to determine whether the potential fire corresponds to a sun reflection based on the analysis of the second image.

Sun reflections, e.g., in puddles of water, give strong measured intensities in both thermal and visual sensors. However, normal fires do not reach high enough temperatures to emit a significant amount of visual light, and analysing the second image is therefore an effective way to distinguish sun reflections from fires.

Further preferred embodiments are defined in the dependent claims.

The present disclosure further relates to a method for fire detection performed by a controller comprising a processor, said controller being operatively connected to a first image capturing device comprising a sensor arranged to detect infrared radiation and to a second image capturing device comprising a sensor arranged to detect visible radiation, wherein the first and second image capturing devices are arranged to cover an overlapping area and the relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa. The method comprises continuously obtaining first images captured by the sensor of the first image capturing device and analysing each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criterion in a part of said first image, said occurrence indicating a potential fire in said part of the first image. When a potential fire has been identified, steps of obtaining a second image from the sensor of the second image capturing device and analysing the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire are carried out.

DESCRIPTION OF DRAWING

Figure 1 is a block scheme showing an example system for fire detection.

Figure 2 illustrates an example implementation of a system for fire detection, seen from above.

Figure 3 illustrates another example implementation of a system for fire detection, seen from above.

Figure 4 is a flow chart illustrating an example method for fire detection.

DETAILED DESCRIPTION

Figure 1 discloses a system 100 for fire detection. The system aims at generating a decreased number of false alarms while not decreasing the number of true alarms.

The system comprises a first, thermal image capturing device 101. The sensor of the first image capturing device is arranged to detect electromagnetic radiation with a wavelength longer than 3 μm.

The system further comprises at least one second, visual image capturing device 102. The sensor of the at least one second image capturing device is arranged to detect visual wavelengths.

The at least one second, visual image capturing device here refers to an image capturing device having a sensor that picks up light in the visible part of the electromagnetic spectrum. Such a sensor can capture light in a plurality of wavelength bands or be monochrome, such as black and white or grayscale. In some cases, the second sensor is not limited to visible light, but also captures near infrared (NIR) or ultraviolet (UV) light.

The at least one second, visual image capturing device may comprise a plurality of image capturing devices having different properties. The different second image capturing devices may have properties adapted for different duties, wherein one of the duties is use in the analysis to determine whether the potential fire is a fire as disclosed herein. For example, the different second image capturing devices may have different apertures, wherein one of the apertures is adapted for the analysis to determine whether the potential fire is a fire as disclosed herein.

Other of the second image capturing devices may be arranged for other duties such as detection of vehicles or pedestrians under different light conditions.

In other examples, a plurality of different second image capturing devices may be provided for use in the analysis to determine whether the potential fire is a fire as disclosed herein. For example, one of the different second image capturing devices may be selected for use based on prevailing light conditions. Alternatively, a plurality or all of the second image capturing devices can be used in the analysis to determine whether the potential fire is a fire as disclosed herein. For example, the plurality of second image capturing devices used in the analysis to determine whether the potential fire is a fire as disclosed herein may have different pre-set exposure times and/or fields of view.
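
Where several second image capturing devices with different pre-set exposure times are available, one possible selection heuristic is to pick the frame whose pixel-value distribution is best exposed. The Python sketch below illustrates that idea; the saturation and darkness limits and the scoring rule are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def select_best_exposed(images):
    """Pick, from frames captured with different pre-set exposure times, the one
    whose pixel-value distribution is best exposed. The score penalises the
    fraction of saturated (>= 250) and near-black (<= 5) pixels in an 8-bit
    image; both limits are illustrative only."""
    def score(img):
        saturated = np.mean(img >= 250)   # fraction of saturated pixels
        dark = np.mean(img <= 5)          # fraction of near-black pixels
        return saturated + dark           # lower is better
    return min(images, key=score)
```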

In an example, the at least one second image capturing device used in the analysis to determine whether the potential fire is a fire as disclosed herein is arranged to capture electromagnetic radiation within a range of 0.4–1 μm.

The second, visual image capturing device(s) 102 may be a camera. The camera(s) may be a colour camera, such as an RGB camera, and/or a black and white or grayscale camera.

Characteristically, the wavelengths captured by the first and second image capturing devices do not overlap.

The first and second image capturing devices 101, 102 are arranged to cover an overlapping area. A region of interest to be monitored is located in the overlapping area. The region of interest to be monitored is for example located outdoors. The region of interest to be monitored comprises for example a pile of waste or a pile of biofuels or a pile of other material(s). When storing for example waste and biofuels outdoors, fires in such piles are a recurring problem.

The relation between the first and second image capturing devices 101, 102 is known, whereby a spatial point in a first image captured by the first image capturing device is associated to a corresponding spatial point or line in a second image captured by the second image capturing device and vice versa.

As is clear from above, the relation is a geometrical relation. The geometrical relation characteristically comprises a difference in position and in pointing direction between the first and the at least one second image capturing device. Also, internal image device parameters such as field of view and/or aperture and/or resolution and/or focal length can be used for determining the geometrical relation between the first and the at least one second image capturing device.

In one example, the first and at least one of the at least one second image capturing devices are mounted to or integrated with a common platform. Then, the geometrical relation is characteristically known before setting up the system.

At least when the first and at least one second image capturing device are not mounted to or integrated with a common platform, a calibrating function may be provided for determining the relation between the first and the at least one second image capturing device. For example, calibration software may be provided for analysing images captured by the first and the at least one second image capturing device to determine a mapping between points (pixels) of the first image capturing device and points (pixels) of the second image capturing device. The mapping may, for example, be based on feature matching. The controller, as explained later in this description, is in one example arranged to implement this software/functionality.
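
As a hedged illustration of such a calibration function, the sketch below estimates a homography between a thermal and a visual frame using ORB feature matching in OpenCV. This is only one possible realisation: the disclosure does not prescribe an algorithm, cross-spectral feature matching can be unreliable in practice, and a single homography is only exact for an approximately planar scene or closely co-located cameras.

```python
import cv2
import numpy as np

def estimate_mapping(thermal_img, visual_img):
    """Estimate a homography mapping thermal-image pixels to visual-image pixels
    using ORB feature matching; one possible calibration approach, not the
    disclosure's prescribed method."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(thermal_img, None)
    kp2, des2 = orb.detectAndCompute(visual_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects outlier matches; H maps thermal coordinates to visual coordinates.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def map_point(H, x, y):
    """Map a single thermal-image point (x, y) into the visual image."""
    p = cv2.perspectiveTransform(np.float32([[[x, y]]]), H)
    return float(p[0, 0, 0]), float(p[0, 0, 1])
```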

The system 100 further comprises a controller 103. The controller comprises a processor 104 and one or more memories 105. The controller 103 is operatively connected to said first image capturing device 101 and said second image capturing device 102. The controller may be wirelessly connected to the first and second image capturing devices or connected via a wired connection. Interfaces for communication between the controller and the first and second image capturing devices are not illustrated in the figure.

The controller 103 is arranged to continuously receive first images captured by the first image capturing device 101. The controller 103 is further arranged to analyse each first image to detect occurrence of electromagnetic radiation exceeding a pre-set criterion in an identified part of the image. Thus, the controller is arranged to identify any part of the first image where the electromagnetic radiation exceeds the pre-set criterion. Potentially, a fire has occurred in any such identified part of the first image.

The controller is, when a potential fire has been detected in the identified part of the first image, arranged to obtain a second image from the second image capturing device and determine whether there is a fire in a corresponding part of the second image based on analysis of the second image. Thus, a determination of whether there is a fire is focused on that part or those parts of the monitored region of interest which has/have been identified as having potential fires from the first image.
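
A minimal sketch of these two steps is given below, assuming a raw thermal image, a simple fixed threshold as the pre-set criterion, and a homography H of the kind sketched above (or any equivalent point mapping). The threshold value and the region handling are illustrative assumptions.

```python
import cv2
import numpy as np

# Hypothetical pre-set criterion: raw thermal sensor counts above this value are
# treated as a potential fire. A real system would use a calibrated threshold.
HOTSPOT_THRESHOLD = 600

def find_potential_fires(thermal_img, threshold=HOTSPOT_THRESHOLD):
    """Return bounding boxes (x, y, w, h) of parts of the thermal image where the
    measured radiation exceeds the pre-set criterion."""
    mask = (thermal_img > threshold).astype(np.uint8)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4
    return [cv2.boundingRect(c) for c in contours]

def corresponding_region(H, box, visual_shape):
    """Map a thermal-image bounding box into the visual image using homography H."""
    x, y, w, h = box
    corners = np.float32([[[x, y]], [[x + w, y]], [[x, y + h]], [[x + w, y + h]]])
    mapped = cv2.perspectiveTransform(corners, H).reshape(-1, 2)
    x0, y0 = np.clip(mapped.min(axis=0), 0, None).astype(int)
    x1, y1 = mapped.max(axis=0).astype(int)
    return x0, y0, min(x1, visual_shape[1]), min(y1, visual_shape[0])
```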

A false alarm can be caused by many things. An important cause is sun reflections. The controller 103 may be arranged to determine whether the potential fire corresponds to a sun reflection based on the analysis of the second image. Sun reflections, e.g., in puddles of water, give strong measured intensities in both thermal and visual sensors. However, normal fires do not reach high enough temperatures to emit a significant amount of visual light, and analysing the second image is therefore an effective way to distinguish sun reflections from fires.

In an example, the second image capturing device is arranged to automatically adjust exposure time. The automatic adjustment may be controlled via a processor of the second image capturing device or the controller 103. Alternatively or in addition thereto, the exposure time may be manually adjustable, for example via a user interface. The user interface may be arranged at the second image capturing device. Instead or in addition thereto, the exposure time may be remotely user controlled, for example via an app of a user electronic device, a computer or the like. The second image capturing device may have a receiver for reception of such remote control signals. In another alternative, a plurality of second image capturing devices are provided, wherein the different second image capturing devices have different, pre-set exposure times. The controller may be arranged to select an image from one of the second image capturing devices for use in the analysis or to use a plurality or all of the second image capturing devices in the analysis, wherein the contribution from the respective image in an example is weighted or determined based on the pixel value distribution of each image.

The controller may then be arranged to form an intensity value based on the exposure time used at image capture and pixel value(s) of the second image and to determine whether the potential fire corresponds to a sun reflection based on the formed intensity value. In this context, it should be mentioned that many visual cameras automatically adjust the exposure time according to the brightness of the scene (the area being imaged). This means that a strong light source will give high pixel values if it is strong relative to other parts of the same scene, even if it is weak by absolute standards, and thus it is not possible to say that a light source is strong just because it has a high pixel value. The inclusion of the exposure time in the determination resolves this problem.
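
As a sketch of this idea, an exposure-normalised intensity can be formed by dividing the (mean) pixel value of the flagged region by the exposure time, so that values are comparable across frames captured with different exposure settings. The threshold below is a hypothetical, site-dependent value, not one given in the disclosure.

```python
import numpy as np

# Hypothetical threshold in counts per second; in practice it would be calibrated
# against measured sun reflections at the monitored site.
SUN_REFLECTION_INTENSITY = 50_000.0

def exposure_normalised_intensity(region_pixels, exposure_time_s):
    """Mean pixel value of the flagged region divided by the exposure time used at
    image capture, making intensities comparable across different exposures."""
    return float(np.mean(region_pixels)) / exposure_time_s

def looks_like_sun_reflection(region_pixels, exposure_time_s,
                              threshold=SUN_REFLECTION_INTENSITY):
    """True if the region is bright in absolute terms, indicating a strong light
    source such as a sun reflection rather than a fire."""
    return exposure_normalised_intensity(region_pixels, exposure_time_s) > threshold
```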

Accordingly, the second image capturing device may be arranged to automatically adjust the exposure time such that light sources such as sun reflections result in saturated pixels of the second image capturing device, or in pixel values within the dynamic range, while parts without light sources, or where the light source is a fire, get pixel values close to zero. The automatic adjustment may be a way of handling varying daylight levels and thereby a varying strength of sun reflections.

Further, the second, visual image capturing device may comprise a polarizing filter, thereby allowing for improved detection of sun reflections. The sun reflections are usually polarized and using a polarizing filter could help in detecting sun reflections.

Further, the second, visual image capturing device may comprise an optical band pass filter, wherein the band pass filter optionally is arranged to let through blue and/or ultraviolet wavelengths while stopping red and/or infrared wavelengths. Thereby, improved detection of sun reflections can be achieved. A normal fire will emit some radiation in red and infrared wavelengths but only a very small amount of radiation in blue and ultraviolet wavelengths. Thus, blocking red and/or infrared radiation will reduce the probability of sensing radiation from a fire.

The second, visual image capturing device may have a plurality of wavelength bands. The controller may then be arranged to determine whether the potential fire corresponds to a sun reflection based on a relation between pixel values for the different wavelength bands.
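
A minimal sketch of such a band-ratio test is given below, assuming an 8-bit colour image in OpenCV BGR channel order and a hypothetical blue-to-red ratio threshold; the disclosure does not specify which bands or thresholds to use.

```python
import numpy as np

def band_ratio_indicates_sun(region_bgr, blue_red_ratio_threshold=0.8):
    """Crude spectral check on a colour region (OpenCV BGR channel order):
    sun reflections contain a substantial blue component, while flames are
    dominated by red and near-infrared light. The threshold is a hypothetical,
    site-tunable value."""
    blue = float(np.mean(region_bgr[:, :, 0])) + 1e-6
    red = float(np.mean(region_bgr[:, :, 2])) + 1e-6
    return (blue / red) > blue_red_ratio_threshold  # blue-rich -> likely sun reflection
```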

Another important cause of false alarms is vehicles, such as wheel loaders. The controller may be arranged to analyse the first image and/or second image to detect presence of a vehicle and to determine whether the potential fire corresponds to the detected vehicle. Thus, an algorithm arranged to recognize vehicles may be used. For example, the controller may be arranged to identify vehicles from the second image based on visual characteristics of vehicles. In addition or instead, the controller may be arranged to identify vehicles based on heat characteristics of different types of vehicles from the first images. Further or instead, the detection of a vehicle may be made based on a plurality of subsequent images captured by the first and/or second image capturing device, wherein a vehicle is detected when an object in the images has been determined to be moving, and the potential fire is then determined to correspond to the detected moving vehicle.
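
One simple way to realise the motion-based criterion is frame differencing within the region flagged by the thermal image, as sketched below; the differencing threshold, the moving-pixel fraction and the voting rule are illustrative assumptions rather than values from the disclosure.

```python
import cv2
import numpy as np

def region_is_moving(frames, box, diff_threshold=25, moving_fraction=0.2):
    """Decide whether the object inside `box` (x, y, w, h) moves across a short
    sequence of grayscale frames using simple frame differencing. A moving hot
    object is treated as a vehicle rather than a fire."""
    x, y, w, h = box
    crops = [f[y:y + h, x:x + w] for f in frames]
    moving_pairs = 0
    for prev, curr in zip(crops, crops[1:]):
        diff = cv2.absdiff(curr, prev)
        # Count the pair as "moving" if enough pixels changed noticeably.
        if np.mean(diff > diff_threshold) > moving_fraction:
            moving_pairs += 1
    return moving_pairs >= len(crops) // 2
```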

The system may further comprise at least one alarm unit 106 operatively connected to the controller 103 and arranged to generate an alarm upon fire detection. The at least one alarm unit may comprise one or more of the following:

• a sound alarm installed at the monitored site and/or an operator's site

• a visual alarm installed at the monitored site and/or an operator's site

• a mobile unit such as a smart phone or the like presenting alarm notifications, wherein the alarm unit may be implemented in an app.

The controller 103 may be arranged to initiate automatic extinguishing of the fire upon fire detection. Thus, the controller 103 may be operatively connected, for example by wire or wirelessly, to a fire extinguishing system 107. The fire extinguishing system may form part of the fire detection system 100.

The fire extinguishing system may for example comprise water cannons. A problem which may arise is that water cannons or other fire extinguishing systems may be inappropriate to use when persons are within the area covered by the extinguishing system. Normally, automatic extinguishing is used only when the monitored area is unmanned, but there can of course be situations where people are still in the area, for example unauthorized persons who have intruded, an unforeseen schedule change, etc. Therefore, the controller may further be arranged to evaluate the first and/or second images to detect persons in the image. Other person detectors may be used instead of or in addition thereto. When one or more persons have been detected, the automatic extinguishing might be limited. For example, alarm and warning signals can be given so that people in the area can move away, or the extinguishing can be stopped.
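
The resulting decision logic can be summarised in a few lines. The action names below are placeholders, and the mapping from detections to actions is a sketch of the behaviour described above rather than a prescribed implementation.

```python
def decide_action(fire_confirmed, persons_detected):
    """Sketch of the decision logic described above: automatic extinguishing is
    initiated only when a fire is confirmed and no person has been detected;
    otherwise an alternative action is taken. Action names are placeholders."""
    if not fire_confirmed:
        return "no_action"
    if persons_detected:
        return "alarm_and_warn"          # alternative action: sirens, lights, notifications
    return "initiate_extinguishing"      # e.g. start signal to water cannons
```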

In figures 2 and 3, different implementations of a system for fire detection, seen from above are illustrated. The system for fire detection may have some of the features as discussed in relation to figure 1. First and second image capturing devices 101, 102 are arranged to cover an overlapping area 201. A region of interest 200 to be monitored is at least partly located in the overlapping area 201.

The region of interest 200 to be monitored is for example located outdoors. The region of interest to be monitored comprises for example a pile of waste or a pile of biofuels or a pile of other material(s). When storing for example waste and biofuels outdoors, fires in such piles are a recurring problem.

The relation between the first and second image capturing devices 101, 102 is known, whereby a spatial point in a first image captured by the first image capturing device is associated to a corresponding spatial point or line in a second image captured by the second image capturing device and vice versa.

In the illustrated example of figure 2, the overlapping area 201 substantially covers the entire region of interest 200 to be monitored.

In the illustrated example of figure 3, the overlapping area 201 covers only a part of the region of interest 200 to be monitored. Several first image capturing device/second image capturing device pairs may then be used to cover the entire region of interest 200.

In figure 4, a method 200 for fire detection is illustrated. The method for fire detection is performed by a controller comprising a processor, said controller being operatively connected to a first image capturing device 101 comprising a sensor arranged to detect infrared radiation and to a second image capturing device 102 comprising a sensor arranged to detect visible radiation. The first and second image capturing devices are arranged to cover an overlapping area. The relation between the first and second image capturing devices is known. Thereby, a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa.

The method comprises continuously obtaining S1 first images captured by the sensor 101 of the first image capturing device. The method further comprises analysing S2 each first image to identify a potential fire. The potential fire is identified by detecting occurrence of electromagnetic radiation exceeding a pre-set criterion in a part of said first image, said occurrence indicating a potential fire in said part of the first image.

When a potential fire has been identified, the method comprises obtaining S3 a second image from the sensor of the second image capturing device.

The second obtained sensor image is in an example filtered through a polarizing filter in the beam path before the sensor.

The second obtained sensor image is in an example filtered through an optical band pass filter optionally arranged to transmit blue and/or ultraviolet wavelengths while blocking red and/or infrared wavelengths.

The method further comprises a step of analysing S4 the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.
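
Putting the steps together, a hedged end-to-end sketch of the loop S1-S4 could look as follows; all callables are placeholders for the primitives sketched earlier in the detailed description, and none of the names are defined by the disclosure.

```python
def fire_detection_loop(get_thermal_frame, get_visual_frame,
                        find_hotspots, map_region, analyse_region, on_fire):
    """End-to-end sketch of steps S1-S4, with the follow-up action (alarm and/or
    extinguishing, S5/S7) delegated to `on_fire`. All callables are placeholders
    for the primitives sketched in the detailed description above."""
    while True:
        thermal = get_thermal_frame()              # S1: continuously obtain first images
        for box in find_hotspots(thermal):         # S2: parts exceeding the pre-set criterion
            visual = get_visual_frame()            # S3: obtain a second image
            x0, y0, x1, y1 = map_region(box, visual.shape)
            if analyse_region(visual[y0:y1, x0:x1]):   # S4: confirm the potential fire
                on_fire(box)                       # S5/S7: alarm and/or extinguishing
```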

In an implementation example, the determination of whether the potential fire is a fire may comprise determining whether the potential fire corresponds to a sun reflection based on the analysis of the second image. In detail, this may involve adjusting, either manually or automatically, the exposure time of the second imaging device.

The analysis may comprise forming an intensity value based on both the exposure time used at image capture and pixel value(s) of the second image and determining whether the potential fire corresponds to a sun reflection based on the formed intensity value. As stated earlier, a strong light source will give high pixel values if it is strong relative to other parts of the same scene, even if it is weak by absolute standards, and thus it is not possible to say that a light source is strong just because it has a high pixel value. The inclusion of the exposure time in the determination resolves this problem and therefore enables reliable identification of sun reflections.

The exposure time of the second sensor 102 may be manually or automatically adjustable so that light sources such as sun reflections result in saturated pixels of the second imaging device or pixel values of the second imaging device within the dynamic range, while parts without light sources, such as a fire, get pixel values close to zero. In an option, wherein the second imaging device 102 is colour sensitive, the determination of whether the potential fire corresponds to a sun reflection is based on a relation between pixel values for the different wavelength bands.

The analysing S4 of the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire may comprise determining whether the potential fire corresponds to a vehicle based on analysis of the identified first image and/or the second image to detect an object in the form of a vehicle. The detection of a vehicle may be made based on analysis of a plurality of subsequent images captured by the first and/or second imaging device, wherein a vehicle is detected when an object in the first and/or second images has been determined to be moving.

The method may further comprise a step of generating S5 an alarm upon fire detection. The alarm may be generated through audio or light. For example, the audio alarm may be generated by means of loudspeakers. The audio alarm may be generated at the region of interest to be monitored and/or at a control room facility.

Instead or in addition thereto, the alarm may be generated on a display in a control room facility.

Instead or in addition thereto, the alarm may be a message transmitted to a user electronic device, such as a smartphone.

The method may instead or in addition comprise a step of transmitting S7 an initiation signal for initiation of automatic extinguishing of the fire upon fire detection.

The method may further comprise evaluating the identified first image and/or the second image to detect S6 persons in the identified first image and/or the second image. When no person has been detected, an initiation signal for initiation of automatic extinguishing of the fire upon fire detection may then be transmitted S7. When a person has been detected, an initiation signal for initiation of an alternative action upon fire detection may be transmitted. The alternative action may be generating an alarm S5.