

Title:
A SOLUTION FOR OBSERVING AN OBJECT PAIR WITHIN AN ENVIRONMENT COMPRISING AN ELEVATOR SYSTEM
Document Type and Number:
WIPO Patent Application WO/2024/080987
Kind Code:
A1
Abstract:
The invention relates to a method for observing an object pair (130) within an environment (100) comprising an elevator system (110). The method comprises: obtaining (310) first image data by a first imaging device (230) arranged within the environment (100); associating (320) a first object (131) and a second object (132) as an object pair (130) based on the first image data, wherein the associating comprises generating association data; obtaining (330) second image data by a second imaging device (240) arranged within the environment (100); verifying (340) whether a presence of both the first object (131) and the second object (132) of the associated object pair (130) can be detected based on the second image data by using the generated association data; and controlling (350) the elevator system (110) depending on the verification result. The invention relates also to an observation system (120) for observing an object pair (130) within an environment (100) comprising an elevator system (110), and to an elevator system (110) for observing an object pair (130).

Inventors:
NOUSU HANNU (FI)
MATTILA MIKKO (FI)
WAKIM PETER (FI)
HOTTINEN TERO (FI)
TAYLOR PAUL (FI)
Application Number:
PCT/US2022/046516
Publication Date:
April 18, 2024
Filing Date:
October 13, 2022
Assignee:
KONE CORP (FI)
KONE INC (US)
International Classes:
B66B5/00
Foreign References:
EP3656715A1 (2020-05-27)
EP3770095A1 (2021-01-27)
JP2018095436A (2018-06-21)
JP2016064910A (2016-04-28)
JP2016003097A (2016-01-12)
Attorney, Agent or Firm:
LEWIS, Paul C. (US)
Claims:
CLAIMS

1. A method for observing an object pair (130) within an environment (100) comprising an elevator system (110), the method comprising: obtaining (310) first image data by a first imaging device (230) arranged within the environment (100); associating (320) a first object (131) and a second object (132) as an object pair (130) based on the first image data, wherein the associating comprises generating association data; obtaining (330) second image data by a second imaging device (240) arranged within the environment (100); verifying (340) whether a presence of both the first object (131) and the second object (132) of the associated object pair (130) can be detected based on the second image data by using the generated association data; and controlling (350) the elevator system (110) depending on the verification result.

2. The method according to claim 1, wherein the associating (320) comprises recognizing the first object (131) and the second object (132) based on the first image data by using pattern recognition.

3. The method according to claim 2, wherein the pattern recognition is based on predefined computer vision (CV) models of the first object (131) and the second object (132).

4. The method according to claim 2 or 3, wherein the associating (320) further comprises making a decision on the association based on observing the behavior of the first object (131) in relation to the second object (132) based on the first image data, observing position of the first object (131) and position of the second object (132) relative to each other based on the first image data, and/or using statistical data.

5. The method according to any of the preceding claims, wherein the association data comprises a unique pair identifier (ID) of the associated object pair (130), recognition data of the first object (131), and/or recognition data of the second object (132).

6. The method according to any of the preceding claims, wherein the verifying (340) comprises: recognizing (410) based on the second image data one of the following: the first object (131) or the second object (132), detecting (420) based on the association data that the recognized first object (131) or second object (132) belongs to the associated object pair (130), and arriving (430) at a positive verification result, if the other one of the following: the first object (131) or the second object (132), is recognized based on the second image data and the association data; or arriving (440) at a negative verification result, if the other one of the following: the first object (131) or the second object (132), is not recognized based on the second image data and the association data.

7. The method according to any of the preceding claims, wherein the second imaging device (240) is arranged: inside an elevator car (140a-140n) of the elevator system (110) and the second image data covers said elevator car (140a-140n) at least partly, or inside an elevator lobby (260) of the elevator system (110) and the second image data covers an entrance of an elevator car of the elevator system (110).

8. The method according to claim 7, wherein in response to a negative verification result, the controlling (350) the elevator system (110) comprises generating an instruction to keep a door of said elevator car (140a-140n) open and/or generating an alarm.

9. An observation system (120) for observing an object pair (130) within an environment (100) comprising an elevator system (110), the observation system (120) comprises: an imaging device arrangement (210) comprising at least a first imaging device (230) and a second imaging device (240) arranged within the environment (100), wherein the first imaging device (230) is configured to obtain first image data and the second imaging device (240) is configured to obtain second image data; and a control system (220) communicatively coupled to the imaging device arrangement (210) and to the elevator system (110), wherein the control system (220) is configured to: associate a first object (131) and a second object (132) as an object pair (130) based on the first image data, wherein the associating comprises generating association data; verify whether a presence of both the first object (131) and the second object (132) of the associated object pair (130) can be detected based on the second image data by using the generated association data; and control the elevator system (110) depending on the verification result.

10. The observation system (120) according to claim 9, wherein the associating comprises that the control system (220) is configured to recognize the first object (131) and the second object (132) based on the first image data by using pattern recognition.

11. The observation system (120) according to claim 10, wherein the pattern recognition is based on predefined computer vision (CV) models of the first object (131) and the second object (132).

12. The observation system (120) according to claim 10 or 11, wherein the associating further comprises that the control system (220) is configured to make a decision on the association based on observing the behavior of the first object (131) in relation to the second object (132) based on the first image data, observing position of the first object (131) and position of the second object (132) relative to each other based on the first image data, and/or using statistical data.

13. The observation system (120) according to any of claims 9 to 12, wherein the association data comprises a unique pair identifier (ID) of the associated object pair (130), recognition data of the first object (131), and/or recognition data of the second object (132).

14. The observation system (120) according to any of claims 9 to 13, wherein the verifying comprises that the control system (220) is configured to: recognize based on the second image data one of the following: the first object (131) or the second object (132), detect based on the association data that the recognized first object (131) or second object (132) belongs to the associated object pair (130), and arrive at a positive verification result, if the other one of the following: the first object (131) or the second object (132), is recognized based on the second image data and the association data; or arrive at a negative verification result, if the other one of the following: the first object (131) or the second object (132), is not recognized based on the second image data and the association data.

15. The observation system (120) according to any of claims 9 to 14, wherein the second imaging device (240) is arranged: inside an elevator car (140a-140n) of the elevator system (110) and the second image data covers said elevator car (140a-140n) at least partly, or inside an elevator lobby (260) and the second image data covers an entrance of an elevator car (140a-140n) of the elevator system (110).

16. The observation system (120) according to claim 15, wherein in response to a negative verification result the controlling the elevator system (110) comprises that the control system (220) is configured to generate an instruction to keep a door of said elevator car (140a-140n) open and/or to generate an alarm.

17. An elevator system (110) for observing an object pair (130), the elevator system (110) comprises: one or more elevator cars (140a-140n) configured to travel along a respective elevator shaft between a plurality of floors, and an observation system (120) according to any of claims 9 to 16.

Description:
A solution for observing an object pair within an environment comprising an elevator system

TECHNICAL FIELD

The invention concerns in general the technical field of elevator systems. In particular, the invention concerns the observation of an object pair within environments comprising an elevator system.

BACKGROUND

In a building environment, persons with an associated object (e.g. owners with a pet or owners with luggage) may walk around the building. When the building environment comprises an elevator system, the persons with the associated object may also take elevator journeys from one floor to another. There may exist situations where the owner and the associated object are separated from each other. For example, the associated object may end up on the opposite side of the elevator car doors from the owner, such as when the owner exits or enters an elevator car and the associated object stays inside the elevator car or at the elevator lobby, respectively. This may cause an emergency situation, for example, if the associated object is a pet (e.g. a dog) and the dog does not follow the owner quickly when the owner exits or enters the elevator car. Typically, elevator cars comprise a curtain of light (COL) for detecting objects between the elevator doors. However, the COL may not detect a thin leash or a small dog between the elevator doors. The dog is then at high risk of being seriously harmed if the elevator doors close on the leash and the owner and the dog end up on different sides of the elevator doors. In a worst-case scenario, the leash may pull the dog back into the elevator doors and strangle it or cause serious injury.

Thus, there is a need to further develop solutions for ensuring that the owner and the associated object travel together throughout a journey along a building, especially throughout an elevator journey.

SUMMARY

The following presents a simplified summary in order to provide basic understanding of some aspects of various invention embodiments. The summary is not an extensive overview of the invention. It is neither intended to identify key or critical elements of the invention nor to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a simplified form as a prelude to a more detailed description of exemplifying embodiments of the invention.

An objective of the invention is to present a method and an observation system for observing an object pair within an environment comprising an elevator system, and an elevator system for observing an object pair. Another objective of the invention is that the method, the observation system, and the elevator system improve the avoidance of possible emergency situations.

The objectives of the invention are reached by a method, an observation system, and an elevator system as defined by the respective independent claims.

According to a first aspect, a method for observing an object pair within an environment comprising an elevator system is provided, wherein the method comprises: obtaining first image data by a first imaging device arranged within the environment; associating a first object and a second object as an object pair based on the first image data, wherein the associating comprises generating association data; obtaining second image data by a second imaging device arranged within the environment; verifying whether a presence of both the first object and the second object of the associated object pair can be detected based on the second image data by using the generated association data; and controlling the elevator system depending on the verification result.
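For illustration only, the flow of the first aspect can be sketched in a few lines of Python; the label-set stand-in for image data and all names below (`associate`, `verify`, `control`) are assumptions made for this sketch, not part of the claimed method.

```python
# Hypothetical sketch of steps 310-350: "image data" is mocked as sets
# of recognized object labels rather than actual camera frames.

def associate(first_image_labels):
    """Step 320: pair a human object with its accompanying non-human
    object, generating association data (pair ID plus recognition data)."""
    human = next(l for l in sorted(first_image_labels) if l.startswith("person"))
    other = next(l for l in sorted(first_image_labels) if not l.startswith("person"))
    return {"pair_id": 1, "first": human, "second": other}

def verify(second_image_labels, association):
    """Step 340: both members of the associated pair must be detected."""
    return (association["first"] in second_image_labels
            and association["second"] in second_image_labels)

def control(elevator_state, verified):
    """Step 350: on a negative result, hold the door and raise an alarm."""
    if not verified:
        elevator_state["door"] = "held_open"
        elevator_state["alarm"] = True
    return elevator_state

# Entrance camera (step 310) sees owner and dog; the in-car camera
# (step 330) sees only the owner, so the door is held open.
assoc = associate({"person_yellow_coat", "dog_black_fur"})
state = control({"door": "closing", "alarm": False},
                verify({"person_yellow_coat"}, assoc))
```

In a real system the association and verification would of course operate on recognition data produced by computer vision, not on pre-labeled sets.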

The associating may comprise recognizing the first object and the second object based on the first image data by using pattern recognition.

The pattern recognition may be based on predefined computer vision (CV) models of the first object and the second object.

Alternatively or in addition, the associating may further comprise making a decision on the association based on observing the behavior of the first object in relation to the second object based on the first image data, observing the position of the first object and the position of the second object relative to each other based on the first image data, and/or using statistical data.

The association data may comprise a unique pair identifier (ID) of the associated object pair, recognition data of the first object, and/or recognition data of the second object.

The verifying may comprise: recognizing based on the second image data one of the following: the first object or the second object, detecting based on the association data that the recognized first object or second object belongs to the associated object pair, and arriving at a positive verification result, if the other one of the following: the first object or the second object, is recognized based on the second image data and the association data; or arriving at a negative verification result, if the other one of the following: the first object or the second object, is not recognized based on the second image data and the association data.
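This two-stage verifying logic can be sketched as follows, under the simplifying assumption that recognition reduces to label membership in the second image data; the dictionary-shaped association record is likewise a hypothetical stand-in.

```python
def verify_pair(second_image_labels, association):
    """Hypothetical sketch of the verifying step: recognize one member of
    the pair, confirm it belongs to the pair, then require the other."""
    first, second = association["first"], association["second"]

    # Recognize one of the two objects based on the second image data.
    recognized = next((obj for obj in (first, second)
                       if obj in second_image_labels), None)
    if recognized is None:
        return None  # neither member detected: nothing to verify yet

    # The recognized object belongs to the associated pair by construction
    # here; a real system would match recognition data against the pair ID.
    other = second if recognized == first else first

    # Positive result only if the other member is also recognized.
    return other in second_image_labels

# Owner and dog both seen -> positive; owner alone -> negative.
assoc = {"pair_id": 7, "first": "owner", "second": "dog"}
```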

The second imaging device may be arranged: inside an elevator car of the elevator system and the second image data covers said elevator car at least partly, or inside an elevator lobby of the elevator system and the second image data covers an entrance of an elevator car of the elevator system.

In response to a negative verification result, the controlling of the elevator system may comprise generating an instruction to keep a door of said elevator car open and/or generating an alarm.

According to a second aspect, an observation system for observing an object pair within an environment comprising an elevator system is provided, wherein the observation system comprises: an imaging device arrangement comprising at least a first imaging device and a second imaging device arranged within the environment, wherein the first imaging device is configured to obtain first image data and the second imaging device is configured to obtain second image data; and a control system communicatively coupled to the imaging device arrangement and to the elevator system, wherein the control system is configured to: associate a first object and a second object as an object pair based on the first image data, wherein the associating comprises generating association data; verify whether a presence of both the first object and the second object of the associated object pair can be detected based on the second image data by using the generated association data; and control the elevator system depending on the verification result.

The associating may comprise that the control system is configured to recognize the first object and the second object based on the first image data by using pattern recognition.

The pattern recognition may be based on predefined computer vision (CV) models of the first object and the second object.

Alternatively or in addition, the associating may further comprise that the control system is configured to make a decision on the association based on observing the behavior of the first object in relation to the second object based on the first image data, observing the position of the first object and the position of the second object relative to each other based on the first image data, and/or using statistical data.

The association data may comprise a unique pair identifier (ID) of the associated object pair, recognition data of the first object, and/or recognition data of the second object.

The verifying may comprise that the control system is configured to: recognize based on the second image data one of the following: the first object or the second object, detect based on the association data that the recognized first object or second object belongs to the associated object pair, and arrive at a positive verification result, if the other one of the following: the first object or the second object, is recognized based on the second image data and the association data; or arrive at a negative verification result, if the other one of the following: the first object or the second object, is not recognized based on the second image data and the association data.

The second imaging device may be arranged: inside an elevator car of the elevator system and the second image data covers said elevator car at least partly, or inside an elevator lobby and the second image data covers an entrance of an elevator car of the elevator system.

In response to a negative verification result, the controlling of the elevator system may comprise that the control system is configured to generate an instruction to keep a door of said elevator car open and/or to generate an alarm.

According to a third aspect, an elevator system for observing an object pair is provided, wherein the elevator system comprises: one or more elevator cars configured to travel along a respective elevator shaft between a plurality of floors, and an observation system as described above.

Various exemplifying and non-limiting embodiments of the invention both as to constructions and to methods of operation, together with additional objects and advantages thereof, will be best understood from the following description of specific exemplifying and non-limiting embodiments when read in connection with the accompanying drawings.

The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of unrecited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of “a” or “an”, i.e. a singular form, throughout this document does not exclude a plurality.

BRIEF DESCRIPTION OF FIGURES

The embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.

Figure 1 illustrates schematically an example of an environment comprising an elevator system.

Figure 2A illustrates schematically an example of entities of an observation system.

Figure 2B illustrates schematically an example implementation of imaging devices of an imaging device arrangement within the environment.

Figure 3 illustrates schematically an example of a method for observing an object pair within the environment comprising the elevator system.

Figure 4 illustrates schematically an example of a verifying process performed by a control system.

Figure 5 illustrates schematically an example of components of the control system.

DESCRIPTION OF THE EXEMPLIFYING EMBODIMENTS

Figure 1 illustrates schematically an example of an environment 100 comprising an elevator system 110. An observation system 120 is implemented within the environment 100 for observing an object pair 130 within the environment 100. The environment may, for example but not limited to, be a building environment, e.g. an office building, a public building (e.g. a station, an airport, a shopping center, etc.), or a residential building. The environment 100 may cover at least one floor of the building. In other words, the environment may cover only one floor of the building, more than one floor of the building, or all floors of the building. The object pair 130 comprises a first object 131 and a second object 132. The first object 131 may for example be a human object, e.g. a person. The second object may be a non-human object, e.g. a pet, luggage, or any other non-human object accompanied by the first object being a human object. In the example of Figure 1, the second object 132 is a pet (i.e. a dog), but the second object 132 may also be any other non-human object accompanied by the first object 131.

The elevator system 110 comprises one or more elevator cars 140a-140n configured to travel along a respective elevator shaft between a plurality of floors, i.e. landings. In the example of Figure 1, the elevator system 110 comprises four elevator cars 140a-140n, but the number of elevator cars 140a-140n of the elevator system 110 is not limited and the elevator system 110 may comprise any number of elevator cars 140a-140n. The elevator system 110 may comprise one or more elevator groups, i.e. one or more groups of two or more elevator cars 140a-140n each travelling along a separate elevator shaft and configured to operate as a unit serving the same landings. For example, the four elevator cars 140a-140n of the elevator system 110 of the example of Figure 1 may form one elevator group. The elevator system 110 further comprises a hoisting machine configured to drive the one or more elevator cars 140a-140n along the respective elevator shafts between the floors, and an elevator control unit 150 configured to control the operation of the elevator system 110 at least in part. The elevator control unit 150 may comprise an elevator controller and/or one or more other local controllers, e.g. add-on installation controllers. The elevator control unit 150 may for example reside in a machine room and/or on one of the floors. For the sake of clarity, the elevator shaft(s), the plurality of floors, the hoisting machine, and the machine room are not illustrated in Figure 1. The elevator system 110 may further comprise the observation system 120. Furthermore, the elevator system 110 may comprise one or more known elevator related entities, e.g. user interface devices, safety circuits and devices, elevator brakes, and/or elevator doors, etc., which are not shown in Figure 1 for the sake of clarity.

Figure 2A illustrates schematically an example of entities of the observation system 120. The observation system 120 comprises an imaging device arrangement 210 and a control system 220. The control system 220 is communicatively coupled to the imaging device arrangement 210. The communication between the control system 220 and the imaging device arrangement 210 may be based on one or more known communication technologies, either wired or wireless. Furthermore, the control system 220 is communicatively coupled to the elevator system 110, e.g. through the elevator control unit 150. The communication between the control system 220 and the elevator system 110 may be based on one or more known communication technologies, either wired or wireless. The imaging device arrangement 210 comprises at least a first imaging device 230 arranged within the environment 100 and a second imaging device 240 arranged within the environment 100. The first imaging device 230 is configured to obtain (i.e. produce) first image data. The first imaging device 230 may comprise one or more imaging related entities for producing the first image data. The second imaging device 240 is configured to obtain second image data. The second imaging device 240 may comprise one or more imaging related entities for producing the second image data. The first imaging device 230 may for example be arranged in a vicinity of an entrance of the environment 100, e.g. an entrance of a building, if the environment 100 is a building environment, so that the first image data obtained by the first imaging device 230 covers the entrance of the building at least partly. This enables the object pair 130 to be detected based on the first image data obtained by the first imaging device 230 when the object pair 130 enters the building. However, the first imaging device 230 may alternatively be arranged at any other location within the environment 100.
According to an example, the second imaging device 240 may be arranged inside an elevator car 140a-140n of the elevator system 110 (e.g. inside one of the one or more elevator cars 140a-140n of the elevator system 110) so that the second image data obtained by the second imaging device 240 covers said elevator car 140a-140n at least partly. According to another example, the second imaging device 240 may be arranged inside an elevator lobby 260 of the elevator system 110 so that the second image data obtained by the second imaging device 240 covers an entrance of an elevator car of the elevator system 110 (e.g. an entrance of one of the one or more elevator cars 140a-140n of the elevator system 110) at least partly. However, the second imaging device 240 may alternatively be arranged at any other location within the environment 100. The imaging device arrangement 210 may further comprise one or more further imaging devices 250a-250n arranged within the environment 100. Each further imaging device 250a-250n may be configured to obtain further image data. Each further imaging device 250a-250n may comprise one or more imaging related entities for producing the further image data. The one or more further imaging devices 250a-250n may be arranged around the environment 100. Preferably, the imaging devices (including the first imaging device 230, the second imaging device 240, and the one or more further imaging devices 250a-250n) of the imaging device arrangement 210 may be arranged around the environment 100 so that substantially the whole environment 100 may be covered by the image data obtained by the imaging devices 230, 240, 250a-250n of the imaging device arrangement 210.

Figure 2B illustrates schematically an example implementation of the imaging devices 230, 240, 250a-250n of the imaging device arrangement 210 within the environment 100. In the example of Figure 2B, the environment 100 is a building environment comprising at least an elevator lobby 260. The environment 100 may further comprise one or more further areas and/or floors of the building, which are not shown in Figure 2B for the sake of clarity. In the example of Figure 2B, the first imaging device 230 is arranged in a vicinity of the entrance 265 of the building so that the first image data obtained by the first imaging device 230 covers the entrance 265 of the building at least partly. In the example of Figure 2B, the second imaging device 240 is arranged inside an elevator car 140a of the elevator system 110 so that the second image data obtained by the second imaging device 240 covers the elevator car 140a at least partly. In the example of Figure 2B, some non-limiting example locations for arranging the possible (i.e. optional) one or more further imaging devices 250a-250n of the imaging device arrangement 210 are illustrated. In the example of Figure 2B, the control system 220 comprises at least the elevator control unit 150.

The first imaging device 230 may be a camera, an IR sensor, a radar sensor, or any other sensor device capable of detecting objects, e.g. the first object 131 and/or the second object 132. Preferably, the first imaging device 230 is a camera to enable reliable observation of the behavior of objects, e.g. the first object 131 and/or the second object 132. Alternatively or in addition, the second imaging device 240 may be a camera, an IR sensor, a radar sensor, or any other sensor device capable of detecting objects, e.g. the first object 131 and/or the second object 132. Preferably, the second imaging device 240 is a camera to enable reliable observation of the behavior of objects, e.g. the first object 131 and/or the second object 132. If the imaging device arrangement 210 further comprises the one or more further imaging devices 250a-250n, each of the one or more further imaging devices 250a-250n may be a camera, an IR sensor, a radar sensor, or any other sensor device capable of detecting an object, e.g. the first object 131 and/or the second object 132.

The imaging device arrangement 210 may further comprise a common control unit 211 configured to control the operation of the imaging device arrangement 210 at least in part and/or process image data obtained by one or more imaging devices 230, 240, 250a-250n of the imaging device arrangement 210 at least in part, i.e. the common control unit 211 may take part in the processing of the image data obtained by the one or more imaging devices 230, 240, 250a-250n of the imaging device arrangement 210. Alternatively or in addition, each imaging device 230, 240, 250a-250n of the imaging device arrangement 210 may comprise an individual control unit 231, 241, 251a-251n configured to control the operation of said imaging device 230, 240, 250a-250n and/or process image data obtained by said imaging device 230, 240, 250a-250n at least in part. In other words, the first imaging device 230 may comprise a control unit 231 configured to control the operation of the first imaging device 230 and/or process the first image data obtained by the first imaging device 230 at least in part, the second imaging device 240 may comprise a control unit 241 configured to control the operation of the second imaging device 240 and/or process the second image data obtained by the second imaging device 240 at least in part, and the one or more further imaging devices 250a-250n may each comprise a control unit 251a-251n configured to control the operation of said further imaging device 250a-250n and/or process the further image data obtained by said further imaging device 250a-250n at least in part. The processing of the image data (e.g. the first image data, the second image data, and/or the further image data) may for example comprise at least an association process and a verification process described later in this application. The processing may further comprise any other kind of processing of the obtained image data.
Each imaging device of the imaging devices 230, 240, 250a-250n of the imaging device arrangement 210 may be communicatively coupled with at least one other imaging device 230, 240, 250a-250n of the imaging device arrangement 210 directly and/or via the common control unit 211 of the imaging device arrangement 210. The common control unit 211 of the imaging device arrangement 210 may comprise one or more communication interfaces for communication with one or more imaging devices 230, 240, 250a-250n of the imaging device arrangement 210. The control unit of each imaging device 230, 240, 250a-250n of the imaging device arrangement 210 may comprise one or more communication interfaces for communication with the common control unit 211 of the imaging device arrangement 210 and/or with at least one other imaging device 230, 240, 250a-250n of the imaging device arrangement 210. The communication between the imaging devices 230, 240, 250a-250n of the imaging device arrangement 210 (either directly or via the common control unit 211 of the imaging device arrangement 210) may be based on one or more known communication technologies, either wired or wireless. The control system 220 may comprise the elevator control unit 150 of the elevator system 110, the common control unit 211 of the imaging device arrangement 210, the control unit 231 of the first imaging device 230, the control unit 241 of the second imaging device 240, the control unit(s) 251a-251n of the one or more further imaging devices 250a-250n, and/or a remote control unit. The remote control unit may be any control unit residing remote from the environment 100.

Next, an example of a method for observing the object pair 130 within the environment 100 comprising the elevator system 110 is described with reference to Figure 3. Figure 3 illustrates schematically the method as a flow chart. The method is described from now on by using the first imaging device 230 and the second imaging device 240 of the imaging device arrangement 210. However, the imaging device arrangement may further comprise the one or more further imaging devices 250a-250n, each further imaging device 250a-250n being capable of performing the operations of the first imaging device 230 and/or the operations of the second imaging device 240 that will be described below.

At a step 310, the first imaging device 230 obtains the first image data. The first object 131 and the second object 132 may be detected (i.e. recognized) based on the first image data. In other words, the first object 131 and the second object 132 reside, i.e. are located, within an imaging view covered by the first image data. The first image data may comprise video image data and/or multiple consecutive image frames. The first image data obtained by the first imaging device 230 is provided to the control system 220.

At a step 320, the control system 220 associates the first object 131 and the second object 132 as the object pair 130 based on the first image data. The associating comprises generating association data. The association data may comprise a unique pair identifier (ID) of the associated object pair 130, recognition data of the first object 131, and/or recognition data of the second object 132. The recognition data of the first object 131 may comprise one or more features of the first object 131, position data of the first object 131, and/or speed data of the first object 131, etc. The one or more features of the first object 131 may for example comprise, but are not limited to, one or more specific features distinguishing the first object 131, e.g. a specific color, a specific garment, and/or a specific accessory, etc., from other similar objects. According to a non-limiting example, the one or more features of the first object 131 may comprise a coat of a specific color, e.g. yellow. The recognition data of the second object 132 may comprise one or more features of the second object 132, position data of the second object 132, and/or speed data of the second object 132, etc. The one or more features of the second object 132 may for example comprise, but are not limited to, one or more specific features distinguishing the second object 132, e.g. a specific color, a specific garment, and/or a specific accessory, etc., from other similar objects. According to a non-limiting example, the one or more features of the second object 132 being a dog may comprise fur of a specific color, e.g. black, and/or a collar of a specific color, e.g. red. The associating may further comprise recognizing the first object 131 and the second object 132 based on the first image data by using pattern recognition. The pattern recognition may be based on predefined computer vision (CV) models of the first object 131 and the second object 132.
For example, the predefined CV model of the first object 131 may be predefined, e.g. built or formed, to learn human objects so that the human objects (e.g. the first object 131) may be recognized from image data, e.g. the first image data. Similarly, the predefined CV model of the second object 132 may be predefined, e.g. built or formed, to learn one or more typical non-human objects accompanying the human objects (e.g. a pet, luggage, and/or any other non-human object accompanying human objects) so that the non-human objects (e.g. the second object 132) accompanying the human object (e.g. the first object 131) may be recognized from image data, e.g. the first image data. Alternatively or in addition, the associating may further comprise making a decision on the association based on observing the behavior of the first object 131 in relation to the second object 132 based on the first image data, observing the position of the first object 131 and the position of the second object 132 relative to each other based on the first image data, and/or using statistical data. According to an example, the positions of the first object 131 and the second object 132 relative to each other based on the first image data, e.g. the first object 131 and the second object 132 being close to each other, may indicate that the first object 131 and the second object 132 are travelling together, which in turn may be used by the control system 220 to make the decision on the association of the first object 131 and the second object 132 as the object pair 130. According to another example, the behavior of the first object 131 in relation to the second object 132 based on the first image data, e.g. the first object 131 guiding and/or waiting for the second object 132, may indicate that the first object 131 and the second object 132 are travelling together, which in turn may be used by the control system 220 to make the decision on the association of the first object 131 and the second object 132 as the object pair 130.
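The association data described above may be illustrated with a short sketch. The following is only a hypothetical example of how such a record could be shaped (the class and field names are not from this application; the patent does not prescribe any data format):

```python
# Illustrative sketch of association data generated at step 320.
# All identifiers here are hypothetical, not part of the claimed method.
from dataclasses import dataclass, field
import uuid


@dataclass
class RecognitionData:
    features: list          # e.g. ["yellow coat"] or ["black fur", "red collar"]
    position: tuple         # position data within the imaging view
    speed: float            # speed data of the object


@dataclass
class AssociationData:
    first: RecognitionData   # recognition data of the first object (131)
    second: RecognitionData  # recognition data of the second object (132)
    # unique pair identifier (ID) of the associated object pair (130)
    pair_id: str = field(default_factory=lambda: uuid.uuid4().hex)


# Example pair: a person in a yellow coat travelling with a black dog
# wearing a red collar, matching the non-limiting examples in the text.
pair = AssociationData(
    first=RecognitionData(["yellow coat"], (1.0, 2.0), 1.2),
    second=RecognitionData(["black fur", "red collar"], (1.3, 2.1), 1.2),
)
```

The unique pair ID allows the verification process described later to look up both members of the pair from a single recognized object.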

At a step 330, the second imaging device 240 obtains second image data. The second image data may comprise at least one image frame (i.e. at least one still image), video image data, and/or multiple consecutive image frames. The second image data obtained by the second imaging device 240 is provided to the control system 220.

At a step 340, the control system 220 verifies whether a presence of both the first object 131 and the second object 132 of the associated object pair 130 can be detected based on the second image data by using the generated association data. In other words, the control system 220 verifies whether both the first object 131 and the second object 132 may be detected based on the second image data and the generated association data, i.e. whether both the first object 131 and the second object 132 of the associated object pair 130 may be seen inside the same view, i.e. a view formed by the second image data. Figure 4 illustrates schematically the verifying process performed by the control system 220 at the step 340, i.e. the verification process of the associated object pair 130. At a step 410 of the verifying process, the control system 220 recognizes, based on the second image data, one of the following: the first object 131 or the second object 132.

At a step 420 of the verifying process, the control system 220 detects based on the association data that the recognized first object 131 or second object 132 belongs to the associated object pair 130. In other words, the control system 220 uses the association data to detect that the recognized first object 131 or second object 132 belongs to the associated object pair 130.

After detecting that the first object 131 or the second object 132 belongs to the associated object pair 130 at the step 420, if the other one of the first object 131 and the second object 132 is also recognized based on the second image data and the association data, the control system 220 arrives at a positive verification result at a step 430 of the verifying process. In other words, if the first object 131 is recognized based on the second image data and the association data at the steps 410 and 420, the control system 220 arrives at the positive verification result at the step 430 if the second object 132 is also recognized based on the second image data and the association data, and vice versa.

Alternatively, if the other one of the first object 131 and the second object 132 is not recognized based on the second image data and the association data, the control system 220 arrives at a negative verification result at a step 440 of the verifying process. In other words, if the first object 131 is recognized based on the second image data and the association data at the steps 410 and 420, the control system 220 arrives at the negative verification result at the step 440 if the second object 132 is not recognized based on the second image data and the association data, and vice versa.
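The verifying process of the steps 410-440 may be summarized with a minimal sketch. The function below is an assumed illustration only (the identifiers are hypothetical): given the set of object identifiers recognized in the second image data and the pair recorded in the association data, the result is positive only when both members of the associated object pair are detected in the same view.

```python
# Hypothetical sketch of the verifying process (steps 410-440).
def verify_pair(recognized_ids: set, pair: tuple) -> bool:
    first, second = pair
    # Steps 410/420: at least one recognized object must belong to the
    # associated object pair for this verification to apply at all.
    if first not in recognized_ids and second not in recognized_ids:
        return False
    # Step 430: positive verification result when both members of the pair
    # are recognized; step 440: negative verification result otherwise.
    return first in recognized_ids and second in recognized_ids
```

For example, `verify_pair({"person_131"}, ("person_131", "dog_132"))` yields `False` (a negative verification result), mirroring the situation where the first object has entered the elevator car but the second object has stayed in the elevator lobby.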

At a step 350, the control system 220 controls the elevator system 110 depending on the verification result. The controlling may for example comprise generating one or more instructions to the elevator system 110 and/or generating one or more alarms. The one or more instructions generated to the elevator system 110 may for example comprise an instruction to open a door of at least one elevator car 140a-140n, an instruction to close a door of at least one elevator car 140a-140n, an instruction to keep a door of at least one elevator car 140a-140n open, an instruction to keep a door of at least one elevator car 140a-140n closed, and/or an instruction to drive at least one elevator car 140a-140n to a specific floor. The one or more alarms may for example be generated to a building service center (e.g. in case of a smart building), an elevator service center, or any other service center.

According to an example, if the second imaging device 240 is inside an elevator car 140a-140n of the elevator system 110 or inside the elevator lobby 260 of the elevator system 110 so that the second image data obtained by the second imaging device 240 covers an entrance of an elevator car of the elevator system 110 at least partly, in response to the negative verification result, the controlling of the elevator system 110 by the control system 220 may comprise generating an instruction to keep a door of said elevator car 140a-140n open and/or generating an alarm. The control system 220 may for example control the elevator system 110 to keep the door of said elevator car 140a-140n open until the control system 220 arrives at the positive verification result (i.e. both the first object 131 and the second object 132 are recognized based on the second image data and the association data) or the first object 131 is no longer detected based on the second image data.

Next, a non-limiting example situation for observing the object pair 130 within the environment 100 comprising the elevator system 110 by the observation system 120 described above is presented. In this example, the first imaging device 230 is arranged in an elevator lobby 260 of the elevator system 110 at a first floor of the building and the second imaging device 240 is arranged inside an elevator car 140a of the elevator system 110. When the first object 131 together with the second object 132 enters the building, the first object 131 and the second object 132 may for example be detected to enter the elevator lobby 260 based on the first image data obtained by the first imaging device 230. The control system 220 associates the first object 131 and the second object 132 as the object pair 130 based on the first image data as described above. After entering the elevator lobby 260, the first object 131 and the second object 132 travel within the environment 100 and the second image data is obtained by the second imaging device 240, but only the first object 131 is detected based on the second image data and the second object 132 cannot be detected based on the second image data and the association data (i.e. the control system 220 arrives at the negative verification result). An example of such a situation may be that the first object 131 enters the elevator car 140a of the elevator system 110, but for some reason the second object 132 stays at the elevator lobby 260. In response to the negative verification result, the control system 220 controls the elevator system 110 by generating an instruction to keep the door of said elevator car 140a open. The control system 220 may for example control the elevator system 110 to keep the door of said elevator car 140a open until the control system 220 arrives at the positive verification result (i.e. both the first object 131 and the second object 132 are recognized based on the second image data and the association data) or the first object 131 is no longer detected based on the second image data. For example, in response to the positive verification result the control system 220 may control the elevator system 110 to close the door of said elevator car 140a and possibly also control the elevator system 110 to drive said elevator car 140a to a specific floor. Alternatively or in addition, in response to the negative verification result the control system 220 may generate an alarm as discussed above. In the above example situation, a possible emergency situation may be avoided, and it may be ensured that the first object 131 and the second object 132 travel together throughout the journey along the building, especially throughout the elevator journey.
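The door-holding behavior of the example situation above can be sketched as a simple loop. This is a hypothetical illustration only; all callables are stand-ins for the verification and detection steps described in the text, not part of the claimed method:

```python
# Hypothetical sketch of the step 350 door control in the example situation:
# hold the car door open while the verification result is negative, until
# either a positive result is reached or the first object leaves the view.
def hold_door(verify, first_detected, keep_open, close_door):
    while not verify():
        if not first_detected():
            # First object (131) no longer detected based on the second
            # image data: stop holding the door.
            return "released"
        keep_open()   # negative verification result: keep the door open
    close_door()      # positive verification result: the door may close
    return "closed"
```

In the example, `verify` would return `True` once the second object 132 has joined the first object 131 inside the elevator car, after which the car may also be driven to a specific floor.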

Figure 5 illustrates schematically an example of components of the control system 220. The control system 220 may comprise a processing unit 510 comprising one or more processors, a memory unit 520 comprising one or more memories, a communication unit 530 comprising one or more communication devices, and possibly a user interface (UI) unit 540. The mentioned elements may be communicatively coupled to each other with e.g. an internal bus. The memory unit 520 may store and maintain portions of a computer program (code) 525, the obtained image data, and any other data. The computer program 525 may comprise instructions which, when the computer program 525 is executed by the processing unit 510 of the control system 220, may cause the processing unit 510, and thus the control system 220, to carry out desired tasks, e.g. one or more of the method steps relating to the control system 220 described above. The processing unit 510 may thus be arranged to access the memory unit 520 and retrieve and store any information therefrom and thereto. For the sake of clarity, the processor herein refers to any unit suitable for processing information and controlling the operation of the control system 220, among other tasks. The operations may also be implemented with a microcontroller solution with embedded software. Similarly, the memory unit 520 is not limited to a certain type of memory only, but any memory type suitable for storing the described pieces of information may be applied in the context of the present invention. The communication unit 530 provides one or more communication interfaces for communication with any other unit, e.g. the imaging device arrangement 210, the elevator system 110, or any other unit. The user interface unit 540 may comprise one or more input/output (I/O) devices, such as buttons, a keyboard, a touch screen, a microphone, a loudspeaker, a display and so on, for receiving user input and outputting information.
The computer program 525 may be a computer program product that may be comprised in a tangible nonvolatile (non-transitory) computer-readable medium bearing the computer program code 525 embodied therein for use with a computer, i.e. the control system 220.

The method and observation system 120 described above enable that possible emergency situations may be better avoided, and that it may be better ensured that the first object 131 and the second object 132 travel together throughout a journey along the environment 100, especially throughout an elevator journey. The specific examples provided in the description given above should not be construed as limiting the applicability and/or the interpretation of the appended claims. Lists and groups of examples provided in the description given above are not exhaustive unless otherwise explicitly stated.