
Title:
A SYSTEM TO PROVIDE SYNCHRONIZED LIGHTING AND EFFECTS FOR CINEMA AND HOME
Document Type and Number:
WIPO Patent Application WO/2021/074678
Kind Code:
A1
Abstract:
The present disclosure relates to a system to provide synchronized lighting and effects at a location, such as a cinema or home, based on a video being played at the location. The system includes a computing unit 102 to receive a video package having an audio file and a video file and convert it into a master video package having the audio file, the video file and an illumination and effects control file. The master video package is configured to be played on a cinema playback server or a home theatre. The location is provided with a display system 110 to display the video file, and a sound system 112 to play the audio file. The location further includes an illumination and effects control unit 114, which incorporates lighting rigs 116 and effects devices 118 to provide synchronized lighting and effects based on the video being played at the location.

Inventors:
GHOSE ANIRVAN (IN)
Application Number:
PCT/IB2019/060427
Publication Date:
April 22, 2021
Filing Date:
December 04, 2019
Assignee:
GHOSE ANIRVAN (IN)
International Classes:
H05B37/02
Foreign References:
US20190099668A12019-04-04
Attorney, Agent or Firm:
KHURANA & KHURANA, ADVOCATES & IP ATTORNEYS (IN)
Claims:
I Claim:

1. A system for providing synchronized illumination and effects at a location based on a digital video package, the system comprising: a computing unit comprising one or more processors configured to execute one or more instructions stored in a memory of the computing unit and configured to: receive a first digital video package to be displayed at a location, wherein the first digital video package comprises an audio file comprising a plurality of audio frames, and a video file comprising a plurality of video frames; extract at least one video frame from the plurality of video frames of the received video file; detect one or more visual attributes from the at least one extracted video frame; generate a first set of control signals based on the detected one or more visual attributes of the at least one video frame.

2. The system as claimed in claim 1, wherein one or more illumination and effects sources are positioned at predetermined positions at the location, and wherein the one or more illumination and effects sources are configured to receive the first set of control signals and provide synchronized illumination and effects at the location when the at least one video frame is displayed at the location.

3. The system as claimed in claim 1, wherein the one or more visual attributes of the at least one video frame comprise any or a combination of color of most mobile element, most dominant color, average value of colors, luminance value, and duration of luminance.

4. The system as claimed in claim 1, wherein the location is any or a combination of a cinema hall, home, theatre, auditorium, and open area theatre, and wherein the one or more illumination and effects sources comprise any or a combination of club lights, moving head lights, light beams, light bars, light strips, lasers, moving head lasers, strobe, UV light, audience blinder light, fog machine, and wind blaster.

5. The system as claimed in claim 1, wherein the computing unit is configured to detect at least one white video frame between two video frames of the video file and generate a corresponding second set of control signals, and wherein the one or more illumination and effects sources are configured to receive the second set of control signals and provide a white coloured illumination at the location for a first predetermined period of time.

6. The system as claimed in claim 1, wherein the system is configured to generate a third set of control signals when one or more elements present in the extracted at least one video frame remain unchanged and a fluctuation in a luminance value of the corresponding video frame is detected, and wherein the one or more illumination and effects sources are configured to receive the third set of control signals and provide a lightning effect at the location for a second predetermined period of time.

7. The system as claimed in claim 1, wherein the system is configured to extract at least one audio frame from the plurality of audio frames and detect one or more acoustical attributes from the extracted at least one audio frame, and wherein the one or more acoustical attributes comprise any or a combination of intensity, pitch, frequency, amplitude, beats per minute, and signal to noise ratio.

8. The system as claimed in claim 7, wherein the system is configured to detect an explosion sound in the at least one audio frame when the detected one or more acoustical attributes in the at least one audio frame exceed a predefined level for a predefined period of time.

9. The system as claimed in claim 8, wherein the system is configured to generate a fourth set of control signals when the system detects the explosion sound in the at least one audio frame and an average value of the most dominant color in the at least one video frame corresponding to the at least one audio frame is detected to be any or a combination of red and yellow, and wherein the one or more illumination and effects sources are configured to receive the fourth set of signals and provide a fire effect at the location.

10. The system as claimed in claim 1, wherein the computing unit is configured to generate an illumination and effects control file based on the generated first set of control signals, and wherein the system is configured to generate a second digital video package comprising the audio file, the video file and the generated illumination and effects control file.

Description:
A SYSTEM TO PROVIDE SYNCHRONIZED LIGHTING AND EFFECTS FOR

CINEMA AND HOME

TECHNICAL FIELD

[0001] The present disclosure relates to the field of cinema and home theatre system.

More particularly, the present disclosure relates to a system to provide synchronized lighting and effects at a location based on a video to be played at the location.

BACKGROUND

[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.

[0003] Technology has evolved over the years in cinema and home theatre environments to make the experience increasingly realistic. Advancement in this field has been primarily driven by Hollywood movies, and hence the emphasis has always been on technology that enhances action. Indian films, however, are made differently: they make extensive use of music and sound effects, and so the emphasis has been on technology that enhances the musical and visual experience.

[0004] In recent years, ambilight or backlight TV systems have become very popular. Such ambilight TV systems generate light based on incoming video signals such that a background light matching the video being shown is emitted on the wall behind the TV. The effect is a larger virtual screen and a more immersive viewing experience.

[0005] United States Patent Document US9483982 provides an apparatus and method for television backlighting. The apparatus of the cited document includes an HDMI splitter for splitting a single input HDMI signal into two like output HDMI signals. A video frame analyzer is communicatively coupled to one of the HDMI outputs and has a processor executing an instruction set for analyzing the HDMI signal. The analysis converts boundary video values of the HDMI signal into an LED illumination data signal and an LED light source output such that a background light matching the video being shown is emitted on the wall behind the TV, providing a larger virtual screen and a more immersive viewing experience.

[0006] However, even though the cited prior art document provides interesting features for further enhancing the ambient lighting experience when viewing video or images on a TV screen, it may be desirable to provide further improvements in cinema and home theatre systems to make the experience more realistic; for instance, by providing an enhanced music and visual experience, and by adding a lighting and effects system so that the experience can be delivered using projection, sound, lighting and effects on larger display screens and in larger environments instead of only projection and sound.

[0007] There is, therefore, a need in the art to develop a system to provide synchronized lighting and effects at locations such as a cinema, home, auditorium, etc., based on a video to be played at the locations.

OBJECTS OF THE PRESENT DISCLOSURE

[0008] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.

[0009] It is an object of the present disclosure to enhance cinema and home theatre experience.

[00010] It is an object of the present disclosure to provide a system to provide synchronized lighting and effects at locations such as a cinema, home, auditorium, etc., based on a video to be played at the locations.

[00011] It is an object of the present disclosure to provide a system that can convert a digital cinema package having picture and audio into a format having picture, audio and synchronized lighting and effects.

[00012] It is an object of the present disclosure to provide a system that can provide music concert like experience in cinemas and home theatres.

SUMMARY

[00013] The present disclosure relates to the field of cinema and home theatre system. More particularly, the present disclosure relates to a system to provide synchronized lighting and effects at a location based on a video to be played at the location.

[00014] An aspect of the present disclosure pertains to a system for providing synchronized illumination and effects at a location based on a digital video package, the system may comprise: a computing unit comprising one or more processors configured to execute one or more instructions stored in a memory of the computing unit and may be configured to receive a first digital video package to be displayed at a location, wherein the first digital video package may comprise an audio file comprising a plurality of audio frames, and a video file comprising a plurality of video frames; extract at least one video frame from the plurality of video frames of the received video file; detect one or more visual attributes from the at least one extracted video frame; and generate a first set of control signals based on the detected one or more visual attributes of the at least one video frame.

[00015] In an aspect, one or more illumination and effects sources may be positioned at predetermined positions at the location, and wherein the one or more illumination and effects sources may be configured to receive the first set of control signals and provide synchronized illumination and effects at the location when the at least one video frame is displayed at the location.

[00016] In another aspect, the one or more visual attributes of the at least one video frame may comprise any or a combination of color of most mobile element, most dominant color, average value of colors, luminance value, and duration of luminance.

[00017] In yet another aspect, the location may be any or a combination of a cinema hall, home, theatre, auditorium, and open area theatre, and wherein the one or more illumination and effects sources may comprise any or a combination of club lights, moving head lights, light beams, light bars, light strips, lasers, moving head lasers, strobe, UV light, audience blinder light, fog machine, and wind blaster.

[00018] In an aspect, the computing unit may be configured to detect at least one white video frame between two video frames of the video file and generate a corresponding second set of control signals, and wherein the one or more illumination and effects sources may be configured to receive the second set of control signals and provide a white colour illumination at the location for a first predetermined period of time.

[00019] In another aspect, the system may be configured to generate a third set of control signals when one or more elements present in the extracted at least one video frame remain unchanged and a fluctuation in a luminance value of the corresponding video frame is detected, and wherein the one or more illumination and effects sources may be configured to receive the third set of control signals and provide a lightning effect at the location for a second predetermined period of time.

[00020] In yet another aspect, the system may be configured to extract at least one audio frame from the plurality of audio frames and detect one or more acoustical attributes from the extracted at least one audio frame, and wherein the one or more acoustical attributes may comprise any or a combination of intensity, pitch, frequency, amplitude, beats per minute, and signal to noise ratio.

[00021] In an aspect, the system may be configured to detect an explosion sound in the at least one audio frame when the detected one or more acoustical attributes in the at least one audio frame exceed a predefined level for a predefined period of time.

[00022] In another aspect, the system may be configured to generate a fourth set of control signals when the system detects the explosion sound in the at least one audio frame and an average value of the most dominant color in the at least one video frame corresponding to the at least one audio frame is detected to be any or a combination of red and yellow, and wherein the one or more illumination and effects sources may be configured to receive the fourth set of signals and provide a fire effect at the location.

[00023] In yet another aspect, the computing unit may be configured to generate an illumination and effects control file based on the generated first set of control signals, and wherein the system is configured to generate a second digital video package comprising the audio file, the video file and the generated illumination and effects control file.

[00024] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

[00025] Within the scope of this application it is expressly envisaged that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.

BRIEF DESCRIPTION OF DRAWINGS

[00026] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.

[00027] FIG. 1 illustrates an exemplary architecture of the proposed system to provide synchronized lighting and effects for cinema and home, in accordance with an embodiment of the present disclosure.

[00028] FIG. 2 illustrates an exemplary process flow diagram for generating a control signal for lighting and effects system based on the video to be played, in accordance with an embodiment of the present disclosure.

[00029] FIG. 3 illustrates an exemplary process flow diagram for loading the digital video package on a playback system of a cinema using internet, in accordance with an embodiment of the present disclosure.

[00030] FIG. 4 illustrates another exemplary process flow diagram for loading a digital video package on a playback system of a cinema without internet, in accordance with an embodiment of the present disclosure.

[00031] FIG. 5 illustrates an exemplary block diagram showing the major components of a playback system of a cinema using internet, in accordance with an embodiment of the present disclosure.

[00032] FIG. 6 illustrates an exemplary block diagram showing the major components of a playback system of a cinema without internet, in accordance with an embodiment of the present disclosure.

[00033] FIG. 7 illustrates an exemplary block diagram showing the major components of a playback system of a home theatre setup, in accordance with an embodiment of the present disclosure.

[00034] FIG. 8 illustrates an exemplary block diagram showing the major components of a programming system installed in a post-production studio, in accordance with an embodiment of the present disclosure.

[00035] FIG. 9 illustrates an exemplary block diagram showing the major components of the lighting system of the proposed system, in accordance with an embodiment of the present disclosure.

[00036] FIG. 10 illustrates an exemplary illustration for identifying a white flash transition in a video frame by the proposed system, in accordance with an embodiment of the present disclosure.

[00037] FIG. 11 illustrates an exemplary illustration for identifying a lightning effect in a video frame by the proposed system, in accordance with an embodiment of the present disclosure.

[00038] FIG. 12 illustrates an exemplary illustration for identifying music and beats in an audio frame and a color scheme in a video frame by the proposed system, in accordance with an embodiment of the present disclosure.

[00039] FIG. 13 illustrates an exemplary illustration for identifying an explosion in an audio frame and generating a corresponding fire effect in cinema or home by the proposed system, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION

[00040] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.

[00041] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.

[00042] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.

[00043] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.

[00044] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.

[00045] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.

[00046] Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).

[00047] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).

[00048] The present disclosure relates to the field of cinema and home theatre system. More particularly, the present disclosure relates to a system to provide synchronized lighting and effects at a location based on a video to be played at the location.

[00049] An aspect of the present disclosure elaborates upon a system for providing synchronized illumination and effects at a location based on a digital video package, the system including: a computing unit including one or more processors configured to execute one or more instructions stored in a memory of the computing unit and can be configured to receive a first digital video package to be displayed at a location, wherein the first digital video package can include an audio file including a plurality of audio frames, and a video file including a plurality of video frames; extract at least one video frame from the plurality of video frames of the received video file; detect one or more visual attributes from the at least one extracted video frame; generate a first set of control signals based on the detected one or more visual attributes of the at least one video frame.

[00050] In an embodiment, one or more illumination and effects sources can be positioned at predetermined positions at the location, and wherein the one or more illumination and effects sources can be configured to receive the first set of control signals to provide synchronized illumination and effects at the location when the at least one video frame is displayed at the location.

[00051] In another embodiment, the one or more visual attributes of the at least one video frame can include any or a combination of color of most mobile element, most dominant color, average value of colors, luminance value, and duration of luminance.
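By way of a concrete illustration, the visual attributes listed above can be derived directly from pixel data. The following Python sketch is a minimal, hypothetical implementation (the disclosure does not prescribe one): a frame is modelled as a flat list of (R, G, B) tuples, and the helper names are illustrative only.

```python
from collections import Counter

# A frame is modelled as a flat list of (R, G, B) tuples, each channel 0-255.

def dominant_color(frame):
    """Most frequently occurring (R, G, B) value in the frame."""
    return Counter(frame).most_common(1)[0][0]

def average_color(frame):
    """Per-channel arithmetic mean over all pixels."""
    n = len(frame)
    return tuple(sum(px[c] for px in frame) / n for c in range(3))

def luminance(frame):
    """Mean Rec. 709 relative luminance, on the same 0-255 scale."""
    return sum(0.2126 * r + 0.7152 * g + 0.0722 * b
               for r, g, b in frame) / len(frame)

frame = [(255, 0, 0)] * 6 + [(0, 0, 255)] * 2   # a mostly red test frame
print(dominant_color(frame))   # (255, 0, 0)
```

A real system would of course operate on decoded video frames rather than hand-built pixel lists; the point is only that each attribute reduces to a simple aggregate over pixels.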

[00052] In yet another embodiment, the location can be any or a combination of a cinema hall, home, theatre, auditorium, and open area theatre, and wherein the one or more illumination and effects sources can include any or a combination of club lights, moving head lights, light beams, light bars, light strips, lasers, moving head lasers, strobe, UV light, audience blinder light, fog machine, and wind blaster.

[00053] In an embodiment, the computing unit can be configured to detect at least one white video frame between two video frames of the video file and generate a corresponding second set of control signals, and wherein the one or more illumination and effects sources can be configured to receive the second set of control signals and provide a white colour illumination at the location for a first predetermined period of time.
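One plausible realisation of this white-frame detection is sketched below, assuming frames are flat lists of (R, G, B) tuples and that "white" means every channel mean exceeds a chosen threshold. The threshold value is an assumption, not taken from the disclosure.

```python
WHITE_THRESHOLD = 240  # assumed cutoff on a 0-255 scale

def is_white_frame(frame, threshold=WHITE_THRESHOLD):
    """True when the mean of every colour channel meets the threshold."""
    n = len(frame)
    return all(sum(px[c] for px in frame) / n >= threshold for c in range(3))

def white_flash_indices(frames):
    """Indices of white frames sandwiched between two non-white frames,
    i.e. the points at which the second set of control signals would fire."""
    return [i for i in range(1, len(frames) - 1)
            if is_white_frame(frames[i])
            and not is_white_frame(frames[i - 1])
            and not is_white_frame(frames[i + 1])]
```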

[00054] In another embodiment, the system can be configured to generate a third set of control signals when one or more elements present in the extracted at least one video frame remain unchanged and a fluctuation in a luminance value of the corresponding video frame is detected, and wherein the one or more illumination and effects sources can be configured to receive the third set of control signals and provide a lightning effect at the location for a second predetermined period of time.
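A minimal heuristic for this lightning condition might compare two frames after rescaling them to a common brightness: if the overall luminance jumps but the rescaled content barely differs, the scene is deemed unchanged apart from a flash. The threshold values below are illustrative assumptions.

```python
def mean_luma(frame):
    """Mean Rec. 709 luminance of a frame of (R, G, B) tuples."""
    return sum(0.2126 * r + 0.7152 * g + 0.0722 * b
               for r, g, b in frame) / len(frame)

def is_lightning(prev, curr, luma_jump=60.0, scene_tol=12.0):
    """Lightning heuristic: overall luminance jumps sharply while the
    underlying scene, once both frames are rescaled to the same mean
    brightness, stays essentially unchanged."""
    lp, lc = mean_luma(prev), mean_luma(curr)
    if abs(lc - lp) < luma_jump:
        return False                      # no flash-sized brightness change
    scale = lp / lc if lc else 1.0        # rescale curr to prev's brightness
    diff = sum(abs(a - b * scale)
               for pa, pb in zip(prev, curr)
               for a, b in zip(pa, pb)) / (3 * len(prev))
    return diff <= scene_tol              # scene content essentially static
```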

[00055] In yet another embodiment, the system can be configured to extract at least one audio frame from the plurality of audio frames and detect one or more acoustical attributes from the extracted at least one audio frame, and wherein the one or more acoustical attributes can include any or a combination of intensity, pitch, frequency, amplitude, beats per minute, and signal to noise ratio.
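Two of these acoustical attributes, intensity and (roughly) frequency, can be estimated from a mono sample window with elementary signal processing. The sketch below is illustrative only; a production system would likely use proper spectral analysis.

```python
import math

def rms_intensity(samples):
    """Root-mean-square amplitude of a mono sample window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_freq(samples, sample_rate):
    """Crude fundamental-frequency estimate from sign changes: a pure
    tone crosses zero twice per cycle."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    duration = len(samples) / sample_rate
    return crossings / (2.0 * duration)

# One second of a 440 Hz sine at an 8 kHz sample rate.
tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
```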

[00056] In an embodiment, the system can be configured to detect an explosion sound in the at least one audio frame when the detected one or more acoustical attributes in the at least one audio frame exceed a predefined level for a predefined period of time.
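This sustained-level test can be sketched as a simple run-length check over per-window intensity values; the level and window count below are assumed placeholders for the "predefined level" and "predefined period of time".

```python
def detect_explosion(intensities, level=0.8, min_windows=3):
    """True once the per-window intensity stays at or above `level` for
    at least `min_windows` consecutive analysis windows."""
    run = 0
    for x in intensities:
        run = run + 1 if x >= level else 0   # extend or reset the run
        if run >= min_windows:
            return True
    return False
```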

[00057] In another embodiment, the system can be configured to generate a fourth set of control signals when the system detects the explosion sound in the at least one audio frame and an average value of the most dominant color in the at least one video frame corresponding to the at least one audio frame is detected to be any or a combination of red and yellow, and wherein the one or more illumination and effects sources can be configured to receive the fourth set of signals and provide a fire effect at the location.
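Combining the audio and video cues, the fire-effect trigger might look like the following sketch. The RGB thresholds used to classify a dominant colour as red or yellow are assumptions, and the returned signal format is purely illustrative.

```python
def is_red_or_yellow(rgb):
    """Illustrative classification of a dominant frame colour."""
    r, g, b = rgb
    red = r > 150 and g < 100 and b < 100
    yellow = r > 150 and g > 150 and b < 100
    return red or yellow

def fire_effect_signal(explosion_detected, dominant_rgb):
    """Fourth set of control signals: trigger the fire effect only when
    an explosion sound coincides with a red or yellow dominant colour."""
    if explosion_detected and is_red_or_yellow(dominant_rgb):
        return {"effect": "fire", "intensity": 1.0}
    return None
```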

[00058] In yet another embodiment, the computing unit can be configured to generate an illumination and effects control file based on the generated first set of control signals, and wherein the system is configured to generate a second digital video package including the audio file, the video file and the generated illumination and effects control file.

[00059] FIG. 1 illustrates an exemplary architecture of the proposed system to provide synchronized lighting and effects for cinema and home, in accordance with an embodiment of the present disclosure.

[0060] As illustrated, in an embodiment, the proposed system can include a computing unit 102 configured to receive a first digital video package (also referred to as a first digital cinema package, digital video package, or DLEP, herein). The first digital video package can include an audio file having a plurality of audio frames, and a video file having a plurality of video frames. The audio frames and the video frames are configured to be played together in a cinema and/or home theatre system to provide an audio-visual experience to users at a location. In an exemplary embodiment, the location can be any or a combination of a cinema hall, home, theatre, auditorium, and open area theatre, but is not limited to the likes.

[0061] In an embodiment, the computing unit 102 can be configured to analyze and process the received first digital video package to generate a first set of control signals to provide synchronized lighting and effects at the location along with the visual and audio experience to the users at the location.

[00062] In an embodiment, the computing unit 102 can be configured to generate an illumination and effects control file based on the generated first set of control signals. The system can be configured to generate a second video package (also referred to as master DLEP) including the audio file, the video file and the generated illumination and effects control file.
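One conceivable serialization of such an illumination and effects control file is a timecoded cue list, bundled with the audio and video files into a manifest for the master DLEP. The JSON layout, field names, and file names below are hypothetical, chosen only to make the packaging step concrete.

```python
import json

def build_control_file(cues):
    """Serialize timecoded cues to JSON.  Each cue is a
    (seconds, fixture_or_effect_id, parameter_dict) tuple."""
    return json.dumps(
        {"version": 1,
         "cues": [{"t": t, "target": tgt, "params": p}
                  for t, tgt, p in sorted(cues, key=lambda c: c[0])]},
        indent=2)

def build_master_package(audio_file, video_file, control_json):
    """Manifest for the master DLEP, referencing its three constituents."""
    return {"audio": audio_file, "video": video_file,
            "control": control_json}

cues = [(12.5, "rig-3", {"color": [255, 0, 0], "dim": 80}),
        (3.0, "fog-1", {"on": True})]
package = build_master_package("film.wav", "film.mxf",
                               build_control_file(cues))
```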

[00063] In an embodiment, the computing unit 102 can include one or more processor(s) 104. The one or more processor(s) 104 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 104 can be configured to fetch and execute computer-readable instructions stored in a memory 108 of the computing unit 102. The memory 108 can store one or more computer-readable instructions or routines, which can be fetched and executed to create or share the data units over a network service. The memory 108 can be any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.

[0064] The computing unit 102 can include an interface(s) 106. The interface(s) 106 can include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 106 can facilitate communication of the computing unit 102 with various devices coupled to the computing unit 102, such as an input unit and an output unit. The interface(s) 106 can also provide a communication pathway for one or more components of the computing unit 102 and the proposed system 100.

[0065] In an embodiment, the location can include a display system 110 to provide a visual experience to the users at the location. In an exemplary embodiment, the display system 110 for a cinema can include an assembly of a projector and a screen positioned at a suitable position and angle to provide an optimum visual experience to the users. In another exemplary embodiment, the display system 110 for the cinema can include one or more displays positioned at a suitable position and angle to provide an optimum visual experience to the users. The one or more displays can be any or a combination of LED, LCD, OLED, AMOLED, and TV displays, but are not limited to the likes.

[0066] In an exemplary embodiment, the display system 110 for a home can include an assembly of a projector and a screen positioned at a suitable position and angle to provide an optimum visual experience to the users. In another exemplary embodiment, the display system 110 for the home can include one or more displays positioned at a suitable position and angle to provide an optimum visual experience to the users. The one or more displays can be any or a combination of LED, LCD, OLED, AMOLED, and TV displays, but are not limited to the likes.

[0067] In an embodiment, the location can include a sound system 112 to provide an audio experience to users at the location. The sound system 112 can include an audio processor and a set of amplifiers to enhance the sound quality of the audio file/audio track to a desired optimum level. The sound system 112 can include one or more speakers at desired positions to provide the optimum audio experience to the users at the location.

[0068] In an embodiment, the location can include an illumination and effects control unit 114 (also referred to as lighting and effects control unit 114 or a control unit 114, herein) to provide synchronized lighting and effects at the location. The control unit 114 can include one or more illumination sources 116 (also referred to as lighting rigs 116, herein) positioned at a desired angle and position at the location to provide a synchronized lighting experience based on the video being played at the location upon receiving the first set of control signals. In an exemplary embodiment, the one or more illumination sources 116 can include any or a combination of club lights, moving head lights, light beams, light bars, light strips, lasers, moving head lasers, strobe, UV light, and audience blinder light, but are not limited to the likes.

[0069] In an embodiment, the control unit 114 can include one or more effects devices 118 positioned at a desired angle and position at the location to provide synchronized effects based on the video being played at the location upon receiving the first set of control signals. In an exemplary embodiment, the one or more effects devices 118 can be any or a combination of a fog machine and a wind blaster, but are not limited to the likes.

[00070] In an embodiment, the control unit 114 can receive the first set of control signals from the illumination and effects control file of the second digital video package.

[00071] In an embodiment, the control unit 114 can include other devices 122, which can include any or a combination of a communication module and a power management unit, but not limited to the likes. The communication module can be a Wi-Fi module, a serial data transfer module, or another wired or wireless module to communicatively couple the system to the lighting rigs, effects devices, display system, sound system, and other components of the playback system/server and home theatre system.

[00072] In an embodiment, the control unit 114 can include one or more sensors 120 (also referred to as sensors 120, herein) to facilitate calibration of position and distance between the screen (also referred to as prime viewing area) and the lighting rigs 116 at the location to maintain a predetermined ratio between an illumination level of the screen and the illumination of the lighting rigs 116.

[00073] The lighting distance between the prime viewing area (PVA) and the lighting rigs 116 can change from location to location (or cinema to cinema) as per the dimensions of the location, which changes the light intensity from cinema to cinema. To achieve the desired effect as in the first digital video package, a calibration system can be used. The calibration system can include calibration tables for each individual light. The lighting rigs 116 can be fitted with the sensors 120 on the opposite walls. The calibration system can point the lights at the sensors and fire each primary color at a first intensity value (for instance, 100). The sensors 120 are expected to receive the light at the same first intensity value of 100; however, due to the change in distance, the received intensity value can differ. The offset between the actual received second intensity value and the expected intensity value can be keyed automatically into the offset table for the particular light. This operation is carried out for every lighting rig 116.
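The offset-keying step described above can be sketched as follows; the function names, the per-color loop, and the simulated distance-dependent sensor read-back are illustrative assumptions, not part of the disclosure:

```python
def build_offset_table(rigs, fire_and_measure, expected=100):
    """Fire each primary color at the expected intensity for every rig and
    key the offset between expected and received values into a table."""
    offsets = {}
    for rig in rigs:
        offsets[rig] = {}
        for color in ("red", "green", "blue"):
            received = fire_and_measure(rig, color, expected)
            # Offset keyed automatically into the table for this light.
            offsets[rig][color] = expected - received
    return offsets

# Hypothetical example: a simulated sensor whose reading falls off
# with the distance of each rig from its opposite-wall sensor.
distances = {"rig1": 30, "rig2": 60}

def simulated(rig, color, level):
    return level - distances[rig] // 10

table = build_offset_table(distances, simulated)
```

Here `table["rig1"]["red"]` holds the per-light, per-color correction applied during playback.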

[00074] In an implementation, the sensors 120 can be set up at the back wall of the screen, where they can measure the reflected light level of the screen. During this operation, a white frame can be projected by the projector. The ratio between the screen illumination and the overall light level of the lighting rigs 116 (measured with all lights fired with white color at level 20) is referred to as the ASI ratio. This ASI ratio can be maintained at a particular value to ensure proper picture quality on the screen.
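A minimal sketch of the ASI-ratio computation, assuming lux readings are available from the back-wall sensor and the rig sensors; the function names and the tolerance value are assumptions for illustration:

```python
def asi_ratio(screen_lux, rig_lux_readings):
    """Ratio between screen illumination (white frame projected) and the
    overall light level of the rigs (all fired white at level 20)."""
    overall = sum(rig_lux_readings)
    return screen_lux / overall

def within_target(screen_lux, rig_lux_readings, target, tolerance=0.05):
    # The ASI ratio is held near a target value to preserve
    # picture quality on the screen; tolerance is illustrative.
    return abs(asi_ratio(screen_lux, rig_lux_readings) - target) <= tolerance
```

If the measured ratio drifts outside the tolerance, the overall rig intensity would be readjusted as described above.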

[00075] In an embodiment, the computing unit 102 can be any or a combination of a computer, a microcontroller, and a cloud-based server, but not limited to the likes.

[00076] FIG. 2 illustrates an exemplary process flow diagram for generating a control signal for lighting and effects system based on the video to be played, in accordance with an embodiment of the present disclosure.

[00077] As illustrated, in an embodiment, the process for generating a control signal (also referred to as the first set of control signals, herein) for synchronized controlling of the lighting and effects system based on the video to be played can include a step 202 of receiving a video file having a plurality of video frames from a first digital video package.

[00078] In an embodiment, the process can include a step 204 of extracting at least one video frame from the plurality of video frames received at the step 202.

[00079] In an embodiment, the process can include a step 206 of detecting one or more visual attributes from the at least one frame extracted at the step 204. The one or more visual attributes of the at least one video frame can be any or a combination of the color of the most mobile element, the most dominant color, an average value of colors, a luminance value, and a duration of luminance.

[00080] In an embodiment, the process can include a step 208 of generating a first set of control signals based on the one or more visual attributes of the at least one video frame detected at the step 206.
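The steps 202 to 208 above can be sketched as a simple frame-analysis pipeline. The helper names, the pixel representation (a flat list of (R, G, B) tuples per frame), and the choice of attributes are illustrative assumptions, not part of the disclosure:

```python
def dominant_color(frame):
    """Most frequent (R, G, B) value in the frame (one step-206 attribute)."""
    counts = {}
    for pixel in frame:
        counts[pixel] = counts.get(pixel, 0) + 1
    return max(counts, key=counts.get)

def luminance(pixel):
    # Rec. 601 luma approximation of a pixel's brightness.
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def generate_control_signals(video_frames):
    """Steps 202-208: walk the extracted frames, detect attributes,
    and emit one control signal per analysed frame."""
    signals = []
    for index, frame in enumerate(video_frames):            # step 204
        color = dominant_color(frame)                       # step 206
        avg_luma = sum(luminance(p) for p in frame) / len(frame)
        signals.append({"frame": index, "color": color,     # step 208
                        "luma": avg_luma})
    return signals

frames = [[(255, 0, 0), (255, 0, 0), (0, 0, 255)]]
signals = generate_control_signals(frames)
```

Each emitted signal dictionary stands in for one entry of the first set of control signals.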

[00081] In an embodiment, the computing unit 102 can be configured to generate an illumination and effects control file based on the first set of control signals generated at the step 208. The computing unit 102 can be configured to generate the second digital video package, which can include the audio file, the video file, and the generated illumination and effects control file.

[00082] FIG. 3 illustrates an exemplary process flow diagram for loading the digital video package on a playback system of a cinema using internet, in accordance with an embodiment of the present disclosure.

[00083] As illustrated, in an embodiment, the cinema can have access to high speed internet connection with a dedicated IP address. The first digital video package (also referred to as DLEP, herein) can include the audio file and the video (set of image) files. The computing unit 102 can be configured to generate a second digital video package (also referred to as master DLEP, herein). The master DLEP can include the audio file, the video file, and the illumination and effects control files for the lighting rigs and effects devices. The master DLEP can be loaded on the playback server (also referred to as playback system, herein) of the cinema via a central server over the internet.

[00084] In an embodiment, a time code with a particular serial number for the master DLEP can be inserted in the DLEP on one of the audio tracks. The master DLEP can be loaded on the playback system installed in the cinema. The playback system can output the illumination and effects control file to the lighting rigs and effects devices by chasing the time code information played back by the existing playback server. The master DLEP can be transferred via the internet.

[00085] In an embodiment, the programming of the illumination and effects control file to generate the second digital video package (master DLEP) can be done by a person skilled in the art as per the effects required by the filmmaker. The programming of the master DLEP can be done at a suitable place such as a setup studio, but not limited to the likes.

[00086] In an embodiment, in the home environment, the playback device derives its time code from the existing set top box or playback device.

[00087] FIG. 4 illustrates another exemplary process flow diagram for loading a digital video package on a playback system of a cinema without internet, in accordance with an embodiment of the present disclosure.

[00088] As illustrated, in an embodiment, the master DLEP can be loaded on the playback system of the cinema via the first digital video package or digital cinema package by adding the additional illumination and effects control files to the first digital video package. The key to open these additional files can be distributed separately. In this method, the illumination and effects control file can be put on the DLEP as additional files.

[00089] FIG. 5 illustrates an exemplary block diagram showing the major components of a playback system of a cinema using internet, in accordance with an embodiment of the present disclosure.

[00090] As illustrated, in an embodiment, the playback system of the cinema can have the master DLEP loaded. The video file can be fed to the projector from the playback server via an encrypted link. The server can have 16 AES/EBU digital audio outputs. Channels 1 to 6 can be used for the 5.1 audio. One of the channels can contain the time code information and the serial ID of the master DLEP. This time code data can be fed to the playback server. The playback server can contain the master DLEP, which can have the illumination and effects control files as per the time code. The playback server can chase this time code and can feed the illumination and effects control file information to the lighting rigs and effects devices.

[00091] FIG. 6 illustrates an exemplary block diagram showing the major components of a playback system of a cinema without internet, in accordance with an embodiment of the present disclosure.

[00092] As illustrated, in an embodiment, the playback server can connect to the cinema server on the network. The cinema server can provide the illumination and effects control files data to the playback server in synchronization with the content being played back on the screen or display. The playback server can convert the illumination and effects control files data into the same format as required by the lighting rigs 116 and the effects devices 118 and can feed the same to the lighting rigs 116 and the effects devices 118.

[00093] FIG. 7 illustrates an exemplary block diagram showing the major components of a playback system of a home theatre setup, in accordance with an embodiment of the present disclosure.

[00094] As illustrated, the second digital video package can be received from the HDMI output of the playback device/server or the set top box (STB). The playback server can download the second digital video package (master DLEP) from the central server. This master DLEP can be played in synchronization with the content being played back, and the illumination and effects control file information can be fed to the lighting rigs 116 and the effects devices 118.

[00095] FIG. 8 illustrates an exemplary block diagram showing the major components of a programming system installed in a post-production studio, in accordance with an embodiment of the present disclosure.

[00096] As illustrated, the system can be installed either in an audio post-production studio or a picture color correction studio, commonly referred to as a DI studio. The color correction or audio workstation can connect to a projector to display the video and to an audio cinema processor, which in turn feeds the amplifiers and speakers to play back the audio. The programming workstation of this invention connects to either of these workstations and can derive its time code synchronization information from it. The Dialogue, Song, Background music, and Effects stems of the audio soundtrack can be fed to this workstation. The video file can also be fed to this workstation. The lighting rigs installed in the studio can also be connected to this workstation. The programming of the illumination and effects control file can be carried out based on the video and audio files fed to the system. The system can allow the video file and the audio file to be defined, and can crawl through the content and identify particular picture or audio patterns. When these patterns are identified, certain predefined sequences of lights or effects can be activated. The programming of the illumination and effects control file for the entire duration of the content can be carried out in this manner. Once the entire programming is completed, it can be converted, with reference to a time code track, into a second digital video package (master DLEP). The same time code track can be inserted in the audio track of the audio file. Hence, the time code track can synchronize the master DLEP with the content being played. Each master DLEP can have a unique serial number in the time code track, which helps identify a particular master DLEP when a number of them are loaded on the playback server together. The playback server can match the content serial number of the incoming time code from the cinema server and play the corresponding master DLEP.

[00097] FIG. 9 illustrates an exemplary block diagram showing the major components of the lighting system of the proposed system, in accordance with an embodiment of the present disclosure.

[00098] As illustrated, the lighting rigs can comprise different types of lights installed in a fixed layout. This layout remains constant for every setup irrespective of the location, such as a cinema or home. However, the number of lighting rigs can differ as per the size of the location. The sizes can be classified into three broad categories: approximately 30 ft, approximately 60 ft, and approximately 90 ft in length from screen to back wall. Different combinations of lights with different powers are configured as per the size category. For example, the 30 ft size has 4 moving heads on the side wall; this becomes 8 for 60 ft and 12 for 90 ft. In the 30 ft setup, each of the 4 lights moves individually. However, in the 60 ft setup, the lights move in groups of 2, and in the 90 ft setup, the lights move in groups of 3.
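The size-category configuration described above can be sketched as a lookup table; the dictionary layout and function name are illustrative, not part of the disclosure:

```python
# Side-wall moving-head counts and movement grouping per size category,
# as described for the 30 ft, 60 ft, and 90 ft setups.
SIZE_CATEGORIES = {
    30: {"moving_heads": 4, "group_size": 1},
    60: {"moving_heads": 8, "group_size": 2},
    90: {"moving_heads": 12, "group_size": 3},
}

def movement_groups(size_ft):
    """Number of independently moving light groups for a given room size."""
    cfg = SIZE_CATEGORIES[size_ft]
    return cfg["moving_heads"] // cfg["group_size"]
```

Note that all three categories resolve to the same number of independently moving groups, so programmed movement cues can stay identical across room sizes.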

[00099] The sensors can be fitted on the opposite walls to measure the intensity of the lights. There are sensors fitted on the back wall to measure the intensity of the projector. A calibration system can calibrate the intensity of each light to achieve standardization across different locations. The overall intensity can then be readjusted with respect to the illumination of the projector by projecting a white frame and measuring the light output. This can ensure that the correct ratio is maintained between projection and lights. Effects devices like fog machines and wind blasters can also be installed. This ensures that the light beams are visible; otherwise, the light is normally visible only on the surface on which it falls. Once the path contains some fog, the entire beam becomes visible.

[000100] FIG. 10 illustrates an exemplary illustration for identifying a white flash transition in a video frame by the proposed system, in accordance with an embodiment of the present disclosure.

[000101] As illustrated, in an embodiment, the computing unit 102 can be configured to detect at least one white video frame between two video frames of the video file and generate a corresponding second set of control signals based on the detected at least one white frame. The lighting rigs 116 and the effects devices 118 can be configured to receive the second set of control signals from the computing unit 102 and provide a white coloured illumination at cinema or home for a first predetermined period of time.
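A minimal sketch of this white-flash detection, assuming frames are flat lists of (R, G, B) tuples and using an illustrative near-white threshold (both assumptions, not part of the disclosure):

```python
WHITE_THRESHOLD = 250  # assumed near-white channel level (illustrative)

def is_white_frame(frame):
    """True when every pixel channel is close to pure white."""
    return all(min(pixel) >= WHITE_THRESHOLD for pixel in frame)

def detect_white_flashes(frames):
    """Indices of white frames appearing between two non-white frames,
    i.e. candidate flash transitions triggering the second set of signals."""
    hits = []
    for i in range(1, len(frames) - 1):
        if (is_white_frame(frames[i])
                and not is_white_frame(frames[i - 1])
                and not is_white_frame(frames[i + 1])):
            hits.append(i)
    return hits

clip = [[(0, 0, 0)], [(255, 255, 255)], [(10, 10, 10)]]
flashes = detect_white_flashes(clip)
```

Each detected index would cue the rigs to fire white illumination for the first predetermined period of time.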

[000102] FIG. 11 illustrates an exemplary illustration for identifying a lightning effect in a video frame by the proposed system, in accordance with an embodiment of the present disclosure.

[000103] As illustrated, in an embodiment, the system can be configured to generate a third set of control signals when one or more elements present in the extracted at least one video frame remain unchanged and a fluctuation in a luminance value of the corresponding video frame is detected. The lighting rigs 116 and the effects devices 118 can be configured to receive the third set of control signals and provide a lightning effect at the cinema or home for a second predetermined period of time.
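One way to sketch this lightning cue, assuming per-frame luminance values and a precomputed hash of each frame's scene elements are available as inputs (both hypothetical), with an illustrative jump threshold:

```python
def detect_lightning(luma_values, element_hashes, jump=80):
    """Flag frames where scene content is unchanged (same element hash as
    the previous frame) but luminance fluctuates sharply - a lightning cue
    that would trigger the third set of control signals."""
    cues = []
    for i in range(1, len(luma_values)):
        unchanged = element_hashes[i] == element_hashes[i - 1]
        fluctuation = abs(luma_values[i] - luma_values[i - 1]) >= jump
        if unchanged and fluctuation:
            cues.append(i)
    return cues

cues = detect_lightning([20, 120, 30, 30], ["a", "a", "a", "b"])
```

The unchanged-elements condition distinguishes a lightning flash from an ordinary scene cut, where luminance also jumps but the content changes.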

[000104] FIG. 12 illustrates an exemplary illustration for identifying music and beats in an audio frame and a color scheme in a video frame by the proposed system, in accordance with an embodiment of the present disclosure.

[000105] As illustrated, in an embodiment, the system can be configured to extract at least one audio frame from the plurality of audio frames and detect one or more acoustical attributes from the extracted at least one audio frame. The one or more acoustical attributes can include any or a combination of intensity, pitch, frequency, amplitude, beats per minute, and signal-to-noise ratio.

[000106] In an embodiment, the computing unit 102 can identify the music and the beats along with the color scheme for the lighting rigs 116. The system can detect the presence of music in the music track along with the beats at regular intervals. From the time period between two beats, it derives the beats per minute. The computing unit 102 can then identify a color scheme from the picture. Based on the identified beats per minute and the color scheme, the lighting pattern of the lighting rigs 116 can be triggered.
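The beats-per-minute derivation described above can be sketched as follows; the beat timestamps are an assumed input from an upstream beat detector, not part of the disclosure:

```python
def beats_per_minute(beat_times_s):
    """Derive BPM from the average time period between consecutive beats,
    given beat onset times in seconds."""
    if len(beat_times_s) < 2:
        return 0.0
    intervals = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    average_interval = sum(intervals) / len(intervals)
    return 60.0 / average_interval
```

The resulting BPM, combined with the color scheme detected from the picture, would set the tempo of the triggered lighting pattern.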

[000107] FIG. 13 illustrates an exemplary illustration for identifying an explosion in an audio frame and generating a corresponding fire effect in cinema or home by the proposed system, in accordance with an embodiment of the present disclosure.

[000108] As illustrated, in an embodiment, the system can be configured to detect an explosion sound in the at least one audio frame when the detected one or more acoustical attributes in the at least one audio frame exceed a predefined level for a predefined period of time. The system can be configured to generate a fourth set of control signals when the system detects the explosion sound in the at least one audio frame and an average value of the most dominant color in the at least one video frame corresponding to the at least one audio frame is detected to be any or a combination of red and yellow. The lighting rigs 116 and the effects devices 118 can be configured to receive the fourth set of control signals and provide a fire effect at the cinema or home.
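A minimal sketch of this explosion cue, with the amplitude threshold, the consecutive-frame count, and the color labels as illustrative assumptions:

```python
def detect_explosion(amplitudes, dominant_color, level=0.8, duration=3):
    """Explosion cue for the fourth set of control signals: amplitude
    exceeds `level` for `duration` consecutive audio frames while the
    dominant picture color is red or yellow."""
    if dominant_color not in ("red", "yellow"):
        return False
    run = 0
    for amplitude in amplitudes:
        run = run + 1 if amplitude > level else 0
        if run >= duration:
            return True
    return False
```

Requiring a sustained run of loud frames, rather than a single peak, keeps brief transients such as door slams from triggering the fire effect.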

[000109] In an embodiment, the computing unit can be configured to generate an illumination and effects control file based on the first set of control signals, the second set of control signals, the third set of control signals, and the fourth set of control signals. The system can be configured to generate a second digital video package comprising the audio file, the video file, and the generated illumination and effects control file.

[000110] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE INVENTION

[000111] The proposed disclosure provides a system to enhance the cinema and home theatre experience.

[000112] The proposed disclosure provides a system to provide synchronized lighting and effects at locations such as a cinema, home, auditorium, etc., based on a video to be played at the locations.

[000113] The proposed disclosure provides a system that can convert a digital cinema package having picture and audio into a format having picture, audio, and synchronized lighting and effects.

[000114] The proposed disclosure provides a system that can provide music concert like experience in cinemas and home theatres.