
A System And Method For Controlling The Condition Of Air

Abstract: The present disclosure relates to a technique of controlling the temperature of an environment. The technique includes detecting temperature values inside and outside the environment. The technique further includes capturing audio data and video data pertaining to users of the environment. The technique further includes processing the audio data and video data to determine at least one of a frequency of voice of the at least one user, a walking speed of the at least one user, a clothing type of the at least one user and a body language of the at least one user. The technique furthermore includes changing the temperature value inside the environment based on at least one of the frequency of voice of the at least one user, the walking speed of the at least one user, the clothing type of the at least one user, the body language of the at least one user and the temperature value outside the environment. [Fig. 3]


Patent Information

Application #
Filing Date
31 December 2018
Publication Number
50/2020
Publication Type
INA
Invention Field
COMMUNICATION
Status
Email
ipo@knspartners.com
Parent Application
Patent Number
Legal Status
Grant Date
2023-11-29
Renewal Date

Applicants

ZENSAR TECHNOLOGIES LIMITED
ZENSAR KNOWLEDGE PARK, PLOT # 4, MIDC, KHARADI, OFF NAGAR ROAD, PUNE-411014, MAHARASHTRA, INDIA

Inventors

1. KULKARNI, Sumant
T-307, Nammane Apartments, Judicial Layout Main Road, Talaghattapura, Bangalore -560062, Karnataka, India

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION (See section 10, rule 13)
“A SYSTEM AND METHOD FOR CONTROLLING THE CONDITION OF AIR”
ZENSAR TECHNOLOGIES LIMITED of Plot#4 Zensar Knowledge Park, MIDC, Kharadi, Off Nagar Road, Pune, Maharashtra – 411014, India
The following specification particularly describes the invention and the manner in which it
is to be performed.

CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[0001] The present application claims priority from Indian Provisional Patent Application No.
201821050041 filed on 31st December 2018, the entirety of which is incorporated herein by reference.
TECHNICAL FIELD
[0002] The present disclosure generally relates to the field of controlling an environment condition. More particularly, but not exclusively, the present disclosure describes a system and a method for controlling temperature of the environment.
BACKGROUND
[0003] Nowadays, air-conditioning has become very common. Often, we use it simply because it is already on. Most of the time, we do not change the temperature unless an explicit complaint is made (“it is too cold”, or “it is too hot”). In summer, the set temperature of an air conditioner (AC) may be high and in winter it might be low. In day-to-day life too, based on the number of people in a room and the weather outside, we may have to change the temperature frequently.
[0004] People show implicit signs of discomfort before they complain about the AC temperature. These implicit signs are indicative of a need to change the temperature. However, conventional techniques are unable to take these implicit signs of discomfort into account when changing the temperature.
[0005] Therefore, there exists a need in the art for a technique that understands these implicit signs of discomfort and changes the temperature accordingly.
SUMMARY
[0006] The present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages discussed throughout the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.

[0007] In one non-limiting embodiment of the present disclosure, a method for controlling temperature of an environment is provided. The method comprises a step of detecting, by at least one temperature detecting unit, a first set of parameters comprising at least a first temperature value and a second temperature value, wherein the first temperature value represents a temperature value inside the environment and the second temperature value represents a temperature value outside the environment. The method further comprises a step of capturing, by at least one media capturing unit, at least one of audio data and video data pertaining to at least one user in the environment. In the next step, at least one processing unit processes the at least one of captured audio data and video data to determine a second set of parameters, wherein the second set of parameters comprises at least one of frequency of voice of the at least one user, walking speed of the at least one user, clothing type of the at least one user and body language of the at least one user. The method further comprises a step of classifying, by the at least one processing unit, at least the second set of parameters into different classes. Further, the method comprises mapping, by the at least one processing unit, at least the first set of parameters and the classified second set of parameters with one or more pre-classified parameters for assigning a score to each of the at least one of the first set of parameters and the classified second set of parameters. Lastly, the method describes modifying, by the at least one processing unit, the first temperature value based on the assigned scores for controlling the temperature of the environment.
[0008] In another non-limiting embodiment of the present disclosure, the step of classifying at least the second set of parameters into different classes comprises classifying the frequency of voice of the at least one user into a below normal frequency class, a normal frequency class and an above normal frequency class; classifying the walking speed of the at least one user into a below normal speed class, a normal speed class and an above normal speed class; classifying the type of clothing of the at least one user into a hot clothes class, a normal clothes class and a cold clothes class; and classifying the body language of the at least one user into a normal body language class and an abnormal body language class.
[0009] In yet another non-limiting embodiment of the present disclosure, the method further comprises averaging the scores assigned to each of the at least one of the first set of parameters and the classified second set of parameters to obtain an average score.
[0010] In yet another non-limiting embodiment of the present disclosure, the step of modifying the first temperature value based on the assigned scores comprises modifying the first temperature value based on the average score, wherein the first temperature value is decreased if the average score is positive, the first temperature value is increased if the average score is negative, and the first temperature value remains unchanged if the average score is zero.
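The rule above can be pictured with a minimal sketch, assuming an average score has already been computed; the one-degree step size and the function name update_setpoint are illustrative assumptions and not part of the disclosure.

```python
def update_setpoint(current_setpoint: float, average_score: float, step: float = 1.0) -> float:
    """Apply the setpoint rule of this embodiment to the inside temperature."""
    if average_score > 0:       # positive score: users appear to find the environment hot
        return current_setpoint - step
    if average_score < 0:       # negative score: users appear to find the environment cold
        return current_setpoint + step
    return current_setpoint     # zero score: setpoint unchanged
```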
[0011] In yet another non-limiting embodiment of the present disclosure, the step of analysing the postures of the user comprises determining duration of the one or more postures and generating the one or more questions related to at least one task based on the duration of the one or more postures.
[0012] In yet another non-limiting embodiment of the present disclosure, the step of analysing the at least one user activity further comprises generating a request for information from the user to generate the one or more questions after a predefined time period.
[0013] In yet another non-limiting embodiment of the present disclosure, a system for controlling temperature of an environment is provided. The system comprises at least one temperature detecting unit configured to detect a first set of parameters comprising at least a first temperature value and a second temperature value, wherein the first temperature value represents a temperature value inside the environment and the second temperature value represents a temperature value outside the environment. The system further comprises at least one media capturing unit configured to capture at least one of audio data and video data pertaining to at least one user in the environment.
[0014] The system further comprises at least one processing unit operatively coupled to the at least one temperature detecting unit and at least one media capturing unit. The at least one processing unit is configured to process the at least one of captured audio data and video data to determine a second set of parameters, wherein the second set of parameters comprising at least one of frequency of voice of the at least one user, walking speed of the at least one user, clothing type of the at least one user and body language of the at least one user. The at least one processing unit is configured to classify at least the second set of parameters into different classes. The at least one processing unit is configured to map at least the first set of parameters and the classified second set of parameters with one or more pre-classified parameters for assigning a score to each of the at least one of the first set of parameters and the classified second set of parameters. The at least one processing unit is configured to modify the first temperature value based on the assigned scores for controlling the temperature of the environment. The system further comprises at least one memory unit configured to store the first set of parameters, the second set of parameters, the classified second set of parameters and the one or more pre-classified parameters.

[0015] In another non-limiting embodiment of the present disclosure, the at least one processing unit is configured to classify at least the second set of parameters into different classes by: classifying the frequency of voice of the at least one user into a below normal frequency class, a normal frequency class and an above normal frequency class; classifying the walking speed of the at least one user into a below normal speed class, a normal speed class and an above normal speed class; classifying the type of clothing of the at least one user into a hot clothes class, a normal clothes class and a cold clothes class; and classifying the body language of the at least one user into a normal body language class and an abnormal body language class.
[0016] In yet another non-limiting embodiment of the present disclosure, the at least one processing unit is configured to average the scores assigned to each of the at least one of the first set of parameters and the classified second set of parameters to obtain an average score.
[0017] In yet another non-limiting embodiment of the present disclosure, the at least one processing unit is configured to modify the first temperature value based on the assigned scores by: modifying the first temperature value based on the average score, wherein the first temperature value is decreased if the average score is positive, the first temperature value is increased if the average score is negative, and the first temperature value remains unchanged if the average score is zero.
[0018] In yet another non-limiting embodiment of the present disclosure, the at least one media capturing unit comprises at least one audio capturing unit for capturing the audio data and at least one video capturing unit for capturing the video data. The at least one audio capturing unit comprises one or more microphones. The at least one video capturing unit comprises at least one of an optical camera, a stereo camera and an infrared camera.
[0019] In yet another non-limiting embodiment of the present disclosure, the at least one temperature detecting unit comprises at least one temperature sensor for detecting the first set of parameters.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[0020] The features, nature, and advantages of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
[0021] Fig. 1 shows a block diagram illustrating a system for controlling temperature of an environment in accordance with an embodiment of the present disclosure.
[0022] Fig. 2 shows a block diagram illustrating a system for controlling temperature of an environment in accordance with an embodiment of the present disclosure.
[0023] Fig. 3 shows a flowchart of an exemplary method for controlling temperature of an environment in accordance with an embodiment of the present disclosure.
[0024] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0025] In the present document, the word “exemplary” is used herein to mean “serving as an example, instance, or illustration”. Any embodiment or implementation of the present subject-matter described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments.
[0026] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0027] The terms “comprises”, “comprising”, “include(s)”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, system or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup, system or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
[0028] In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0029] The present disclosure will be described herein below with reference to the accompanying drawings. In the following description, well known functions or constructions are not described in detail since they would obscure the description with unnecessary detail.
[0030] The present disclosure relates to a system that may identify implicit signs of discomfort and thereafter may control temperature of an environment. The disclosure relates to the system that processes the implicit signs of the discomfort shown by at least one user present in the environment for automatically changing the temperature for controlling the environment condition.
[0031] Referring to figure 1, an exemplary system 100 is disclosed for controlling temperature of an environment. The system 100 may include various elements such as at least one temperature detecting unit 102, at least one media capturing unit 104, at least one processing unit 106, a database unit 108, an input interface 110 and an output interface 112. The at least one temperature detecting unit 102, at least one media capturing unit 104, at least one processing unit 106 and the database unit 108 may communicate with each other over wired or wireless link. The system 100 may receive one or more inputs from the input interface 110 and provide one or more outputs via the output interface 112.
[0032] According to a non-limiting exemplary embodiment of the present disclosure, the database unit 108 may store the following information. The information may comprise one or more temperature related parameters associated with the environment and one or more of audio data and video data pertaining to at least one user present in the environment. The one or more temperature related parameters associated with the environment may comprise one or more of the following, but not limited to: inside temperature of the environment, outside temperature of the environment, etc. The audio data may comprise one or more of the following, but not limited to: frequency of voice of the at least one user, amplitude of the voice of the at least one user, etc. The video data may comprise one or more of the following, but not limited to: walking speed of the at least one user, body language information of the at least one user, and clothing type of the at least one user. The body language may include drooping shoulders of the at least one user, facial expressions indicating tiredness of the at least one user, sweating, etc.
[0033] The information stored in the database unit 108 may be used by the system 100 to determine whether the at least one user is comfortable or uncomfortable with the environment condition and whether there is a need to change the inside temperature of the environment. The audio data and video data may represent the comfort level of the at least one user in the environment. The audio data and video data may be stored in the database unit 108 in association with at least one of the inside temperature of the environment and the outside temperature of the environment. The database unit 108 may further store one or more of the following information, but not limited to: one or more classifications of the at least one of the audio data and video data, and one or more scores assigned to the one or more classifications of the at least one of the audio data and video data. The classifications of the at least one of the audio data and video data and the scores assigned to those classifications may be stored in the database unit 108 in association with at least one of the inside temperature of the environment and the outside temperature of the environment.
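One possible shape for a record kept in the database unit 108 is sketched below; the field names and the Python representation are assumptions for illustration, since the disclosure only states which items are stored in association with the inside and outside temperature values.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ComfortRecord:
    """Assumed layout of one stored scenario in the database unit 108."""
    inside_temp: float                     # first temperature value (inside the environment)
    outside_temp: float                    # second temperature value (outside the environment)
    frequency_ratio: Tuple[int, int, int]  # below normal : normal : above normal voice frequency
    speed_ratio: Tuple[int, int, int]      # below normal : normal : above normal walking speed
    body_language_ratio: Tuple[int, int]   # normal : abnormal body language
    clothing_ratio: Tuple[int, int, int]   # hot clothes : normal clothes : cold clothes
    score: float                           # comfort score assigned to this scenario
```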
[0034] Various non-limiting examples of controlling the temperature of the environment in accordance with an aspect of the present invention are described herein below.
[0035] Consider that there are 100 users present in an environment such as an office floor. The inside temperature of the environment is TI1 and the outside temperature of the environment is TO1. One or more of the 100 users may be comfortable with the inside temperature TI1 and one or more of the 100 users may be uncomfortable with the inside temperature TI1. Let us consider that 10 users may say that the inside temperature TI1 is good for them, 5 users may say that the inside temperature is a little hot for them and the remaining 85 users may say that the inside temperature is too cold for them.
[0036] Further, in this embodiment, the frequency of voice of the one or more users of the 100 users may be determined and classified into at least the following classes, but not limited to: a below normal frequency class, a normal frequency class and an above normal frequency class. In order to classify, the frequency of voice of the one or more users may be compared with a threshold frequency range. The threshold frequency range may signify the normal frequency class at which one or more users feel comfortable at a particular temperature. The threshold frequency range may vary according to different environmental conditions and different locations.
[0037] In the same embodiment, the walking speed of the one or more users of the 100 users may be determined and classified into at least the following classes, but not limited to: a below normal speed class, a normal speed class and an above normal speed class. In order to classify, the walking speed of the one or more users may be compared with a threshold speed range. The threshold speed range may signify the normal speed class at which one or more users feel comfortable at a particular temperature. The threshold speed range may vary according to different environmental conditions and different locations.
[0038] In the same embodiment, the body language of the one or more users of the 100 users may be determined and classified into at least the following classes, but not limited to: a normal body language class and an abnormal body language class. In order to classify, the body language parameters of the one or more users may be compared with predefined body language parameters that represent the normal body language class at which one or more users feel comfortable at a particular temperature. The body language parameters may include, but are not limited to, an anchor point of the human body, a height of the human body, etc.
[0039] In the same embodiment, clothing type of one or more users of the 100 users may be determined and classified into at least the following classes, but not limited to: hot clothes class, normal clothes class and cold clothes class. In order to classify, the clothing type of one or more users may be compared with predefined clothing types that are indicative of hot clothes, normal clothes and cold clothes.
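The four classifications described above can be sketched as threshold comparisons; the numeric ranges, units and helper names below are assumptions chosen for the example, since the disclosure leaves the actual threshold ranges and reference values open.

```python
from typing import Tuple

def classify_against_range(value: float, normal_range: Tuple[float, float]) -> str:
    """Compare a measured value with a threshold range: below_normal / normal / above_normal."""
    low, high = normal_range
    if value < low:
        return "below_normal"
    if value > high:
        return "above_normal"
    return "normal"

# Assumed threshold ranges for illustration only.
NORMAL_VOICE_FREQUENCY_HZ = (120.0, 250.0)
NORMAL_WALKING_SPEED_MPS = (1.0, 1.6)

def classify_user(voice_hz: float, speed_mps: float, clothing_label: str, body_language_label: str) -> dict:
    """Classify one user's second-set parameters into the classes named in the disclosure."""
    return {
        "frequency": classify_against_range(voice_hz, NORMAL_VOICE_FREQUENCY_HZ),
        "walking_speed": classify_against_range(speed_mps, NORMAL_WALKING_SPEED_MPS),
        "clothing": clothing_label,            # assumed pre-labelled: "hot", "normal" or "cold" clothes
        "body_language": body_language_label,  # assumed pre-labelled: "normal" or "abnormal"
    }
```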

[0040] In the same embodiment, each of the classifications i.e. the frequency of voice classification, walking speed classification, body language classification, and clothing type classification may be assigned a score that is indicative of comfort level of the users present in the environment. For example, let us consider that out of 100 users, frequency of voice of 70 users is below normal frequency range, frequency of voice of 20 users is within normal frequency range, and frequency of voice of 10 users is above normal frequency range. Further, out of 100 users, walking speed of 75 users is below normal speed range, walking speed of 20 users is within normal speed range, and walking speed of 5 users is above normal speed range. Furthermore, body language of 70 users is abnormal and body language of 30 users is normal. Furthermore, clothing type of 65 users is hot clothes, clothing type of 15 users is normal clothes, clothing type of 30 users is cold clothes. Further, ratio of number of users in each of the class may be determined and assigned a score that is indicative of comfort level.
[0041] In the same embodiment, the classification ratio of frequency classes may be 70:20:10 i.e. 7:2:1 (below normal frequency : normal frequency : above normal frequency), ratio of walking speed may be 75:20:5 i.e. 15:4:1 (below normal speed : normal speed : above normal speed), classification ratio of body language may be 30:70 i.e. 3:7 (normal body language : abnormal body language), and classification ratio of clothing type may be 65:15:30 i.e. 13:3:6 (hot clothes : normal clothes : cold clothes). Each of the classification ratios may be assigned a score, for example, the value of score is -0.8, which indicates that a greater number of users are feeling cold. All the above described classifications, classification ratios, the score assigned to the classifications may be stored in the database unit 108 in association with inside temperature value TI1 and the outside temperature value TO1.
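A sketch of how such classification ratios and scores could be derived is given below; reducing per-class counts to a ratio follows the worked example (70:20:10 becomes 7:2:1), while the nearest-ratio lookup against pre-classified ratios stored in the database unit 108 is an assumed reading of the mapping step.

```python
from collections import Counter
from functools import reduce
from math import gcd
from typing import Dict, List, Tuple

def classification_ratio(labels: List[str], order: List[str]) -> Tuple[int, ...]:
    """Reduce per-class counts to the smallest integer ratio, e.g. 70:20:10 -> 7:2:1."""
    counts = Counter(labels)
    values = [counts.get(name, 0) for name in order]
    divisor = reduce(gcd, values) or 1
    return tuple(v // divisor for v in values)

def lookup_score(ratio: Tuple[int, ...], stored: Dict[Tuple[int, ...], float]) -> float:
    """Assumed mapping: return the score of the closest stored pre-classified ratio."""
    def distance(a: Tuple[int, ...], b: Tuple[int, ...]) -> int:
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return stored[min(stored, key=lambda r: distance(r, ratio))]

# Worked numbers from the frequency example: 70 / 20 / 10 users, stored score -0.8.
labels = ["below"] * 70 + ["normal"] * 20 + ["above"] * 10
ratio = classification_ratio(labels, ["below", "normal", "above"])
print(ratio, lookup_score(ratio, {(7, 2, 1): -0.8}))   # (7, 2, 1) -0.8
```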
[0042] In the same embodiment, based on the feedback of the users, the inside temperature TI1 may be modified to increase the inside temperature of the environment because the majority of the users are feeling cold. The inside temperature TI1 may be increased to temperature TI2. Now, the inside temperature is TI2 and the outside temperature may still be TO1. However, one or more users of the 100 users may still be uncomfortable with the inside temperature TI2. For example, 30 users out of the 100 users may say that the inside temperature is good for them, 5 users may say that the inside temperature is a little hot for them, and the remaining 65 users may say that the inside temperature is too cold for them. For the current environment conditions, frequency of voice, walking speed, body language and clothing type of the one or more users may be determined and classified into the one or more classes as defined above. For example, let us consider that out of 100 users, frequency of voice of 50 users is below the normal frequency range, frequency of voice of 40 users is within the normal frequency range, and frequency of voice of 10 users is above the normal frequency range. Further, out of 100 users, walking speed of 60 users is below the normal speed range, walking speed of 35 users is within the normal speed range, and walking speed of 5 users is above the normal speed range. Furthermore, body language of 50 users is abnormal and body language of 50 users is normal. Furthermore, clothing type of 65 users is hot clothes, clothing type of 15 users is normal clothes, and clothing type of 30 users is cold clothes. Further, the ratio for each of the classes may be determined and assigned a score that is indicative of comfort level.
[0043] In the same embodiment, the classification ratio of frequency classes may be 50:40:10 i.e. 5:4:1 (below normal frequency : normal frequency : above normal frequency), ratio of walking speed may be 60:35:5 i.e. 12:7:1 (below normal speed : normal speed : above normal speed), classification ratio of body language may be 50:50 i.e. 1:1 (normal body language : abnormal body language), and classification ratio of clothing type may be 65:15:30 i.e. 13:3:6 (hot clothes : normal clothes : cold clothes). Each of the classification ratios may be assigned a score, for example, the value of score is -0.5, which indicates that still majority of users are feeling cold. All the above described classifications, classification ratios, the score assigned to the classifications may be stored in the database unit 108 in association with inside temperature value TI2 and the outside temperature value TO1.
[0044] In the same embodiment, based on the feedback of the users, the inside temperature TI2 of the environment may be modified to increase the temperature of the environment because the majority of the users are feeling cold. The inside temperature of the environment may be increased to temperature TI3. Now, the inside temperature is TI3 and the outside temperature may still be TO1. However, one or more users of the 100 users may still be uncomfortable with the inside temperature TI3. For example, 55 users out of the 100 users may say that the inside temperature is good for them, 5 users may say that the inside temperature is a little hot for them, and the remaining 40 users may say that the inside temperature is cold. For the current environment conditions, frequency of voice, walking speed, body language and clothing type of the one or more users may be determined and classified into the one or more classes as defined above. For example, let us consider that out of 100 users, frequency of voice of 30 users is below the normal frequency range, frequency of voice of 60 users is within the normal frequency range, and frequency of voice of 10 users is above the normal frequency range. Further, out of 100 users, walking speed of 35 users is below the normal speed range, walking speed of 55 users is within the normal speed range, and walking speed of 10 users is above the normal speed range. Furthermore, body language of 40 users is abnormal and body language of 60 users is normal. Furthermore, clothing type of 65 users is hot clothes, clothing type of 15 users is normal clothes, and clothing type of 30 users is cold clothes. Further, the ratio for each of the classes may be determined and assigned a score that is indicative of comfort level.
[0045] In the same embodiment, the classification ratio of frequency classes may be 30:60:10 i.e. 3:6:1 (below normal frequency : normal frequency : above normal frequency), ratio of walking speed may be 35:55:10 i.e. 7:11:2 (below normal speed : normal speed : above normal speed), classification ratio of body language may be 60:40 i.e. 3:2 (normal body language : abnormal body language), and classification ratio of clothing type may be 65:15:30 i.e. 13:3:6 (hot clothes : normal clothes : cold clothes). Each of the classification ratios may be assigned a score, for example, the value of score is -0.2, which indicates that some of the users are feeling cold. All the above described classifications, classification ratios, the score assigned to the classifications may be stored in the database unit 108 in association with inside temperature value TI3 and the outside temperature value TO1.
[0046] In the same embodiment, based on the feedback of the users, the inside temperature TI3 of the environment may be modified to increase the temperature of the environment because some of the users are feeling cold. The inside temperature of the environment may be increased to temperature TI4. Now, the inside temperature is TI4 and the outside temperature may still be TO1. However, one or more users of the 100 users may still be uncomfortable with the inside temperature TI4. For example, 80 users out of the 100 users may say that the inside temperature is good for them, 15 users may say that the inside temperature is a little hot for them, and the remaining 5 users may say that the inside temperature is a little cold for them. For the current environment conditions, frequency of voice, walking speed, body language and clothing type of the one or more users may be determined and classified into the one or more classes as defined above. For example, let us consider that out of 100 users, frequency of voice of 15 users is below the normal frequency range, frequency of voice of 75 users is within the normal frequency range, and frequency of voice of 10 users is above the normal frequency range. Further, out of 100 users, walking speed of 20 users is below the normal speed range, walking speed of 70 users is within the normal speed range, and walking speed of 10 users is above the normal speed range. Furthermore, body language of 10 users is abnormal and body language of 90 users is normal. Furthermore, clothing type of 30 users is hot clothes, clothing type of 50 users is normal clothes, and clothing type of 20 users is cold clothes. Further, the ratio for each of the classes may be determined and assigned a score that is indicative of comfort level.
[0047] In the same embodiment, the classification ratio of frequency classes may be 15:75:10 i.e. 3:15:2 (below normal frequency : normal frequency : above normal frequency), ratio of walking speed may be 20:70:10 i.e. 4:14:2 (below normal speed : normal speed : above normal speed), classification ratio of body language may be 90:10 i.e. 9:1 (normal body language : abnormal body language), and classification ratio of clothing type may be 30:50:20 i.e. 6:10:4 (hot clothes : normal clothes : cold clothes). Each of the classification ratios may be assigned a score, for example, the value of score is +0.2, which indicates that some of the users are feeling that environment is hot. All the above described classifications, classification ratios, the score assigned to the classifications may be stored in the database unit 108 in association with inside temperature value TI4 and the outside temperature value TO1.
[0048] In the same embodiment, although the majority of the users may feel comfortable with the environment condition, i.e. the inside temperature TI4, some users may provide feedback that the environment temperature has been increased more than required. Thus, the inside temperature TI4 may be further modified to slightly decrease the inside temperature of the environment. The inside temperature of the environment may be decreased to temperature TI5. Now, the inside temperature is TI5 and the outside temperature may still be TO1. In this scenario, 90 users out of the 100 users may say that the inside temperature is good for them, 5 users may say that the inside temperature is a little hot for them, and the remaining 5 users may say that the inside temperature is a little cold for them. For the current environment conditions, frequency of voice, walking speed, body language and clothing type of the one or more users may be determined and classified into the one or more classes as defined above. For example, let us consider that out of 100 users, frequency of voice of 10 users is below the normal frequency range, frequency of voice of 85 users is within the normal frequency range, and frequency of voice of 5 users is above the normal frequency range. Further, out of 100 users, walking speed of 10 users is below the normal speed range, walking speed of 85 users is within the normal speed range, and walking speed of 5 users is above the normal speed range. Furthermore, body language of 10 users is abnormal and body language of 90 users is normal. Furthermore, clothing type of 30 users is hot clothes, clothing type of 60 users is normal clothes, and clothing type of 10 users is cold clothes. Further, the ratio for each of the classes may be determined and assigned a score that is indicative of comfort level.
[0049] In the same embodiment, the classification ratio of frequency classes may be 10:85:5 i.e. 2:17:1 (below normal frequency : normal frequency : above normal frequency), ratio of walking speed may be 10:85:5 i.e. 2:17:1 (below normal speed : normal speed : above normal speed), classification ratio of body language may be 95:5 i.e. 19:1 (normal body language : abnormal body language), and classification ratio of clothing type may be 30:60:10 i.e. 6:12:2 (hot clothes : normal clothes : cold clothes). Each of the classification ratios may be assigned a score, for example, the value of the score is 0.0, which indicates an optimum value of the inside temperature with respect to the outside temperature, as the majority of the users are feeling comfortable and the number of users that feel cold or hot in the environment is very small. All the above described classifications, classification ratios, and the scores assigned to the classifications may be stored in the database unit 108 in association with inside temperature value TI5 and the outside temperature value TO1.
[0050] In the same embodiment, the outside temperature TO1 may also change. Accordingly, the inside temperature of the environment may be modified based on feedback from one or more users, as described above. In this manner, one or more of the above-mentioned pieces of information about the different scenarios may be stored in the database unit 108 in association with the respective inside temperature and outside temperature of the environment for the different types of scenarios.
[0051] Referring again to figure 1, the at least one processing unit 106 may comprise at least one processor and a memory. The at least one processing unit 106 of the system 100 may be trained for various scenarios, as described above, to control the temperature of an environment in real time based on the inside temperature of the environment, outside temperature of the environment, and one or more of the audio & video data as defined above. The database unit 108 of the system 100 may store all the training data for controlling the temperature of the environment based on the one or more implicit signs of discomfort exhibited by the users in the environment.
[0052] In an embodiment, the temperature detecting unit 102 may detect a first set of parameters. The first set of parameters may comprise at least a first temperature value and a second temperature value. The first temperature value may represent a temperature value inside the environment and the second temperature value may represent a temperature value outside the environment. In an exemplary embodiment, the temperature detecting unit 102 may comprise at least one temperature sensor for detecting the first set of parameters. The temperature sensor may be one or more of the following, but not limited to: thermocouples, resistance temperature detectors, thermistors, infrared sensors, semiconductors, thermometers, etc. The environment may be, but is not limited to, an airport, railway station, temple, conference room, canteen, house, cabin, office premise, building premise, shopping mall, parking area, vehicle, etc.
[0053] The media capturing unit 104 may capture one or more of audio data and video data pertaining to the at least one user in the environment. The media capturing unit 104 may comprise one or more of the following, but not limited to: at least one audio sensor such as a microphone, and at least one video data capturing unit such as an optical camera, a stereo camera and an infrared camera, etc.
[0054] The at least one processing unit 106 may process the one or more of audio data and video data to determine a second set of parameters. The second set of parameters may comprise one or more of the following, but not limited to: frequency of voice of the at least one user, walking speed of the at least one user, clothing type of the at least one user, and body language of the at least one user, etc. After determining the second set of parameters, the at least one processing unit 106 may classify the second set of parameters into different classes. The at least one processing unit 106 may classify the frequency of voice of the at least one user into one or more of the following classes, but not limited to: a below normal frequency class, a normal frequency class and an above normal frequency class. Further, the at least one processing unit 106 may classify the walking speed of the at least one user into one or more of the following classes, but not limited to: a below normal speed class, a normal speed class and an above normal speed class. The at least one processing unit 106 may further classify the type of clothing of the at least one user into one or more of the following classes, but not limited to: a hot clothes class, a normal clothes class and a cold clothes class. Furthermore, the at least one processing unit 106 may classify the body language of the at least one user into one or more of the following classes, but not limited to: a normal body language class and an abnormal body language class. The at least one processing unit 106 may process the audio and video data using one or more of the following, but not limited to: speech processing techniques, image processing techniques, video processing techniques, etc.

[0055] Further, the at least one processing unit 106 may map the first set of parameters and the classified second set of parameters with one or more predefined scenarios stored in the database unit 108. The predefined scenarios represent the training data as defined above. In other words, the at least one processing unit 106 maps the first set of parameters and the classified second set of parameters with the pre-classified parameters stored in the database unit 108 in association with the inside temperature and outside temperature values, and may assign a score to each of the at least one of the first set of parameters and the classified second set of parameters based on the mapping.
[0056] In an exemplary embodiment, the at least one processing unit 106 may map the classification ratio of the frequency of the at least one user with the one or more pre-classified frequency ratios stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the frequency ratio based on the score associated with the mapped pre-classified frequency ratio stored in the database.
[0057] Similarly, the at least one processing unit 106 may map the classification ratio of the walking speed of the at least one user with the one or more pre-classified ratios of the walking speed stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the classification ratio of the walking speed based on the score associated with the mapped pre-classified ratio of the walking speed stored in the database unit 108. The at least one processing unit 106 may further map the classification ratio of the body language of the at least one user with the one or more pre-classified ratios of the body language stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the classification ratio of the body language based on the score associated with the mapped pre-classified ratio of the body language in the database unit 108. Furthermore, the at least one processing unit 106 may map the classification ratio of the clothing type of the at least one user with the one or more pre-classified ratios of the clothing types stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the classification ratio of the clothing type based on the score associated with the mapped pre-classified ratio of the clothing type in the database unit 108.

[0058] Further, the at least one processing unit 106 may average the above assigned scores to obtain an average score. The average score may signify the comfort level of the at least one user in the environment and may indicate the need for a change of the temperature of the environment. The average score may be positive, negative or zero. Based on the average score, the at least one processing unit 106 may modify the inside temperature for controlling the temperature of the environment. In an exemplary embodiment, the at least one processing unit 106 may decrease the inside temperature if the average score is a positive value. The at least one processing unit 106 may increase the inside temperature if the average score is negative. The at least one processing unit 106 may not change the inside temperature if the average score is zero. Further, the at least one processing unit 106 may change the temperature by a certain value based on the magnitude of the average score. The at least one processing unit 106 may change the temperature of the environment gradually or abruptly based on the value of the average score.
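As a sketch of this step, assuming the per-parameter scores have already been assigned: the averaging and the sign rule follow the description above, while the scaling of the change with the magnitude of the average score (four degrees per unit of score here) and the 22.0 degree starting setpoint are assumptions for illustration only.

```python
from typing import List

def adjust_inside_temperature(setpoint: float, scores: List[float], degrees_per_unit: float = 4.0) -> float:
    """Average the assigned scores and move the inside setpoint accordingly."""
    average = sum(scores) / len(scores)
    delta = abs(average) * degrees_per_unit   # larger discomfort -> larger change
    if average > 0:
        return setpoint - delta               # environment perceived as too hot
    if average < 0:
        return setpoint + delta               # environment perceived as too cold
    return setpoint                           # zero score: no change needed

# With the example scores used later (-0.5, -0.2, -0.6, -0.2) the average is -0.375,
# so an assumed 22.0 degree setpoint would be raised to 23.5 degrees.
print(adjust_inside_temperature(22.0, [-0.5, -0.2, -0.6, -0.2]))
```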
[0059] For explaining the embodiments defined in paragraphs [0051]-[0058], let us consider a scenario in which there are 50 users present in an environment such as a conference room. The temperature detecting unit 102 may detect the temperature inside the conference room and the temperature outside the conference room. The media capturing unit 104 may capture audio data and video data pertaining to the users in the conference room. Further, the at least one processing unit 106 may process the captured audio data and video data to determine one or more of the following parameters, but not limited to: frequency of the voice of one or more users in the conference room, walking speed of the one or more users in the conference room, body language of the one or more users in the conference room, and clothing type of the one or more users in the conference room. Further, the at least one processing unit 106 may classify the one or more determined parameters into different classes. For example, the at least one processing unit 106 may classify the frequency of voice of one or more users into different classes such as: below normal frequency of voice, normal frequency of voice and above normal frequency of voice. The at least one processing unit 106 may determine that the frequency of the voice of 30 users is below the normal frequency range, the frequency of the voice of 15 users is within the normal frequency range, and the frequency of the voice of 5 users is above the normal frequency range of the voice. The at least one processing unit 106 may determine a ratio of the frequencies of the voice as 30:15:5 i.e. 6:3:1 (below normal frequency : normal frequency : above normal frequency).

[0060] In the same embodiment, the at least one processing unit 106 may also classify the walking speed of the one or more users into different classes of walking speed such as: below normal walking speed, normal walking speed, and above normal walking speed. The at least one processing unit 106 may determine that the walking speed of 18 users is below the normal walking speed range, the walking speed of 30 users is within the normal walking speed range, and the walking speed of 2 users is above the normal walking speed range. The at least one processing unit 106 may determine a ratio of the walking speeds as 18:30:2 i.e. 9:15:1 (below normal walking speed : normal walking speed : above normal walking speed).
[0061] In the same embodiment, the at least one processing unit 106 may also classify the body language of the one or more users into different classes of body language: normal body language and abnormal body language. The at least one processing unit 106 may determine that the body language of 15 users is normal and the body language of 25 users is abnormal. The at least one processing unit 106 may determine a ratio of the body language as 15:25 i.e. 3:5 (normal body language : abnormal body language). The at least one processing unit 106 may also classify the clothes type of the one or more users into different classes of clothes type such as: hot clothes class, normal clothes class, and cold clothes class. The at least one processing unit 106 may determine that the clothes type of 16 users is hot clothes, the clothes type of 30 users is normal clothes and the clothes type of 6 users is cold clothes. The at least one processing unit 106 may determine a ratio of the clothes type as 16:30:6 i.e. 8:15:3 (hot clothes : normal clothes : cold clothes).
[0062] In this scenario, the at least one processing unit 106 may map the determined classification ratios with one or more pre-classified ratios stored in association with the inside and outside temperature values and may assign a score to each of the classification ratios. The at least one processing unit 106 may map the classification ratio of frequency with one or more classification ratios of the frequencies stored in the database unit 108 in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score of -0.5 based on a score stored in the database unit 108 in association with the mapped pre-classified ratio. Further, the at least one processing unit 106 may map the classification ratio of walking speed with one or more classification ratios of the walking speed stored in the database unit 108 in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score of -0.2 based on a score stored in the database unit 108 in association with the mapped pre-classified ratio.

[0063] In this scenario, the at least one processing unit 106 may further map the classification ratio of body language with one or more classification ratios of the body language stored in the database unit 108 in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score of -0.6 based on a score stored in the database unit 108 in association with the mapped pre-classified ratio. Furthermore, the at least one processing unit 106 may map the classification ratio of clothing type with one or more classification ratios of the clothing type stored in the database unit 108 in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score of -0.2 based on a score assigned to the mapped pre-classified ratio stored in the database unit 108.
[0064] In the same embodiment, the at least one processing unit 106 may determine an average score for all the determined scores. For example, the at least one processing unit 106 may determine the average score as -0.375. Based on the determined average score, the at least one processing unit may modify the temperature by increasing the inside temperature.
[0065] The system 100 may monitor the environment condition continuously or periodically and may modify the inside temperature to provide an optimum temperature of the environment to the users present in the environment. In this manner, the system 100 ensures that a comfortable environment is provided to the maximum number of users present in the environment. The system may also automatically control the inside temperature of the environment based on variation in the outside temperature of the environment. Further, the system 100 may also provide techniques to modify the environment condition based on the implicit discomfort signs shown by the users present in the environment.
[0066] According to another embodiment of the present application, the at least one processing unit 106 of the system 100 may comprise one or more units, as illustrated in figure 2. The functionality and operations of the at least one processing unit 106 of the system 100 may be performed by the units illustrated in figure 2. The at least one processing unit 106 as described in figure 2 may comprise at least one determination unit 202, at least one classification unit 204, at least one mapping unit 206, an averaging unit 208, and a modifying unit 210. All the units may be operationally coupled to a memory (not shown in the figure) of the at least one processing unit 106. The units 202-210 may be communicatively coupled with the at least one temperature detecting unit, the media capturing unit, the database unit, the input interface and the output interface. The at least one determination unit 202, the at least one classification unit 204, the at least one mapping unit 206, the averaging unit 208, and the modifying unit 210 may be operatively and functionally coupled with each other and may communicate with each other over a wired or wireless link. Each of the above described units may be a hardware unit and may comprise at least one processor and memory (not shown in the figure).
[0067] In the same embodiment, the at least one determination unit 202 may process the at least one of captured audio data and video data to determine a second set of parameters. The at least one classification unit 204 may classify at least the second set of parameters into different classes. The at least one mapping unit 206 may map at least the first set of parameters and the classified second set of parameters with one or more pre-classified parameters for assigning a score to each of the at least one of the first set of parameters and the classified second set of parameters. The averaging unit 208 may average the scores assigned to each of the at least one of the first set of parameters and the classified second set of parameters to obtain an average score. The modifying unit 210 may modify the first temperature value based on the assigned scores for controlling the temperature of the environment.
[0068] Fig. 3 shows a flowchart of an exemplary method 300 of controlling temperature of an environment, in accordance with another embodiment of the present disclosure. At block 302, the method describes detecting a first set of parameters comprising at least a first temperature value and a second temperature value. The first temperature value may represent a temperature value inside the environment and the second temperature value may represent a temperature value outside the environment. The inside and outside temperatures of the environment may be detected by the at least one temperature detecting unit 102 of the system 100.
[0069] At block 304, audio data and video data pertaining to at least one user in the environment may be captured by the media capturing unit 104 of the system 100. At step 306, the at least one of captured audio data and video data may be processed to determine a second set of parameters. The second set of parameters may comprise at least one of the following, but not limited to: frequency of voice of the at least one user, walking speed of the at least one user, clothing type of the at least one user and body language of the at least one user. The at least one of captured audio data and video data may be processed by the at least one processing unit 106 of the system 100.

[0070] At step 308, the at least one processing unit 106 may classify at least the second set of parameters into different classes. The frequency of voice of the at least one user may be classified into one or more of the following classes, but not limited to: a below normal frequency class, a normal frequency class and an above normal frequency class. Further, the walking speed of the at least one user may be classified into one or more of the following classes, but not limited to: a below normal speed class, a normal speed class and an above normal speed class. Further, the type of clothing of the at least one user may be classified into one or more of the following classes, but not limited to: a hot clothes class, a normal clothes class and a cold clothes class. Furthermore, the body language of the at least one user may be classified into one or more of the following classes, but not limited to: a normal body language class and an abnormal body language class.
[0071] At step 310, the first set of parameters and the classified second set of parameters may be mapped with one or more pre-classified parameters stored in the database unit 108 by the at least one processing unit 106. The at least one processing unit 106 may assign a score to each of the at least one of the first set of parameters and the classified second set of parameters based on the mapping. In an exemplary embodiment, the classification ratio of the frequency of the at least one user may be mapped with the one or more pre-classified ratios of the frequency stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the classification ratio based on the score associated with the mapped pre-classified ratio of the frequency in the database unit 108.
[0072] Further, the classification ratio of the walking speed of the at least one user may be mapped with the one or more pre-classified ratios of the walking speed stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the classification ratio of the walking speed based on a score associated with the mapped pre-classified ratio of the walking speed in the database unit 108. Further, the classification ratio of the body language of the at least one user may be mapped with the one or more pre-classified ratios of the body language stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the classification ratio of the body language based on a score associated with the mapped pre-classified ratio of the body language in the database unit 108.

[0073] Furthermore, the classification ratio of the clothing type of the at least one user may be mapped with the one or more pre-classified ratios of the clothing types stored in the database unit 108. The mapping is done in association with the inside and outside temperature values. The at least one processing unit 106 may assign a score to the classification ratio of the clothing type based on a score associated with the mapped pre-classified ratio of the clothing type in the database unit 108.
[0074] At step 312, the inside temperature may be modified based on the assigned scores for controlling the temperature of the environment. The inside temperature may be modified by the at least one processing unit 106. Further, modifying the inside temperature comprises averaging the assigned scores to obtain an average score. The average score may signify the comfort level of the at least one user in the environment and may indicate the need for a change of the temperature of the environment. The average score may be positive, negative or zero. Based on the average score, the inside temperature may be modified for controlling the temperature of the environment. In an exemplary embodiment, the inside temperature may be decreased if the average score is a positive value. The inside temperature may be increased if the average score is negative. Further, the inside temperature may not be changed if the average score is zero. Further, the inside temperature may be changed by a certain value based on the magnitude of the average score. The inside temperature of the environment may be changed gradually or abruptly based on the value of the average score. Furthermore, the assigned scores may be averaged by the at least one processing unit 106 to obtain the average score.
[0075] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
[0076] Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.

[0077] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer- readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., are non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[0078] Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
Advantages of the embodiments of the present disclosure are illustrated herein.
[0079] In an embodiment, the present disclosure provides techniques for automatically controlling the temperature of the environment based on the implicit signs of discomfort shown by the users present in the environment.
[0080] In an embodiment, the present disclosure provides a system that decides whether a change in temperature is required and then intelligently and efficiently controls the temperature of the environment.
[0081] In an embodiment, the system ensures that a comfortable environment is provided to the maximum number of users present in the environment.
[0082] In an embodiment, the present disclosure provides a system that automatically controls the inside temperature of the environment based on variations in the outside temperature of the environment.

We Claim:
1. A method for controlling temperature of an environment, the method comprising:
detecting, by at least one temperature detecting unit, a first set of parameters
comprising at least a first temperature value and a second temperature value, wherein the first temperature value represents a temperature value inside the environment and the second temperature value represents a temperature value outside the environment;
capturing, by at least one media capturing unit, at least one of audio data and video data pertaining to at least one user in the environment;
processing, by at least one processing unit, the at least one of captured audio data and video data to determine a second set of parameters, wherein the second set of parameters comprises at least one of frequency of voice of the at least one user, walking speed of the at least one user, clothing type of the at least one user and body language of the at least one user;
classifying, by the at least one processing unit, at least the second set of parameters into different classes;
mapping, by the at least one processing unit, at least the first set of parameters and the classified second set of parameters with one or more pre-classified parameters for assigning a score to each of the at least one of the first set of parameters and the classified second set of parameters; and
modifying, by the at least one processing unit, the first temperature value based on the assigned scores for controlling the temperature of the environment.
2. The method of claim 1, wherein classifying at least the second set of parameters into
different classes comprises:
classifying the frequency of voice for the at least one user into below normal frequency class, normal frequency class and above normal frequency class;
classifying the walking speed for the at least one user into below normal speed class, normal speed class and above normal speed class;
classifying the type of clothing for the at least one user into hot clothes class, normal clothes class and cold clothes class; and
classifying the body language for the at least one user into normal body language class and abnormal body language class.
3. The method of claim 1, further comprising:

averaging the scores assigned to each of the at least one of the first set of parameters and the classified second set of parameters to obtain an average score.
4. The method of claim 3, wherein modifying the first temperature value based on the
assigned scores comprises:
modifying the first temperature value based on the average score, wherein the first temperature value is decreased if the average score is positive, the first temperature value is increased if the average score is negative, and the first temperature value remains unchanged if the average score is zero.
5. A system for controlling temperature of an environment, the system comprising:
at least one temperature detecting unit configured to detect a first set of parameters comprising at least a first temperature value and a second temperature value, wherein the first temperature value represents a temperature value inside the environment and the second temperature value represents a temperature value outside the environment;
at least one media capturing unit configured to capture at least one of audio data and video data pertaining to at least one user in the environment;
at least one processing unit operatively coupled to the at least one temperature detecting unit and at least one media capturing unit and configured to:
process the at least one of captured audio data and video data to determine a second set of parameters, wherein the second set of parameters comprises at least one of frequency of voice of the at least one user, walking speed of the at least one user, clothing type of the at least one user and body language of the at least one user,
classify at least the second set of parameters into different classes,
map at least the first set of parameters and the classified second set of parameters with one or more pre-classified parameters for assigning a score to each of the at least one of the first set of parameters and the classified second set of parameters, and
modify the first temperature value based on the assigned scores for controlling the temperature of the environment; and
at least one memory unit configured to store the first set of parameters, the second set of parameters, the classified second set of parameters and the one or more pre-classified parameters.

6. The system of claim 5, wherein the at least one processing unit is configured to classify at
least the second set of parameters into different classes by:
classifying the frequency of voice for the at least one user into below normal frequency class, normal frequency class and above normal frequency class;
classifying the walking speed for the at least one user into below normal speed class, normal speed class and above normal speed class;
classifying the type of clothing for the at least one user into hot clothes class, normal clothes class and cold clothes class; and
classifying the body language for the at least one user into normal body language class and abnormal body language class.
7. The system of claim 5, wherein the at least one processing unit is further configured to:
average the scores assigned to each of the at least one of the first set of parameters and
the classified second set of parameters to obtain an average score.
8. The system of claim 7, wherein the at least one processing unit is further configured to
modify the first temperature value based on the assigned scores by:
modifying the first temperature value based on the average score, wherein the first temperature value is decreased if the average score is positive, the first temperature value is increased if the average score is negative, and the first temperature value remains unchanged if the average score is zero.
9. The system of claim 5, wherein the at least one media capturing unit comprises at least
one audio capturing unit for capturing the audio data and at least one video capturing unit for
capturing the video data;
wherein the at least one audio capturing unit comprises one or more microphones; and wherein the at least one video capturing unit comprises at least one of an optical camera, a stereo camera and an infrared camera.
10. The system of claim 5, wherein the at least one temperature detecting unit comprises at
least one temperature sensor for detecting the first set of parameters.
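Purely for illustration of the class structure recited in claims 2 and 6, the sketch below shows one possible way of producing the classes from observed values. The per-user baseline, the tolerance band and the clothing-label mapping are assumptions and are not part of the claims.

```python
def classify_against_baseline(value: float, baseline: float, tolerance: float = 0.1) -> str:
    """Classify a measured value (e.g. voice frequency in Hz or walking speed
    in m/s) as below normal, normal or above normal relative to an assumed
    user-specific baseline, with a symmetric tolerance band."""
    if value < baseline * (1.0 - tolerance):
        return "below normal"
    if value > baseline * (1.0 + tolerance):
        return "above normal"
    return "normal"

# Assumed mapping from detected clothing labels to the classes of claims 2 and 6.
CLOTHING_CLASSES = {
    "sweater": "hot clothes", "jacket": "hot clothes",
    "t-shirt": "cold clothes", "shorts": "cold clothes",
    "shirt": "normal clothes",
}

def classify_clothing(detected_label: str) -> str:
    """Map a detected clothing label onto the hot / normal / cold clothes classes."""
    return CLOTHING_CLASSES.get(detected_label, "normal clothes")

# Example: classify_against_baseline(1.6, baseline=1.2) returns "above normal";
# classify_clothing("jacket") returns "hot clothes".
```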

Documents

Application Documents

# Name Date
1 201821050041-STATEMENT OF UNDERTAKING (FORM 3) [31-12-2018(online)].pdf 2018-12-31
2 201821050041-PROVISIONAL SPECIFICATION [31-12-2018(online)].pdf 2018-12-31
3 201821050041-PROOF OF RIGHT [31-12-2018(online)].pdf 2018-12-31
4 201821050041-POWER OF AUTHORITY [31-12-2018(online)].pdf 2018-12-31
5 201821050041-FORM 1 [31-12-2018(online)].pdf 2018-12-31
6 201821050041-DRAWINGS [31-12-2018(online)].pdf 2018-12-31
7 201821050041-DECLARATION OF INVENTORSHIP (FORM 5) [31-12-2018(online)].pdf 2018-12-31
8 201821050041-Proof of Right (MANDATORY) [07-05-2019(online)].pdf 2019-05-07
9 201821050041-RELEVANT DOCUMENTS [21-11-2019(online)].pdf 2019-11-21
10 201821050041-FORM 13 [21-11-2019(online)].pdf 2019-11-21
11 201821050041-ORIGINAL UR 6(1A) FORM 1-080519.pdf 2019-12-31
12 201821050041-FORM 18 [31-12-2019(online)].pdf 2019-12-31
13 201821050041-DRAWING [31-12-2019(online)].pdf 2019-12-31
14 201821050041-CORRESPONDENCE-OTHERS [31-12-2019(online)].pdf 2019-12-31
15 201821050041-COMPLETE SPECIFICATION [31-12-2019(online)].pdf 2019-12-31
16 Abstract1.jpg 2021-10-18
17 201821050041-FER.pdf 2021-10-28
18 201821050041-OTHERS [27-04-2022(online)].pdf 2022-04-27
19 201821050041-FER_SER_REPLY [27-04-2022(online)].pdf 2022-04-27
20 201821050041-DRAWING [27-04-2022(online)].pdf 2022-04-27
21 201821050041-COMPLETE SPECIFICATION [27-04-2022(online)].pdf 2022-04-27
22 201821050041-CLAIMS [27-04-2022(online)].pdf 2022-04-27
23 201821050041-ABSTRACT [27-04-2022(online)].pdf 2022-04-27
24 201821050041-PatentCertificate29-11-2023.pdf 2023-11-29
25 201821050041-IntimationOfGrant29-11-2023.pdf 2023-11-29

Search Strategy

1 SearchHistory(1)-convertedE_27-10-2021.pdf

ERegister / Renewals

3rd: 22 Jan 2024 (From 31/12/2020 To 31/12/2021)
4th: 22 Jan 2024 (From 31/12/2021 To 31/12/2022)
5th: 22 Jan 2024 (From 31/12/2022 To 31/12/2023)
6th: 22 Jan 2024 (From 31/12/2023 To 31/12/2024)
7th: 07 Nov 2024 (From 31/12/2024 To 31/12/2025)
8th: 06 Oct 2025 (From 31/12/2025 To 31/12/2026)