
Context-Aware Chat Management

Abstract: CONTEXT-AWARE CHAT MANAGEMENT. The present invention relates to context-aware chat management. The invention provides a method implemented in a chat application for context-aware chat management and a computing device thereof. In one embodiment, a computing device (100) for context-aware chat management in a chat application comprises: a context manager (101) coupled with a control unit (102) and configured to create at least one contextual group in an ongoing chat, the at least one contextual group comprising one or more messages having the same context; and a user interface (103) coupled with the control unit (102) and configured to display the at least one contextual group in the ongoing chat and perform an action with the at least one contextual group in response to a user command, wherein the action comprises: expanding the at least one contextual group, or collapsing the at least one contextual group.


Patent Information

Application #: 201611000631
Filing Date: 07 January 2016
Publication Number: 28/2017
Publication Type: INA
Invention Field: COMMUNICATION
Status: Granted
Email: mail@lexorbis.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-10-04
Renewal Date:

Applicants

Samsung India Electronics Private Limited
Tower D - Ground to 10th Floor, Tower C - 7th to 10th Floor, Logix Cyber Park, Plot C 28-29, Sector 62 - Noida, Uttar Pradesh 201301

Inventors

1. GUPTA, Ankur
Ambica House, 88/18, Kailash Puri, Meerut 250002, Uttar Pradesh, India
2. PREM P, Ashoka
'Hamara' Near Government Hospital, Kayamkulam, Alleppey District 690502, Kerala, India
3. DEB, Saurabh
124 C, Pocket C, Siddhartha Extension, Ashram, New Delhi 110014, Delhi, India
4. SAGAR, Pushpendra Prakash
3, Anand Colony, Station Road, Chandausi 202412, Uttar Pradesh, India
5. BASU, Avirup
23/C, Mahendra Road, Bhowanipore, Kolkata 700025, West Bengal, India
6. SARDANA, Ankur
C-52, Galaxy Apartment, Sector 43, Gurgaon 122009, Haryana, India
7. PATWA, Vishal
C/O. Col. Satyavir Singh, D-143, First Floor, Sector 55, Noida 201301, Uttar Pradesh, India
8. THOMAS, Theophilus
Type III/8, Staff Quarters, Kendriya Vidyalaya No. 2, Industrial Estates PO, Visakhapatnam 530007, Andhra Pradesh, India

Specification

DESCRIPTION
TECHNICAL FIELD OF INVENTION
The invention generally relates to the field of context awareness. More particularly,
the invention relates to methods and devices for context-aware chat management.
BACKGROUND OF INVENTION
At present, chat services have become very popular as they offer instantaneous
transmission of chat messages from a sender to one or more receivers through a network,
Near Field Communication (NFC), or the Internet, etc. A typical chat message may include
text, image, audio, video, location, contact, or the like. Chat services may involve direct
communications between individuals, or may involve group communications, wherein
communication occurs from one sender to many receivers.
In any chat service, a user has access to a contact list that includes the names or
identifications of other users with whom communication may be desired in the future. When
users identified in the contact list connect to the network, the user is notified of their
presence, so that an interactive chat session may begin, if desired. During the interactive chat
session, the instant messages between users are contemporaneously routed to the users'
electronic devices and displayed on a pop-up window or display area of a display screen. In
this way, two or more users may converse with one another in a simulated real-time manner
through messages. The unique features of instant messages (as opposed to email and forum
posts, whether public or non-public) provide users with the ability to engage in a near real-time
conversation, which is appealing to users.
SUMMARY OF INVENTION
This summary is provided to introduce a selection of concepts in a simplified form
that are further described below in the detailed description. This summary is not intended to
identify key features or essential features of the claimed subject matter, nor is it intended as
an aid in determining the scope of the claimed subject matter.
The problem with current chat services is the inefficient management of chat
messages, because the number of chat messages in a chat session can be very large. Chat
messages are typically sorted in increasing order of time, i.e., from past to present, and
displayed top to bottom in the chat session; therefore, in order to see an old message which is
no longer displayed on the current screen but has not been deleted, a user needs to
continuously scroll up through the chat session. This can be time-consuming and tedious for users.
To this end, the present invention provides a solution for improving the management
of chat messages. Embodiments of the present invention provide context-aware chat
management. It is observed that there can be mini-conversations inside a long chat session,
and these mini-conversations have associated artefacts that help in establishing context and
supplementing the conversation. Accordingly, a chat session can be segmented into various
contextual groups, which can be expanded or collapsed as and when required, for example,
by zooming in and zooming out respectively, by a pinch-open gesture and a pinch-closed gesture
respectively, by clicking an expand button and a collapse button respectively, etc. In one
implementation, the context can be established automatically. Alternatively, the user can
manually change the context or initiate a new context. Once the context is known, it can
be indicated along with the one or more chat messages and/or the chat session. In one
implementation, readymade chat objects can be sorted according to the known context to
facilitate easy selection by the user.
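By way of a non-limiting illustration, the following plain-Kotlin sketch shows one possible way to model such contextual groups and their expand/collapse state; the class and function names (ChatMessage, ContextualGroup, segmentIntoGroups) are hypothetical and merely exemplary, and the grouping heuristic (folding consecutive messages that share a context) is an assumption rather than a prescribed algorithm.

    // Hypothetical data model for contextual groups in a chat session (illustrative only).
    data class ChatMessage(val id: Long, val sender: String, val text: String, val context: String)

    data class ContextualGroup(
        val context: String,              // e.g. "OPD visit 1 follow up", "Prescription queries"
        val messages: List<ChatMessage>,
        val expanded: Boolean = true,     // a collapsed group is shown only as a summary block
    )

    // Segment an ordered chat session into contextual groups by folding consecutive
    // messages that share the same context into one group.
    fun segmentIntoGroups(session: List<ChatMessage>): List<ContextualGroup> {
        val groups = mutableListOf<ContextualGroup>()
        for (message in session) {
            val last = groups.lastOrNull()
            if (last != null && last.context == message.context) {
                groups[groups.lastIndex] = last.copy(messages = last.messages + message)
            } else {
                groups += ContextualGroup(message.context, listOf(message))
            }
        }
        return groups
    }

    // Expanding or collapsing a group only toggles its display state; the messages are retained.
    fun toggle(group: ContextualGroup): ContextualGroup = group.copy(expanded = !group.expanded)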
The advantages of the present invention include, but are not limited to, efficient
management of chat messages, a new interaction to retrieve contextual groups in a chat
session, utilization of context awareness to provide new functionalities, etc. Furthermore, a
reader's attention is focused clearly on the context owing to these segmented chunks, each of
which carries the most relevant artefacts, such as pictures, documents, etc., to support the conversation.
The details of one or more embodiments are set forth in the accompanying drawings
and description below. Other features and advantages will be apparent from a reading of the
following detailed description and a review of the associated drawings. It is to be understood
that the following detailed description is explanatory only and is not restrictive of the
invention as claimed.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
To further clarify the advantages and features of the invention, a more particular
description of the invention will be rendered by reference to specific embodiments thereof,
which are illustrated in the appended drawings. It is appreciated that these drawings depict
only typical embodiments of the invention and are therefore not to be considered limiting of
its scope. The invention will be described and explained with additional specificity and detail
with the accompanying drawings in which:
Figure 1 illustrates an exemplary computing device for context-aware chat
management in a chat application, in accordance with one or more embodiments of the
present invention;
Figure 2 illustrates an exemplary method implemented in a chat application for
context-aware chat management, in accordance with one or more embodiments of the present
invention;
Figure 3 illustrates an exemplary scenario for interaction in a contextual healthcare
chat, in accordance with one or more embodiments of the present invention;
Figure 4 illustrates an exemplary scenario for manual change of context in a chat
session, in accordance with one or more embodiments of the present invention;
Figure 5 illustrates an exemplary scenario for user initiated context detection, in
accordance with one or more embodiments of the present invention;
Figure 6 illustrates an exemplary scenario for auto-detection of context in a chat
window, in accordance with one or more embodiments of the present invention;
Figure 7 illustrates an exemplary scenario for reflection of chat contexts in a
dashboard, in accordance with one or more embodiments of the present invention; and
Figure 8 illustrates an exemplary scenario for prioritization of emoticons based on the
context, in accordance with one or more embodiments of the present invention.
It may be noted that to the extent possible, like reference numerals have been used to
represent like elements in the drawings. Further, those of ordinary skill in the art will
appreciate that elements in the drawings are illustrated for simplicity and may not have been
necessarily drawn to scale. For example, the dimensions of some of the elements in the
drawings may be exaggerated relative to other elements to help improve understanding of
aspects of the invention. Furthermore, one or more elements may have been represented
in the drawings by conventional symbols, and the drawings may show only those specific
details that are pertinent to understanding the embodiments of the invention, so as not to
obscure the drawings with details that will be readily apparent to those of ordinary skill in the
art having the benefit of the description herein.
DETAILED DESCRIPTION
For the purpose of promoting an understanding of the principles of the invention,
reference will now be made to the embodiment illustrated in the drawings and specific
language will be used to describe the same. It will nevertheless be understood that no
limitation of the scope of the invention is thereby intended, with such alterations and further
modifications in the illustrated system, and such further applications of the principles of the
invention as illustrated therein, being contemplated as would normally occur to one skilled in
the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description
and the following detailed description are exemplary and explanatory of the invention and are
not intended to be restrictive thereof. Throughout the patent specification, a convention
employed is that in the appended drawings, like numerals denote like components.
Reference throughout this specification to “an embodiment”, “another embodiment”
or similar language means that a particular feature, structure, or characteristic described in
connection with the embodiment is included in at least one embodiment of the invention.
Thus, the appearances of the phrase “in an embodiment”, “in another embodiment” and
similar language throughout this specification may, but do not necessarily, all refer to the
same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to
cover a non-exclusive inclusion, such that a process or method that comprises a list of steps
does not include only those steps but may include other steps not expressly listed or inherent
to such process or method. Similarly, one or more devices or sub-systems or elements or
structures preceded by "comprises... a" do not, without more constraints, preclude the
existence of other devices or other sub-systems.
Various embodiments of the invention will be described below in detail with
reference to the accompanying drawings.
Figure 1 illustrates an exemplary computing device (100) for context-aware chat
management in a chat application. Examples of the computing device (100) include, but are
not limited to, a mobile phone, a tablet computer, a laptop, a desktop computer, a chat server,
an app server, cloud computers, etc. The computing device (100) may include one or more of:
a context manager (101), a control unit (102), a user interface (103), a storage means (104), a
communication interface (105), and other components (not shown).
The context manager (101) may be implemented as a specified program (whether by
means of hardware or software) for controlling or carrying out context management and
related processes in association with the control unit (102).
The control unit (102) may include one or more processors, microprocessors,
microcontrollers, application specific integrated circuits (ASICs), field programmable gate
arrays (FPGAs), or the like. The control unit (102) may centrally control the operation of the
computing device (100) and its components.
The user interface (103) may include mechanisms, such as a touch screen display, for
outputting information from the computing device (100) and/or inputting information to the
computing device (100). Other examples of input and output mechanisms might include a
speaker to receive electrical signals and output audio signals; a camera lens to receive image
and/or video signals and output electrical signals; a microphone to receive audio signals and
output electrical signals; buttons (e.g., control buttons and/or keys of a keypad) to permit data
and control commands to be input into the computing device (100); one or more sensors to
recognise user gestures; a vibrator to cause the computing device (100) to vibrate; a display
screen to output visual information; a light emitting diode, etc.
The storage means (104) may include a random access memory (RAM), a read only
memory (ROM), and/or other type of memory to store data and instructions that may be used
by the control unit (102). The storage means (104) may also include routines, programs,
objects, components, data structures, etc., which perform particular tasks, functions or
implement particular abstract data types.
The communication interface (105) may include any transceiver-like mechanism that
enables the computing device (100) to communicate with other devices and/or systems, for
instance a server or a service provider. For example, the communication interface (105) may
include a modem or an Ethernet interface to a LAN. The communication interface (105) may
also include mechanisms for communicating via a network, such as a wireless network. For
example, the communication interface (105) may include a transmitter that may convert
baseband signals from the control unit (102) to radio frequency (RF) signals and/or a receiver
that may convert RF signals to baseband signals. Alternatively, the communication interface
(105) may include a transceiver to perform functions of both a transmitter and a receiver. The
communication interface (105) may connect to an optional antenna assembly (not shown) for
transmission and/or reception of the RF signals. The term communication interface (105) is used
broadly here: it may be used to establish a data connection with a chat server, and it may also be
used to establish a radio connection or any other type of connection with partner service providers.
In one embodiment, the computing device (100) comprises: a context manager (101)
coupled with a control unit (102) and configured to create at least one contextual group in an
ongoing chat, the at least one contextual group comprises one or more messages having same
context; and a user interface (103) coupled with the control unit (102) and configured to
display the at least one contextual group in the ongoing chat and perform an action with the at
least one contextual group in response to a user command, wherein the action comprises:
expanding the at least one contextual group, or collapsing the at least one contextual group.
In a further embodiment, the context manager (101) is further configured to
automatically establish a context for the one or more messages in an ongoing chat.
In a further embodiment, the at least one contextual group is automatically created in
accordance with the automatically established context.
In a further embodiment, the user interface (103) is further configured to
receive a user input for manually initiating or changing a context in the ongoing chat.
In a further embodiment, the at least one contextual group is created in accordance
with the manually initiated or changed context.
In a further embodiment, the user interface (103) is further configured to display an
icon corresponding to the context along with the one or more messages.
In a further embodiment, the context manager (101) is configured to sort, based on the
context, a plurality of objects which can be inserted in the ongoing chat, and wherein the user
interface (103) is configured to display the sorted objects in the chat application to facilitate
user selection.
In a further embodiment, the user interface (103) is configured to display an alert in
the at least one contextual group upon receiving a new message in that contextual group.
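Purely as an illustrative sketch of the component split described above, and reusing the hypothetical ChatMessage and ContextualGroup types from the earlier sketch, the interfaces below indicate the kind of responsibilities the context manager (101) and the user interface (103) could expose; all interface and method names are assumptions, not the actual implementation of the computing device (100).

    // Hypothetical responsibilities of the context manager (101).
    interface ContextManager {
        fun establishContext(message: ChatMessage): String                  // automatic context detection
        fun createGroups(session: List<ChatMessage>): List<ContextualGroup>
        fun sortInsertableObjects(objects: List<String>, context: String): List<String>
    }

    // User commands that may be issued on a contextual group.
    sealed interface UserCommand {
        data class Expand(val context: String) : UserCommand
        data class Collapse(val context: String) : UserCommand
        data class ChangeContext(val newContext: String) : UserCommand      // manual context change
    }

    // Hypothetical responsibilities of the user interface (103).
    interface ChatUserInterface {
        fun display(groups: List<ContextualGroup>)
        fun showContextIcon(context: String)                                // icon shown along with messages
        fun showAlert(group: ContextualGroup)                               // alert on a new message in a group
        fun onUserCommand(handler: (UserCommand) -> Unit)
    }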
Figure 2 illustrates an exemplary method (200) implemented in a chat application for
context-aware chat management. In one embodiment, the method (200) comprises: creating
(201) at least one contextual group in an ongoing chat, the at least one contextual group
comprises one or more messages having same context; displaying (202) the at least one
contextual group in the ongoing chat; and performing (203) an action with the at least one
contextual group in response to a user command, wherein the action comprises: expanding
the at least one contextual group, or collapsing the at least one contextual group.
In a further embodiment, the method (200) comprises: automatically establishing
(204) a context for the one or more messages in an ongoing chat.
In a further embodiment, the at least one contextual group is automatically created in
accordance with the automatically established context.
In a further embodiment, the method (200) comprises: receiving (205) a user input for
manually initiating or changing a context in the ongoing chat.
In a further embodiment, the at least one contextual group is created in accordance
with the manually initiated or changed context.
In a further embodiment, the method (200) comprises: displaying (206) an icon
corresponding to the context along with the one or more messages.
In a further embodiment, the method (200) comprises: sorting (207), based on the
context, a plurality of objects which can be inserted in the ongoing chat; and displaying (208)
the sorted objects in the chat application to facilitate user selection.
In a further embodiment, the method (200) comprises: displaying (209) an alert in the
at least one contextual group upon receiving a new message in that contextual group.
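The sketch below, which reuses the hypothetical interfaces and types from the previous sketches, suggests one way the numbered steps of the method (200) could be wired together; it is an assumed control flow for illustration only, not the claimed method itself.

    // Illustrative wiring of the numbered steps of method (200).
    fun runContextAwareChat(
        manager: ContextManager,
        ui: ChatUserInterface,
        session: List<ChatMessage>,
    ) {
        var groups = manager.createGroups(session)     // creating (201) contextual groups
        ui.display(groups)                             // displaying (202) the groups in the ongoing chat
        ui.onUserCommand { command ->                  // performing (203) an action on a user command
            groups = when (command) {
                is UserCommand.Expand ->
                    groups.map { if (it.context == command.context) it.copy(expanded = true) else it }
                is UserCommand.Collapse ->
                    groups.map { if (it.context == command.context) it.copy(expanded = false) else it }
                is UserCommand.ChangeContext ->
                    // receiving (205) a manual context change; a fuller sketch would regroup
                    // subsequent messages under command.newContext before redisplaying.
                    groups
            }
            ui.display(groups)
        }
    }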
Subsequent paragraphs explain the present invention using examples of contextual
chat in healthcare, i.e., a chat between a doctor and a patient through a medical chat
application. Those skilled in the art will appreciate that this is not to be construed as a limitation
on the scope of the present invention; the same is applicable to any chat application, standalone
chat program, or web-based chat service.
Figure 3 illustrates an exemplary scenario (300) for interaction in a contextual
healthcare chat that can be used to retrieve contextual groups in a chat session. Figure 3(a)
illustrates a normal chat session having detailed textual messages, pictures, vitals, symptoms,
etc. The user can provide a user input (301) to group the messages based on contexts, for
example, OPD visit 1 follow up, prescription queries, and so on. In this way, it will be easier
to search for a message by collapsing and/or expanding the long conversation thread, and
then scrolling up (302) and/or scrolling down (303) as shown in Figure 3(b). Figure 3(c)
illustrates that a user can pinch out (304) or tap to zoom in on any of the contextual groups of
chats, i.e., a chat block (305). As illustrated in Figure 3(d), this user action will bring back the
details (306) in the chat session with focus on that particular contextual group (305).
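As an illustrative, framework-agnostic sketch of the interaction just described, the snippet below maps the mentioned gestures to the expand and collapse actions on a contextual group; the Gesture names are hypothetical, and the ContextualGroup type is reused from the earlier sketch.

    // Hypothetical mapping from user gestures to actions on a contextual group.
    enum class Gesture { PINCH_OPEN, PINCH_CLOSED, TAP_EXPAND, TAP_COLLAPSE }

    fun applyGesture(group: ContextualGroup, gesture: Gesture): ContextualGroup = when (gesture) {
        Gesture.PINCH_OPEN, Gesture.TAP_EXPAND -> group.copy(expanded = true)      // zoom in on a chat block
        Gesture.PINCH_CLOSED, Gesture.TAP_COLLAPSE -> group.copy(expanded = false) // collapse back to the overview
    }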
Figure 4 illustrates an exemplary scenario (400) for manual change of context in a
chat session. More specifically, Figure 4(a) illustrates that a user can provide a user input (401)
on an artefact for manually changing the context or starting a new context altogether. As
shown in Figure 4(b), said user input (401) changes the context to 'Reports' (402). A
contextual group may be created based on the manually changed context or the newly
started context.
Figure 5 illustrates an exemplary scenario (500) for user initiated context detection.
As per the invention, the context of the chat changes automatically as the communication
progresses, and this helps in segmenting the chat thread into various contextual groups. In one
example, the user can select an attachment (501) to be sent in the chat session as shown in
Figure 5(a). The attachment may be a report (502) as shown in Figure 5(b). When this report is
sent as an attachment in the chat, there is a change in the context (503) reflected in the user
interface as shown in Figure 5(c). A contextual group may be created based on said context (503).
Figure 6 illustrates an exemplary scenario (600) for auto-detection of context in a
chat window. Continuing with the previous example, the figure shows the change of context
(601) as automatically received at the doctor's end when the patient sends a report.
Consequently, a contextual group is created at the doctor's end based on said automatically
established context (601).
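The specification does not prescribe a particular detection algorithm, so the following self-contained sketch merely assumes a simple keyword and attachment heuristic to show where automatic context establishment, as in Figures 5 and 6, could plug in; the Attachment type and the context labels are hypothetical.

    // Assumed heuristic for automatically establishing a context from a message.
    data class Attachment(val fileName: String, val mimeType: String)

    fun detectContext(text: String, attachment: Attachment? = null): String = when {
        attachment != null && attachment.fileName.contains("report", ignoreCase = true) -> "Reports"
        listOf("prescription", "dosage", "tablet").any { text.contains(it, ignoreCase = true) } -> "Prescription queries"
        listOf("fever", "pain", "symptom").any { text.contains(it, ignoreCase = true) } -> "Symptoms"
        else -> "General"
    }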
Figure 7 illustrates an exemplary scenario (700) for reflection of chat contexts in a
dashboard. In one example, the context of the current chats is reflected on a doctor’s
dashboard. As illustrated, the regular icons (701) go through a flip animation effect to show
the context (702) of the chat along with the appointment type. In one implementation, an alert
may be displayed in a contextual group upon receiving a new message in that contextual
group.
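A minimal sketch of how a new message could both raise an alert in its contextual group and flip the context shown on a dashboard tile is given below; the DashboardTile type is hypothetical, and the ChatMessage and ContextualGroup types come from the earlier sketch.

    // Hypothetical dashboard tile that flips to show the current chat context.
    data class DashboardTile(
        val chatId: Long,
        val appointmentType: String,
        var shownContext: String,
        var hasAlert: Boolean = false,
    )

    fun onNewMessage(group: ContextualGroup, message: ChatMessage, tile: DashboardTile): ContextualGroup {
        tile.shownContext = message.context   // dashboard tile now shows the context of the latest message
        tile.hasAlert = true                  // alert flagged for that contextual group
        return group.copy(messages = group.messages + message)
    }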
Figure 8 illustrates an exemplary scenario (800) for prioritization of emoticons based
on the context. In one example, emoticons/stickers may be prioritized based on a patient's
chief complaints and symptoms. As illustrated, based on the type of the chief complaints of
the patient, the emoticons/stickers are re-arranged and shown in order of preference, such that
those most relevant to the complaint or symptom reported by the patient are shown first.
Similarly, any artefacts or external objects, such as images, videos, audio clips, contacts,
locations, or the like, that can be inserted in the chat session are sorted according to the
established context and displayed accordingly. This facilitates quick selection of the
artefacts/external objects and hence enriches the overall user experience. For example, if two
friends are chatting about a person named XYZ and one friend wants to send the contact
details of XYZ in the chat session, then the contact list is automatically sorted based on the
identified context. That is, all the contacts having XYZ in their name will appear first in the
sorted contact list. In this way, the effort otherwise required to locate XYZ's contact details
in the contact book is largely saved.
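To illustrate this kind of context-based prioritization, the self-contained sketch below moves items whose label matches the established context to the front of a list; the matching rule (a simple case-insensitive name match) is an assumption chosen only to mirror the XYZ contact example.

    // Assumed relevance rule: an item is relevant if its label mentions the context.
    fun <T> prioritizeByContext(items: List<T>, context: String, label: (T) -> String): List<T> {
        val (relevant, rest) = items.partition { label(it).contains(context, ignoreCase = true) }
        return relevant + rest                // relevant items first; original order otherwise preserved
    }

    fun main() {
        val contacts = listOf("Alice", "XYZ Kumar", "Bob", "XYZ Sharma")
        // Prints: [XYZ Kumar, XYZ Sharma, Alice, Bob]
        println(prioritizeByContext(contacts, "XYZ") { it })
    }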
Embodiments of the invention have been described in detail for purposes of clarity
and understanding. However, it will be appreciated that certain changes and modifications
may be practiced within the scope of the appended claims. Thus, although the invention is
described with reference to specific embodiments and figures thereof, the embodiments and
figures are merely illustrative, and not limiting of the invention. Rather, the scope of the
invention is to be determined solely by the appended claims.
We Claim:
1. A computing device (100) for context-aware chat management in a chat application, the
computing device (100) comprising:
a context manager (101) coupled with a control unit (102) and configured to create at
least one contextual group in an ongoing chat, the at least one contextual group comprises
one or more messages having same context; and
a user interface (103) coupled with the control unit (102) and configured to display
the at least one contextual group in the ongoing chat and perform an action with the at least
one contextual group in response to a user command, wherein the action comprises:
expanding the at least one contextual group, or collapsing the at least one contextual group.
2. The computing device (100) as claimed in claim 1, wherein the context manager (101) is
further configured to automatically establish a context for the one or more messages in an
ongoing chat.
3. The computing device (100) as claimed in claim 2, wherein the at least one contextual group
is automatically created in accordance with the automatically established context.
4. The computing device (100) as claimed in claim 1, wherein the user interface (103) is further
configured to receive a user input for manually initiating or changing a context in the ongoing
chat.
5. The computing device (100) as claimed in claim 4, wherein the at least one contextual group
is created in accordance with the manually initiated or changed context.
6. The computing device (100) as claimed in claim 1, wherein the user interface (103) is further
configured to display an icon corresponding to the context along with the one or more
messages.
7. The computing device (100) as claimed in claim 1, wherein the context manager (101) is
configured to sort, based on the context, a plurality of objects which can be inserted in the
ongoing chat, and wherein the user interface (103) is configured to display the sorted objects
in the chat application to facilitate user selection.
8. The computing device (100) as claimed in claim 1, wherein the user interface (103) is
configured to display an alert in the at least one contextual group upon receiving a new
message in that contextual group.
9. A method (200) implemented in a chat application for context-aware chat management, the
method (200) comprising:
creating (201) at least one contextual group in an ongoing chat, the at least one
contextual group comprises one or more messages having same context;
displaying (202) the at least one contextual group in the ongoing chat; and
performing (203) an action with the at least one contextual group in response to a user
command, wherein the action comprises: expanding the at least one contextual group, or
collapsing the at least one contextual group.
10. The method (200) as claimed in claim 9 further comprising:
automatically establishing (204) a context for the one or more messages in an ongoing
chat.
11. The method (200) as claimed in claim 10, wherein the at least one contextual group is
automatically created in accordance with the automatically established context.
12. The method (200) as claimed in claim 9 further comprising:
receiving (205) a user input for manually initiating or changing a context in the
ongoing chat.
13. The method (200) as claimed in claim 12, wherein the at least one contextual group is created
in accordance with the manually initiated or changed context.
14. The method (200) as claimed in claim 9, further comprising:
displaying (206) an icon corresponding to the context along with the one or more
messages.
15. The method (200) as claimed in claim 9, further comprising:
sorting (207), based on the context, a plurality of objects which can be inserted in the
ongoing chat; and
displaying (208) the sorted objects in the chat application to facilitate user selection.
16. The method (200) as claimed in claim 9, further comprising:
displaying (209) an alert in the at least one contextual group upon receiving a new
message in that contextual group.

Documents

Application Documents

# Name Date
1 Power of Attorney [07-01-2016(online)].pdf 2016-01-07
2 Form 5 [07-01-2016(online)].pdf 2016-01-07
3 Form 3 [07-01-2016(online)].pdf 2016-01-07
4 Form 18 [07-01-2016(online)].pdf 2016-01-07
5 Drawing [07-01-2016(online)].pdf 2016-01-07
6 Description(Complete) [07-01-2016(online)].pdf 2016-01-07
7 201611000631-Form-1-(01-02-2016).pdf 2016-02-01
8 201611000631-Correspondence Others-(01-02-2016).pdf 2016-02-01
9 201611000631-Correspondecne Others-(15-03-2016).pdf 2016-03-15
10 201611000631-Form-2-(07-04-2016).pdf 2016-04-07
11 201611000631-Correspondence Others-(07-04-2016).pdf 2016-04-07
12 REQUEST FOR CERTIFIED COPY [04-05-2016(online)].pdf 2016-05-04
13 REQUEST FOR CERTIFIED COPY [04-05-2016(online)].pdf_7.pdf 2016-05-04
14 abstract.jpg 2016-07-10
15 201611000631-8(i)-Substitution-Change Of Applicant - Form 6 [22-11-2019(online)].pdf 2019-11-22
16 201611000631-ASSIGNMENT DOCUMENTS [22-11-2019(online)].pdf 2019-11-22
17 201611000631-PA [22-11-2019(online)].pdf 2019-11-22
18 201611000631-CLAIMS [02-09-2020(online)].pdf 2020-09-02
19 201611000631-FER_SER_REPLY [02-09-2020(online)].pdf 2020-09-02
20 201611000631-OTHERS [02-09-2020(online)].pdf 2020-09-02
21 201611000631-IntimationOfGrant04-10-2021.pdf 2021-10-04
22 201611000631-PatentCertificate04-10-2021.pdf 2021-10-04
23 201611000631-FER.pdf 2021-10-17

Search Strategy

1 searchstrategy_26-02-2020.pdf

ERegister / Renewals

3rd: 03 Jan 2022 (From 07/01/2018 to 07/01/2019)
4th: 03 Jan 2022 (From 07/01/2019 to 07/01/2020)
5th: 03 Jan 2022 (From 07/01/2020 to 07/01/2021)
6th: 03 Jan 2022 (From 07/01/2021 to 07/01/2022)
7th: 03 Jan 2022 (From 07/01/2022 to 07/01/2023)
8th: 07 Jan 2023 (From 07/01/2023 to 07/01/2024)