
Method and System for Touch Sensor Based Differential User Interface (UI) Solutions

Abstract: METHOD OF ENABLING DEVICE FUNCTION IN A TOUCH SCREEN ENABLED DEVICE USING USER INPUTS AND A TOUCH SCREEN ENABLED DEVICE THEREOF. The present invention discloses a touch screen enabled device and a method for enabling one or more device functions in a touch screen enabled device using one or more user inputs. The method comprises detecting a change in self capacitance and mutual capacitance in the device display panel in a pre-defined time period based on one or more user inputs in the touch screen enabled device, identifying whether the user input is a valid gesture based on a pre-defined criteria, and enabling the one or more device functions corresponding to the identified one or more user inputs in the touch screen enabled device. The method further comprises enabling a multi window function in the device display panel of the touch screen enabled device by providing user inputs to the touch screen enabled device. Moreover, the present invention discloses a method of automatically hiding the keypad when a palm, thumb, or finger is not touching, or not hovering above, the keypad. Figure 1


Patent Information

Application #
Filing Date
30 January 2014
Publication Number
36/2016
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
bangalore@knspartners.com
Parent Application
Patent Number
Legal Status
Grant Date
2022-10-12
Renewal Date

Applicants

SAMSUNG R&D INSTITUTE INDIA – BANGALORE PRIVATE LIMITED
# 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India

Inventors

1. THAJUDEEN, Niyas Ahmed Sulthar
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at, # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India
2. DEOTALE, Gunjan Prakash
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at, # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India
3. VAISH, Rahul
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at, # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India
4. BHAMIDIPATI, Sreevatsa Dwaraka
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at, # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India

Specification

DESC:
FORM 2
THE PATENTS ACT, 1970
[39 of 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(Section 10; Rule 13)

METHOD OF ENABLING DEVICE FUNCTION IN A TOUCH SCREEN DEVICE USING USER INPUTS AND A TOUCH SCREEN DEVICE THEREOF

SAMSUNG R&D INSTITUTE INDIA – BANGALORE PRIVATE LIMITED
# 2870, ORION Building, Bagmane Constellation Business Park,
Outer Ring Road, Doddanakundi Circle,
Marathahalli Post, Bangalore-560 037
An Indian Company

The following Specification particularly describes the invention and the manner in which it is to be performed

RELATED APPLICATION

Benefit is claimed to Indian Provisional Application No. 434/CHE/2014 titled “METHOD AND SYSTEM FOR TOUCH SENSOR BASED DIFFERENTIAL USER INTERFACE (UI) SOLUTIONS” filed on 30 January 2014, which is herein incorporated in its entirety by reference for all purposes.

FIELD OF INVENTION

The present invention relates to the field of touch screen and more particularly relates to a method and apparatus for enabling device function(s) in a touch screen device using user inputs.

BACKGROUND OF THE INVENTION

The touch screens of different types of electronic devices detect the co-ordinates of the point of touch and perform a pre-defined action. With a touch screen, the movement of the input pointer on a display can correspond to the relative movements of the user's finger as the finger is moved along the surface of the touch screen. Likewise, hand gestures have been implemented in touch screens; for example, selections can be made when one or more taps are detected on the surface of the touch screen. In some cases, any portion of the touch screen can be tapped, and in other cases a dedicated portion of the touch screen can be tapped. In addition to selections, scrolling can be initiated by using finger motion at the edge of the touch screen.
In recent times, more advanced gestures have been implemented. For example, scrolling can be initiated by placing four fingers on the touch screen so that the scrolling gesture is recognized, and thereafter moving these fingers on the touch screen to perform scrolling events. In certain applications, it can be beneficial to enable a user to use "real-world" gestures such as hand movements and/or finger orientations that are generally recognized to mean certain things, so as to more efficiently and accurately effect intended operations. In all conventional touch inputs, the contact occurs between a user's finger and the touch panel, or directly above the touch panel area in the case of hover-supported touch devices within the supported hovering range, and the touch panel determines whether a touch/hover occurred or not. Further, an input coordinate is detected. The touch panel is also referred to as the device display panel.
However, in certain scenarios there is a need for user inputs that do not touch the touch panel directly; for example, adjusting device-specific settings such as volume or brightness is done by selecting different settings, which may interrupt running applications. Hence, there exists a need for a method and system for interpreting touches on touch-insensitive regions near the touch-sensitive regions, and hovering above touch-sensitive regions.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS

The aforementioned aspects and other features of the present invention will be explained in the following description, taken in conjunction with the accompanying drawings, wherein:

Figure 1 is a flow diagram illustrating a method of enabling one or more device functions in a touch screen device using one or more user inputs, according to one embodiment of the present invention.

Figure 2 is an exemplary embodiment of the present invention, illustrating control of a device function of a touch screen device using one or more user inputs.

Figure 3 is another exemplary embodiment of the present invention, illustrating control of a device function of a touch screen device using one or more user inputs.

Figure 4 is another exemplary embodiment of the present invention, illustrating control of a device function of a touch screen device using one or more user inputs.

Figure 5 is yet another exemplary embodiment of the present invention, illustrating control of a device function of a touch screen device using one or more user inputs.

Figure 6A is a block diagram illustrating a touch screen device, according to one embodiment of the present invention.

Figure 6B is a block diagram illustrating a touch screen device, according to another embodiment of the present invention.

Figure 6C is a block diagram illustrating a touch screen device, according to yet another embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION

The embodiments of the present invention will now be described in detail with reference to the accompanying drawings. However, the present invention is not limited to the embodiments. The present invention can be modified in various forms. Thus, the embodiments of the present invention are only provided to explain more clearly the present invention to the ordinarily skilled in the art of the present invention. In the accompanying drawings, like reference numerals are used to indicate like components.

The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.

As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include operatively connected or coupled. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.

Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

Figure 1 is a flow diagram illustrating a method of enabling one or more device functions in a touch screen device using one or more user inputs, according to one embodiment of the present invention. The present invention comprises a method of analyzing and interpreting mutual capacitance and self-capacitance data from a touch screen. Self-capacitance is the charge of a conductor divided by its potential, whereas mutual capacitance is the capacitance that occurs between two charge-holding objects or conductors, in which the current passing through one passes over into the other.

According to the present invention, the input to the touch screen device is a single touch, a multi touch, hovering above the device display panel, touching or hovering above touch-insensitive areas of the touch enabled device or regions near the touch screen panel, or a combination of the above-listed activities. The touch-insensitive areas of the touch screen device may be the back cover, peripheral portions of the touch screen device, side portions of the touch screen or display panel, and the area near the speaker of the touch screen device. The touch input generates a change in mutual capacitance and self-capacitance. This change in self-capacitance and mutual capacitance is raw data as obtained from the touch screen panel (system-on-chip platform or firmware). This raw data is referred to as touch frame data.

Hereafter, the touch screen panel of the touch screen enabled device is referred to as a screen or a device display panel.

Any touch enabled device comprises a touch screen panel IC, an application processor and a system user interface. The method of enabling one or more device functions in a touch screen enabled device using one or more user inputs is implemented using the aforementioned components.

The touch frame data is detected from a touch controller in the touch screen device in step 101. Further, in step 102, the detected data is processed to identify whether the touch frame data possesses any valid input information regarding touch regions on the touch screen device.

The touch input in the touch-insensitive regions near the boundary of the touch panel is referred to as peripheral touch input / peripheral user input.

Also, hereafter, the user input will be referred to as touch input, user inputs or peripheral user input.

It is to be noted that not all touch user inputs on the insensitive regions, such as regions near the boundary of the touch panel, or hover inputs above the device display panel need be valid inputs. The method described with reference to figure 1 proposes a pre-defined set of criteria for detecting valid user inputs. The touch input is detected based on the pattern as well as a time limit. For instance, a swipe at the edge of the touch screen device must be completed within a particular time limit so that the corresponding gesture is detected. This time limit is pre-defined for each gesture while defining the gestures. For example, the time limit for swiping is different from the time limit for a touch over the speaker. Moreover, each gesture is defined based on the change in self capacitance and mutual capacitance. Hence, in order to detect a valid gesture, the change in capacitance must be above a pre-defined threshold. This pre-defined threshold is also defined corresponding to each gesture.
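
The two-part criterion described above, a per-gesture capacitance-change threshold plus a per-gesture time limit, can be sketched as follows. This is an illustrative sketch only, not the specification's implementation; the gesture names, threshold values and time limits are hypothetical.

```python
# Hypothetical per-gesture criteria: each gesture has its own minimum
# capacitance change and its own maximum duration, as the specification
# notes that the limits differ between, e.g., a swipe and a speaker touch.
GESTURE_CRITERIA = {
    # gesture: (minimum capacitance change, maximum duration in seconds)
    "edge_swipe":  (0.8, 0.5),
    "speaker_tap": (1.2, 0.2),
}

def is_valid_gesture(gesture, capacitance_change, duration):
    """Return True only if the input meets the pre-defined criteria."""
    if gesture not in GESTURE_CRITERIA:
        return False
    threshold, time_limit = GESTURE_CRITERIA[gesture]
    return capacitance_change >= threshold and duration <= time_limit

# A fast edge swipe with a large enough change is valid; the same swipe
# performed too slowly fails the time-limit criterion.
assert is_valid_gesture("edge_swipe", 1.0, 0.3)
assert not is_valid_gesture("edge_swipe", 1.0, 0.9)
```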

If the change in self-capacitance and mutual capacitance at a particular place on the touch panel is above the pre-defined threshold of capacitance change and occurs within the pre-defined time period, the input is processed further. At step 103, the touch regions are identified based on the processed touch frame data. Identification of touch regions from changes in self-capacitance and mutual capacitance is performed in different ways. For example, if the received touch frame data is self-capacitance data, then all the peaks that are above the pre-defined threshold of capacitance change are scanned and the touch regions are identified. Likewise, if the received touch frame data is mutual capacitance data, then all the mutual capacitance values are scanned and the regions that are above the pre-defined threshold of capacitance change are identified as touch regions.
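
The two scanning methods of step 103 can be sketched as below. The data layout is an assumption for illustration: self-capacitance is taken as a 1-D profile per sensing axis scanned for peaks above the threshold, while mutual capacitance is taken as a 2-D grid in which every cell above the threshold is marked as a touch region.

```python
THRESHOLD = 50  # hypothetical pre-defined threshold of capacitance change

def regions_from_self_cap(profile, threshold=THRESHOLD):
    """Indices of local peaks above threshold in a 1-D self-capacitance profile."""
    return [i for i in range(1, len(profile) - 1)
            if profile[i] > threshold
            and profile[i] >= profile[i - 1]
            and profile[i] >= profile[i + 1]]

def regions_from_mutual_cap(grid, threshold=THRESHOLD):
    """(row, col) cells of a 2-D mutual-capacitance grid above threshold."""
    return [(r, c) for r, row in enumerate(grid)
            for c, value in enumerate(row) if value > threshold]

profile = [10, 20, 80, 30, 10, 60, 90, 40]
grid = [[10, 10, 70],
        [10, 90, 10]]
print(regions_from_self_cap(profile))   # peaks at indices 2 and 6
print(regions_from_mutual_cap(grid))    # cells (0, 2) and (1, 1)
```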

At step 104, it is determined whether the touch region is present near the boundary of the touch panel. If the touch region is present near the boundary of the touch panel, then it is identified whether the peripheral user input is valid or not based on one or more pre-defined criteria, as shown in step 105. If the peripheral user input is valid, then at step 107, one or more device functions corresponding to the peripheral user input are enabled in the touch enabled device. The pre-defined criteria for identifying a valid peripheral user input / hover input comprise self-capacitance, mutual capacitance and the change in self capacitance and mutual capacitance. The pre-defined criteria are set based on the time and the change in capacitance in a pre-defined time period. For instance, the pre-defined time period for a touch over the speaker of the touch enabled device and the corresponding change in capacitance are different from those for a gesture such as a swipe over the edge of the touch screen device.
Some of the user inputs and corresponding device functions are: tapping on the speaker to play or pause music; scrolling at the edges of the device to adjust brightness or volume; auto-hiding the keypad when a palm, finger or thumb is not touching or hovering above the keypad, and displaying the keypad again when a palm, finger or thumb hovers in the region of the edit text. Likewise, device functions such as multi window are enabled or disabled by slicing the device display panel with the hand, horizontally or vertically.
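
The gesture-to-function pairings listed above amount to a lookup from a recognized gesture to the device function to enable. A minimal sketch, with hypothetical gesture and function names standing in for whatever identifiers an implementation would use:

```python
# Hypothetical mapping from recognized gestures to device functions,
# following the examples given in the specification.
DEVICE_FUNCTIONS = {
    "speaker_tap":         "toggle_music_playback",
    "edge_scroll_up":      "increase_brightness_or_volume",
    "edge_scroll_down":    "decrease_brightness_or_volume",
    "horizontal_slice":    "enable_multi_window",
    "vertical_slice":      "disable_multi_window",
    "hover_leave_keypad":  "hide_keypad",
    "hover_over_edittext": "show_keypad",
}

def enable_device_function(gesture):
    """Return the device function to enable for a recognized gesture."""
    return DEVICE_FUNCTIONS.get(gesture, "no_action")

assert enable_device_function("speaker_tap") == "toggle_music_playback"
assert enable_device_function("unrecognized") == "no_action"
```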

Likewise, a touch on a specific portion of the side touch sensors in combination with a user input can be assigned for launching the Camera. This device function can also be enabled while the phone is in a locked state. User actions through touch, such as single or multiple touches or hovering, can activate a specific application.

According to one embodiment of the present invention, the touch sensors can be placed in any location spread across the device edges of the front surface, back surface, curved portions or any area of the device which can be touched to generate valid sensor data.

Figure 2 is an exemplary embodiment of the present invention, illustrating control of a device function of a touch screen enabled device using one or more user inputs. In the example shown in the figure, assume that music is playing in the background. According to the method explained in the present invention, if the user taps on the speaker above the touch screen panel, the music pauses. Likewise, if the user taps on the speaker above the touch screen panel again, the music resumes.

In the above case, the mutual capacitance and self-capacitance are obtained from the touch screen panel when the user taps on the speaker; the changes in self-capacitance and mutual capacitance are interpreted by the Peripheral User Input Identification Module and the Gesture Detection Module, and then an interrupt is sent to the application processor for enabling or disabling the music player.

Figure 3 is an exemplary embodiment of the present invention, illustrating control of a device function of a touch screen enabled device using one or more user inputs. As indicated in the figure, when the user scrolls at the edges of the device, a dialogue is shown to the user corresponding to the device function. For instance, consider that the device function is one of volume control and brightness control. In the dialogue box, the current brightness level is indicated by the position of the seek bar. As the user scrolls down, the brightness is decreased, and as the user scrolls up, the brightness is increased. The same procedure is performed in the case of volume control.

Here, the changes in mutual capacitance and self-capacitance are monitored. The mutual capacitance and self-capacitance data are obtained from the touch screen panel. Further, the device function is enabled based on the configured user inputs and the corresponding change in capacitance. The region of scrolling can be the edge of the device, i.e. a touch-insensitive region near the touch screen panel.

Figure 4 is another exemplary embodiment of the present invention, illustrating control of a device function of a touch screen enabled device using one or more user inputs. This scenario is associated with touch enabled devices having a multi-window feature. The multi window feature is enabled or disabled in a conventional touch enabled device using settings. In the present invention, by contrast, the multi-window feature is enabled when the user slices the device display panel with his hand horizontally, and the multi window setting is disabled when the user slices the device display panel with his hand vertically. However, the user can configure the horizontal and vertical slicing based on his preferences.

In this embodiment, the changes in mutual capacitance data are monitored when the user slices the device display panel with his hand. Based on the changes in the monitored mutual capacitance, the slicing pattern is determined. Further, the corresponding function is enabled.
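
Determining the slicing pattern from the monitored changes can be sketched as below. The representation is an assumption for illustration: the slice is approximated by the start and end cells of the touch trajectory on the mutual-capacitance grid, and the dominant axis of motion gives the direction.

```python
def slicing_direction(start, end):
    """Classify a slice from hypothetical (row, col) start/end grid cells."""
    d_row = abs(end[0] - start[0])
    d_col = abs(end[1] - start[1])
    return "horizontal" if d_col > d_row else "vertical"

def multi_window_action(start, end):
    # Per this embodiment: a horizontal slice enables multi-window and a
    # vertical slice disables it (the specification notes the user can
    # reconfigure this pairing).
    if slicing_direction(start, end) == "horizontal":
        return "enable_multi_window"
    return "disable_multi_window"

# A sweep across columns on the same row is horizontal; a sweep down the
# same column is vertical.
assert multi_window_action((5, 0), (5, 15)) == "enable_multi_window"
assert multi_window_action((0, 7), (12, 7)) == "disable_multi_window"
```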

Figure 5 is another exemplary embodiment of the present invention, illustrating control of a device function of a touch screen enabled device using one or more user inputs. As illustrated in the figure, when the palm, thumb or finger is not touching or hovering above the keypad, the keypad is hidden. When the user brings his thumb or finger near the text space, the keypad is displayed again. The changes in the self-capacitance data obtained from the touch screen panel while the user is hovering above it determine whether the keypad should be hidden or displayed.

The method according to the present invention comprises identifying whether an application launched by the user involves a typing space, detecting hovering above the typing space by identifying the change in the self-capacitance, and popping up a keypad based on the identified change in the self-capacitance.
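
The keypad show/hide decision described above can be sketched as a small state function. The hover threshold and the region flags are hypothetical stand-ins for whatever the firmware would actually report.

```python
HOVER_THRESHOLD = 30  # hypothetical minimum self-capacitance change for a hover

def keypad_visible(self_cap_change, over_edit_text, over_keypad):
    """Decide keypad visibility from the hover state and region flags."""
    hovering = self_cap_change >= HOVER_THRESHOLD
    if hovering and over_edit_text:
        return True   # finger/thumb near the typing space: show the keypad
    if not hovering or not over_keypad:
        return False  # nothing touching or hovering above the keypad: auto-hide
    return True       # still touching/hovering over the keypad: keep it shown

# Hover over the edit text brings the keypad back; no hover anywhere hides it;
# a hover that remains over the keypad keeps it visible.
assert keypad_visible(45, over_edit_text=True, over_keypad=False)
assert not keypad_visible(5, over_edit_text=False, over_keypad=False)
assert keypad_visible(45, over_edit_text=False, over_keypad=True)
```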

Likewise, a touch on a specific portion of the side touch sensors in combination with a user input may be assigned for controlling device functions such as launching the Camera from any active application. The Camera application is launched from any native application, such as Browser or Messaging, as soon as the user holds the device on the specific portion of the side sensors in combination with a peripheral user input / user input. The grip is detected by monitoring the changes in the self-capacitance data obtained from the side touch sensors / peripheral user input on the device. Further, the user's thumbs/fingers hovering above or touching the device display panel along the edges of the device are detected from the change in self-capacitance obtained from the touch screen panel.

Figure 6A is a block diagram illustrating a touch screen enabled device, according to one embodiment of the present invention. The touch screen enabled device according to one embodiment of the present invention includes a touch screen panel IC (TSP IC) 601 and an application processor 610. The TSP IC 601 can be a microcontroller or a micro-processor. According to one embodiment of the present invention, the TSP IC 601 comprises a touch panel 602, a touch controller 603, a TSP firmware 604, a CPU 607, a RAM 608 and a register 609. The TSP firmware 604 includes a peripheral user input identification module 605 and a gesture detection module 606.

The peripheral user input identification module 605 identifies the peripheral user input based on the change in self-capacitance and mutual capacitance when the user touches the insensitive region or area near the sensitive region. The touch frame data is considered as the input of the touch screen enabled device 600. The gesture detection module 606 identifies the valid pre-defined gestures based on the user input.

The TSP IC sends an interrupt to the application processor. The application processor 610 comprises a system and platform 611, a CPU 613, a RAM 614 and registers 615. The system and platform comprises a device function enabling module 612. The device function enabling module 612 in the system user interface enables the device function.

Figure 6B is a block diagram illustrating a touch screen enabled device, according to another embodiment of the present invention. In the present embodiment, the TSP firmware 604 includes only the peripheral user input identification module 605, whereas the system and platform 611 of the application processor includes the gesture detection module 606 and the device function enabling module 612. This embodiment of the present invention enables the detection of the gesture and the enablement of the corresponding device function from the application processor 610.

Figure 6C is a block diagram illustrating a touch screen enabled device, according to yet another embodiment of the present invention. In the present embodiment, the system and platform 611 of the application processor 610 comprises the peripheral user input identification module 605, the gesture detection module 606 and the device function enabling module 612. The touch input or the touch frame data from the touch panel 602 is sent to the application processor 610 using the touch controller 603. The identification of the input, the detection of the gesture and the enablement of the device function are performed by the peripheral user input identification module 605, the gesture detection module 606 and the device function enabling module 612 respectively.

Although the invention of the method and system has been described in connection with the embodiments of the present invention illustrated in the accompanying drawings, it is not limited thereto. It will be apparent to those skilled in the art that various substitutions, modifications and changes may be made thereto without departing from the scope and spirit of the invention.

CLAIMS:
We claim:

1. A method of enabling one or more device functions in a touch screen device using one or more user inputs, comprising:
detecting a change in self capacitance and mutual capacitance in a display panel of the touch screen device in a pre-defined time period based on the one or more user inputs provided to the touch screen device;
determining that the one or more user inputs comprise a valid gesture based on a pre-defined criteria; and
enabling one or more device functions corresponding to the determined gesture in the touch screen device.

2. The method as claimed in claim 1, wherein the validity of a gesture is determined by comparing the change in self capacitance and mutual capacitance with a pre-defined threshold value in the pre-defined time period.

3. The method as claimed in claim 1, wherein the one or more user inputs comprise one or more of:
a vertical scroll at one or more portions of the touch screen device that are touch insensitive;
a horizontal scroll at one or more portions of the touch screen device that are touch insensitive; and
a tap at one or more portions of the touch screen device that are touch insensitive.

4. The method as claimed in claim 3, wherein the one or more user inputs are provided by one of a touch and a hover.

5. The method as claimed in claim 1, further comprising:
enabling a multi window function in the device display panel of the touch screen device by providing the one or more user inputs to the touch screen device.

6. The method as claimed in claim 5, wherein enabling the multi window function in the device display panel comprises:
dividing the device display panel into at least two portions in response to the one or more touch inputs provided along at least one of a vertical direction and horizontal direction on the device display panel.

7. The method as claimed in claim 6, wherein the touch inputs provided along at least one of a vertical direction and horizontal direction are detected based on the change in mutual capacitance in one of a horizontal and vertical direction on the screen.
8. The method as claimed in claim 1, further comprising:
identifying whether an application launched by user involves a typing area;
detecting hovering above the typing area by identifying the change in the self-capacitance; and
popping a keypad based on the identified change in the self-capacitance.

9. A touch screen enabled device for enabling one or more device functions using one or more user inputs, comprising:
a touch sensor panel that transmits the analog capacitance value to the touch controller;
a touch controller that converts the transmitted analog signal into a digital signal and transmits the digital signal to the TSP Firmware;
a TSP Firmware that determines the touch co-ordinates and sends the touch frame data to the other sub-modules;
a peripheral user input identification module for identifying the peripheral touch region;
a gesture detection module adapted for identifying whether the user input being a valid gesture based on a pre-defined criteria; and
a device function enabling module adapted for enabling the one or more device functions corresponding to the identified one or more user inputs in the touch screen enabled device.

10. The touch screen enabled device as claimed in claim 9, wherein the peripheral user input identification module identifies the user inputs by comparing the change in self capacitance and mutual capacitance with a pre-defined threshold value in the pre-defined time period.

11. The touch screen enabled device as claimed in claim 9, wherein the peripheral user input identification module is configured to identify one or more user inputs comprising:
a vertical scroll at one or more portions of the touch screen device that are touch insensitive;
a horizontal scroll at one or more portions of the touch screen device that are touch insensitive; and
a tap at one or more portions of the touch screen device that are touch insensitive.

12. The touch screen enabled device as claimed in claim 11, wherein the one or more user inputs are one of a finger touch, finger hover, stylus touch and stylus hover.

13. The touch screen enabled device as claimed in claim 9, wherein the device function enabling module is further configured for enabling a multi window function in the device display panel of the touch screen enabled device by providing user inputs to the touch screen enabled device,
wherein the user input for enabling the multi window function comprises:
touch based slicing in one of a horizontal and vertical direction on the device display panel of the touch screen enabled device.

14. The touch screen enabled device as claimed in claim 9, wherein inputs provided along at least one of a vertical direction and horizontal direction are detected based on the change in mutual capacitance in one of a horizontal and vertical direction on the screen.

15. The touch screen enabled device as claimed in claim 9, further configured for:
determining whether an application launched by the user involves a typing area;
detecting hovering above the typing area by identifying the change in the self-capacitance; and
popping a keypad based on the identified change in the self-capacitance.

Dated this the 16th day of January 2015


Signature

KEERTHI J S
Patent Agent
Agent for the Applicant

Documents

Application Documents

# Name Date
1 434-CHE-2014-IntimationOfGrant12-10-2022.pdf 2022-10-12
2 POA_Samsung R&D Institute India-Bangalore.pdf 2014-01-31
3 434-CHE-2014-PatentCertificate12-10-2022.pdf 2022-10-12
4 2013_SSG_1477_Provisional Specification_for filing.pdf 2014-01-31
5 434-CHE-2014-FER_SER_REPLY [18-06-2020(online)].pdf 2020-06-18
6 2013_SSG_1477_Drawings_FOR FILING.pdf 2014-01-31
7 434-CHE-2014-FER.pdf 2019-12-18
8 374-CHENP-2012 CORRESPONDENCE OTHERS 21-07-2014.pdf 2014-07-21
9 434-CHE-2014-FORM 13 [05-08-2019(online)].pdf 2019-08-05
10 374-CHENP-2012 POWER OF ATTORNEY 21-07-2014.pdf 2014-07-21
11 434-CHE-2014-FORM-26 [03-08-2019(online)].pdf 2019-08-03
12 374-CHENP-2012 FORM-1 21-07-2014.pdf 2014-07-21
13 Form-2(Online).pdf 2016-11-24
14 2013_SSG_1477_drawing for filing _16 Jan 2015.pdf 2015-03-12
15 2013_SSG_1477_CS for filing _16 Jan 2015.pdf 2015-03-12

Search Strategy

1 Searchstrategy_18-09-2019.pdf

ERegister / Renewals

3rd: 29 Dec 2022

From 30/01/2016 - To 30/01/2017

4th: 29 Dec 2022

From 30/01/2017 - To 30/01/2018

5th: 29 Dec 2022

From 30/01/2018 - To 30/01/2019

6th: 29 Dec 2022

From 30/01/2019 - To 30/01/2020

7th: 29 Dec 2022

From 30/01/2020 - To 30/01/2021

8th: 29 Dec 2022

From 30/01/2021 - To 30/01/2022

9th: 29 Dec 2022

From 30/01/2022 - To 30/01/2023

10th: 29 Dec 2022

From 30/01/2023 - To 30/01/2024