
A Virtual Reality Headset

Abstract: The invention is a mobile visual function measurement system with a head mounted device means, comprising a head portion means and a remote portion means, along with control means and communication means. The arrangement is capable of being used for multiple visual functions, and specifically in the area of perimetry.


Patent Information

Application #: 201741015479
Filing Date: 02 May 2017
Publication Number: 45/2018
Publication Type: INA
Invention Field: PHYSICS
Status:
Email: brinda@iprightsindia.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-12-19
Renewal Date:

Applicants

DR. AGARWAL'S HEALTHCARE LIMITED
19, CATHEDRAL ROAD, CHENNAI, TAMIL NADU, INDIA

Inventors

1. VAMSI CHINTALAPATI
2-1-255/1, OLD NALLAKUNTA, STREET 14, HYDERABAD, TELANGANA, INDIA
2. RAVITEJA CHIVUKULA
PLOT 37, SRI VENKATESWARA ENCLAVE, UPPARPALLI, HYDERABAD, TELANGANA, INDIA

Specification

FIELD OF INVENTION:

The invention relates to the field of virtual reality headsets used to measure and characterize visual functions.

PRIOR ART:

The idea of using an LCD screen for visual function measurement has been actively pursued over the past two decades. Specifically, campimetry and the Octopus 600 visual field analyzer use LCD screens to measure the visual field.

A headset in which a phone can be used to perform glaucoma screening is known in the prior art.

IMO is another prior art product which is a head-mounted perimeter.

The problems with the prior art are:
• Existing field analyzers are bulky.
• Most of the head mounted perimeters stated above test a very narrow field at any given time. This requires the patient to look at different parts of the screen.
• Even for the head mounted perimeter, the weight of the product is around 1.8 kg.
• It is not possible to test other visual functions on the same device.

OBJECT OF THE INVENTION:

Ophthalmologists routinely measure visual function to diagnose and grade a patient's health. This is done by measuring multiple parameters such as visual acuity, contrast sensitivity, glare adaptation, depth sensitivity and health of the visual field. Specifically, visual field measurements are used to diagnose diseases such as glaucoma. Devices like the computerized field analyzer (used for visual field measurement) are expensive and bulky. The invention aims to solve this problem.

DESCRIPTION OF THE INVENTION:

Figure 1 : Mobile Visual Function Measurement System
Figure 2 : Data Flow
Figure 3 : System Set-Up-Variant 1
Figure 4 : System Set-Up-Variant 2
Figure 5 : Server Connectivity
Figure 6 : Headset (Top View)
Figure 7 : Headset (side view)
Figure 8 : Optic System
Figure 9 : Binocular layout

PART NUMBER PART NAME
1 DISPLAY AND COMPUTER
2 BODY
3 LENSES
4 LENS HOLDER
5 VIEW PORT
6 EYE POSITION DETECTORS

System Overview
A mobile visual function measurement system (Figure 1) is proposed. It consists of a test control unit means (1.1), a head mounted device means (1.2), a patient response button means (1.3) and a server means (backend).

• The test control unit means is used by the test administrator to set up and start the test. It is also used by the administrator to monitor and control the test procedure live and, finally, to generate a test report based on patient responses.
• The headset means is worn by the patient taking a test. The test is presented to the patient using a display on the headset means.
• The patient response button means is used by the patient to record his or her response. The response can be a click of a button means, a 2d/3d motion in the air or a voice response.
• The server means is used to save all test related information. It is to be noted that the server means is connected to all the visual function measurement systems. (A minimal illustrative sketch of these component roles follows this list.)
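To make the roles of these four component means concrete, a minimal sketch is given below. All class, field and value names (TestControlUnit, Headset, ResponseEvent, Server, the example test type) are illustrative assumptions, not details taken from the specification.

```python
# Illustrative sketch of the component "means" of Figure 1 and their roles.
# Names and data shapes are assumptions made only for illustration.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ResponseEvent:
    """One patient response: a button click, a 2d/3d motion, or a voice response."""
    kind: str          # "click" | "motion" | "voice"
    timestamp_ms: int


@dataclass
class Headset:
    """Head mounted device means (1.2): presents the test and records responses."""
    responses: List[ResponseEvent] = field(default_factory=list)

    def present_stimulus(self, stimulus: Dict) -> None:
        print(f"presenting stimulus: {stimulus}")

    def record_response(self, event: ResponseEvent) -> None:
        self.responses.append(event)


@dataclass
class TestControlUnit:
    """Test control unit means (1.1): sets up, triggers and monitors the test."""
    def setup_test(self, test_type: str, patient: Dict) -> Dict:
        return {"test_type": test_type, "patient": patient}


@dataclass
class Server:
    """Server means (backend): saves all test related information."""
    saved_tests: List[Dict] = field(default_factory=list)

    def save(self, test_record: Dict) -> None:
        self.saved_tests.append(test_record)


# Minimal wiring example (illustrative values only)
setup = TestControlUnit().setup_test("visual field screening", {"age": 55, "sex": "F"})
headset = Headset()
headset.present_stimulus({"size": "III", "intensity_dB": 25})
headset.record_response(ResponseEvent(kind="click", timestamp_ms=1234))
Server().save({"setup": setup, "responses": [r.kind for r in headset.responses]})
```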

Figure 2 shows the broad flow-chart of the workflow followed by the system, as per one of the embodiments of the invention.

All the components of the system are designed to communicate with each other through wireless data connections and wired connections (where applicable). Two possible embodiments and their functions are explained below.

• System set-up variant 1 (Figure 3)

The test control unit means has a two-way communication with the server means, which in turn has a connection to the headset device means. The headset means thus communicates with the test control device means indirectly, through the server means. The patient response button means is paired with the headset device means and sends its data to the latter for recording. Also, the server means is designed such that it can be accessed over the world-wide web.

• System set-up variant 2 (Figure 4).

In this embodiment, the test control unit means is connected to the headset means through a direct two-way connection, and data is transmitted directly between the two. The patient response button means still communicates directly with the headset. The test control unit means has a connection to the server means to save test information. Again, the server means is connected to the internet and can be accessed using the world-wide web.
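The two set-up variants differ only in how headset data reaches the test control unit: indirectly through the server means (variant 1), or over a direct link with the server used for storage (variant 2). The sketch below is only an illustration of that routing choice; the helper function and hop names are assumptions.

```python
# Hedged sketch of message routing under the two set-up variants.
# The hop names and this helper are assumptions for illustration only.

def headset_route(variant: int) -> list:
    """Return the hop sequence that headset-originated data follows."""
    if variant == 1:
        # Variant 1 (Figure 3): headset -> server -> test control unit;
        # the server is also reachable over the world-wide web.
        return ["headset", "server", "test control unit"]
    if variant == 2:
        # Variant 2 (Figure 4): headset <-> test control unit directly;
        # the test control unit saves test information to the server.
        return ["headset", "test control unit", "server (storage)"]
    raise ValueError("unknown variant")


print(headset_route(1))
print(headset_route(2))
```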

One of the aims of the invention is that the set-up can be used for multiple visual functions. The initial work is predominantly in the area of perimetry, so the description focuses mainly on that. However, the scope of the patent covers a head mounted set-up for other visual function tests as well, such as visual acuity testing, refraction correction, contrast sensitivity, glare sensitivity, ocular motility, stereopsis, and light and dark adaptation.

Individual Components

SERVER MEANS

The server means is a common connected back-end which is linked to all the mobile visual function measurement devices. The server means collects all test and report information. Based on results obtained from the different devices, statistical information about the population is generated and transferred back to the devices. This information includes:
• Global and Local normative data bases for visual fields
• Updates to the statistical information based on collected information

Figure 5 shows the data flow between the device means and the server means.
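As a rough illustration of this device-to-server exchange (uploading completed tests, pulling down updated normative data), consider the sketch below. The function names, payload fields and version scheme are assumptions and not details taken from the specification.

```python
# Hedged sketch of the device <-> server data flow of Figure 5.
# Payload shapes, field names and the version scheme are illustrative assumptions.
from typing import Optional


def upload_test_result(server_db: dict, result: dict) -> None:
    """Device -> server: store a completed test record."""
    server_db.setdefault("tests", []).append(result)


def download_normative_update(server_db: dict, local_version: int) -> Optional[dict]:
    """Server -> device: return newer global/local normative data, if any."""
    update = server_db.get("normative", {"version": 0, "data": {}})
    return update if update["version"] > local_version else None


server_db = {"normative": {"version": 3, "data": {"example_age_band": "placeholder"}}}
upload_test_result(server_db, {"patient": "anonymised-id", "test": "visual field"})
print(download_normative_update(server_db, local_version=2))
```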

TEST CONTROL UNIT MEANS

The test control means can be any device that is capable of wireless data connections. The software on the test control unit has the following functions (a schematic of this lifecycle follows the list below):
• Test Set-Up: Here the test type (screening/measurement/type of measurement) and patient details such as age and sex are entered.
• Trigger test: Patient information and the related base statistical information, based on the set-up preferences, is transmitted to the headset. Once this information is received by the headset, the test administrator can trigger the test using the test control unit itself.
• During test: The test control unit displays live information on how the test is progressing. This includes information such as percentage completion and results of locations where the test is complete, along with patient behavior information that is obtained through the patient monitoring sub-system on the headset (explained in the next sections).
• Post Test: Once the test is completed, the test control unit uses the patient responses, together with local and global statistical performance parameters for people similar to the patient, to generate a test report. The compiled patient responses are also saved.
• Data Update: The server occasionally updates the local and global statistical patient information on the test control unit.
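The five functions above amount to a simple test lifecycle on the test control unit. The sketch below is only a schematic rendering of that lifecycle; the phase names are assumptions.

```python
# Schematic test lifecycle on the test control unit; phase names are illustrative.
from enum import Enum, auto


class TestPhase(Enum):
    SETUP = auto()      # test type and patient details entered
    TRIGGERED = auto()  # patient info + baseline statistics sent to the headset
    RUNNING = auto()    # live progress and patient behaviour displayed
    REPORTING = auto()  # responses + normative data compiled into a report
    UPDATED = auto()    # statistical data refreshed from the server


def next_phase(phase: TestPhase) -> TestPhase:
    """Advance to the next phase, staying at the final phase once reached."""
    order = list(TestPhase)
    return order[min(order.index(phase) + 1, len(order) - 1)]


phase = TestPhase.SETUP
print(phase.name)
while phase is not TestPhase.UPDATED:
    phase = next_phase(phase)
    print(phase.name)
```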

HEADSET AND PATIENT RESPONSE BUTTON MEANS

The headset, the top view of which is shown in Figure 6, consists of an electronic display and mobile computing unit, a body that holds all components, aspheric condensing lenses and a lens holding fixture for adding additional correction lenses as per patient requirement. It is to be noted that the display and control unit also has wireless communication capabilities and an integrated battery.

In addition to this, the headset also incorporates an eye-position detection system which consists of an assembly of two hot mirrors and full field cameras, one for each eye. This can be seen in Figure 7.

The mobile onboard computer is the main controller of the functions performed by the headset. It is used to communicate with the test control unit and patient response button, control headset-display, record patient response, track head and eye positions of the patient and start/pause/resume/end visual function tests.

The distance between the display and the lenses is designed such that a magnified virtual image is created at a much greater distance (~10 times) when seen from the view port. The body is designed to separate the viewport of each eye; that is, different images can be presented to each of the eyes. The virtual image magnification and image distance are set such that the resulting image covers a field of 60 degrees when the user is focusing at the center of the said image.
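As a rough worked example of this geometry, assuming an ideal thin lens (which the actual aspheric condensing lenses only approximate) and the eye placed at the lens plane, putting the display at about 0.9 of the focal length yields a virtual image roughly ten times farther away. The focal length and display half-width below are assumed illustrative values.

```python
# Worked thin-lens sketch (an idealisation; the real aspheric lenses and eye
# relief will shift these numbers). Assumption: eye at the lens plane.
import math

f_mm = 40.0               # assumed lens focal length (illustrative value)
d_object = 0.9 * f_mm     # display placed slightly inside the focal length

# 1/f = 1/d_o + 1/d_i  ->  virtual image distance (negative = same side as object)
d_image = 1.0 / (1.0 / f_mm - 1.0 / d_object)
magnification = -d_image / d_object

print(f"display at {d_object:.1f} mm -> virtual image at {abs(d_image):.0f} mm "
      f"(~{abs(d_image) / d_object:.0f}x farther), magnification ~{magnification:.0f}x")

# Angular field when fixating the image centre, for an assumed half-width of the
# per-eye display region (chosen here so that the full field comes out near 60 deg):
half_width_mm = 0.577 * d_object
half_angle = math.degrees(math.atan((magnification * half_width_mm) / abs(d_image)))
print(f"full field ~{2 * half_angle:.0f} degrees")
```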

Lens holders are provided at each lens so as to add lenses that account for the refractive power of the user. This ensures the patient sees the virtual image formed by the headset clearly. The lens holders are a two-part assembly. They are:

• Support sub-assembly: this is fixed to the headset and forms the part which has features to hold specific lenses. This sub-assembly consists of three magnets which are used to support the positioning of the lens sleeve sub-assembly
• Lens sleeve sub-assembly: The lens sleeve is a removable component. It is held onto the headset using magnets. The lens sleeve can hold up to two lenses and can lock their relative rotational positions.

When required, the lens sleeve is removed from the headset and the required lenses are placed in it and put back on the headset. This way, the refraction correction required for users can be incorporated as per requirements of the user.

The eye position detection system is used to determine the area in space the user is looking at while the test is being conducted. In order to get accurate visual function measurements, patients need to focus on a specific area of the visible field; that is, gaze at a desired area. Eye position trackers are used to determine the patient's gaze while the test is being conducted. The eye trackers consist of:
• Full spectrum CMOS cameras: These cameras can record images in the visible spectrum as well as in the infra-red spectrum
• Infra-red LEDs and nylon diffusers: When the headset is worn, the area around the eye is too dark for a camera to capture any useful image. Normal flash cannot be used as it will distract a patient from performing the visual field test. Hence, infra-red LEDs are used coupled with nylon diffusers to provide uniform illumination of the eye region (the exact design of the diffuser could be patentable).
• Hot mirrors: used to allow the cameras to view the patient’s eyes while not obstructing the latter’s view of the electronic display

The eye tracking subsystem is powered and controlled by the display and computer of the headset. All the images recorded by the cameras are transmitted to the on-board computer on the headset. The images are read and processed to determine the size of the pupil and the gaze direction (a minimal sketch of this processing follows the list below). This information is then:
• Transmitted to the test control unit for the test administrator to see and, if required, intervene
• Used to automatically pause the visual function test if the gaze direction is not at the desired position
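A minimal sketch of the kind of pupil-centre and pupil-size estimation described above is shown here, assuming a simple dark-pupil threshold on the IR camera image. The threshold value and the calibration step are assumptions; the invention's actual processing is not disclosed at this level of detail.

```python
# Hedged sketch: estimate pupil centre and size from an IR eye image by
# dark-pupil thresholding. numpy only; the threshold value is an assumption.
import numpy as np


def pupil_centre_and_radius(ir_image: np.ndarray, threshold: int = 40):
    """Return ((cx, cy), approx_radius_px) of the darkest blob, or None."""
    mask = ir_image < threshold              # pupil appears dark under IR illumination
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    cx, cy = xs.mean(), ys.mean()            # centroid of dark pixels
    radius = np.sqrt(xs.size / np.pi)        # radius of an equivalent-area circle
    return (cx, cy), radius


# Gaze deviation can then be approximated as the offset of the pupil centre
# from a per-patient calibrated reference position (calibration not shown).
frame = np.full((120, 160), 200, dtype=np.uint8)
frame[50:70, 70:90] = 10                     # synthetic dark "pupil" blob
print(pupil_centre_and_radius(frame))
```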

The patient response button is used by the patient when conducting the test. It is paired wirelessly with the headset. The on-board computer on the headset records different patient responses that can include
• Click of a button
• Motion of the response button (in 3d space)

This information is used to determine the patient's response while conducting the test.

The invention is described and disclosed as an improved head mounted system which takes into account what the patient is looking at and what the patient is supposed to be looking at, by providing a binocular wearable device with a display device (1), stereoscopic optic system (2), pupil illumination setup (3), image capturing devices (4), and a communication and control unit.

The device consists of:

1. Head-Mounted Virtual Reality Display with Eye Trackers
The headset consists of the optic and display subsystem, the eye tracking subsystem, and the computing and communication subsystem.

The optic and display subsystem is a stereoscope that uses an electronic display capable of a high dynamic range of brightness. The optic system separates the images presented to each eye. This makes it possible to administer the visual field test binocularly, and also to conduct a monocular test without the need for occlusion. The monocular field of view achieved at each eye is 95 degrees diagonally, sufficient to conduct a standard 24-2 white on white perimetry test with a single central fixation target after correcting for lens distortion and field curvature. The background illumination of the display is controlled by changing the brightness and luminance of the screen; it is maintained at 9.6 cd/m^2. The size and brightness of the stimuli are changed to measure the dynamic light sensitivity of the patients. The high resolution of the screen allows stimulus sizes ranging from Goldmann size II to V to be achieved (a sketch relating stimulus size and the response window to the display appears after this numbered list).

The eye tracking sub-system consists of an array of IR LEDs used to illuminate the pupil of the patient taking the test, hot mirrors to direct IR light onto and from the patient’s eyes, and two full field CMOS cameras, one for each eye, which capture images at 30 frames per second. These images are used to determine the gaze direction of the patient. This information is used to account for fixation losses in white-on-white perimetry.

The computing and communication subsystem does the backend calculations to determine the order, timing, intensity, size and position of the stimuli presented. It also maintains a wireless connection to a backend server where all test related information is stored, and receives patient response after the presentation of the stimulus.

2. Patient Response Button: This is a wireless button which the patient uses to record the response to the presented stimuli. For a given stimulus to be recorded as seen, the patient is expected to press the button within one second of the presentation of the stimulus. The patient response button is connected to the headset and communicates the time-stamp at which the button is clicked. This information is recorded and used by the computing subsystem to determine subsequent stimuli and finally, the test results.

3. Test Controller Device: The test controller is used by the administrator to set up the test and to examine the results obtained. It is connected to the mobile perimeter’s cloud server through the internet. To set up a test, the administrator enters the test type and patient details, as described above.

4. Refraction Correction: The device has provisions for varying the object distances. This is done by changing the distance between the screen and the lenses, and with the use of additional corrective lenses.
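To make the stimulus presentation and response handling above concrete, the sketch below converts a Goldmann stimulus size to on-screen pixels and applies the one-second response window. The display resolution and pixels-per-degree figure are assumptions chosen only for illustration; the Goldmann angular diameters are approximate standard nominal values.

```python
# Hedged sketch: map a Goldmann stimulus size to pixels and check the 1 s
# response window. Display resolution and per-eye field of view are assumptions.
GOLDMANN_DIAMETER_DEG = {"II": 0.22, "III": 0.43, "IV": 0.86, "V": 1.72}

PIXELS_PER_DEGREE = 1200 / 95.0   # assumed: 1200 px across a ~95 degree per-eye field


def stimulus_diameter_px(size: str) -> int:
    """Approximate on-screen diameter, in pixels, of a Goldmann stimulus."""
    return max(1, round(GOLDMANN_DIAMETER_DEG[size] * PIXELS_PER_DEGREE))


def response_seen(stimulus_time_ms: int, button_time_ms: int,
                  window_ms: int = 1000) -> bool:
    """A stimulus counts as seen only if the button is pressed within 1 s."""
    return 0 <= button_time_ms - stimulus_time_ms <= window_ms


print({s: stimulus_diameter_px(s) for s in GOLDMANN_DIAMETER_DEG})
print(response_seen(stimulus_time_ms=5000, button_time_ms=5600))   # True
print(response_seen(stimulus_time_ms=5000, button_time_ms=6400))   # False
```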

KEY DIFFERENCES WHEN COMPARED WITH PRIOR ART:

1. Optic structure is different: the layout of the lenses and screens is different.
2. Display system: the invention uses a single screen and varies the size of the stimulus along with its intensity to achieve the complete dynamic range.
3. Mechanical layout is different: the device is built to allow tests to be performed by patients, and this is different from any existing prior art.
4. Test algorithm: the specific test algorithm used is different, and the specific parameters used are different.

In one aspect the invention relates to a system for measuring and characterizing the visual functions of human eyes. The inventive portion lies in ensuring that there is one portion which collects, compiles and computes the information, and a second portion wherein the information can be made accessible to qualified persons for taking decisions on the further course of action for the user whose eyes have been tested or measured. The system has essentially been divided into two portions, referred to herein as a head portion and a remote portion. The head portion is typically mounted on the user's head, partially or fully. The remote portion is typically remote from the user.
The referred head portion has a head mounted display unit means. It comprises a pair of display means, including lens means, one adapted for presenting an image to one of the user's two eyes and the other adapted to present an image to the user's other eye. There is an attachment means to attach the lenses to the display means, which may be a magnetic attachment means to attach user-specific refractive power correction lenses to the display means for each of the user's eyes. The system also includes an optic eye tracking means with a pair of full-field cameras including lens means, one adapted for capturing a user's one eye and the other adapted for capturing the user's other eye. Since there is a need to adjust the lenses, there is a mechanical linkage means to adjust the lens means of the optic eye tracking means and the lens means of the display means to align with a user eye's optic axis. Since the lenses are adjustable, there is a requirement of a lock means, typically a lock means to mechanically lock, through the mechanical linkage means, the optic eye tracking means and display means of each eye when the optic axes of the corresponding lens means are aligned. Having collected the information regarding the user's eye, there is a need for a computing means to compute and adjust the display means so as to align the center of the display means with the user eye's optic axis, such that the lens means of the display means, the lens means of the optic tracking means and the center of the display means align along a common axis with the user eye's optic axis. The system also includes a response means adapted for capturing a user's response to the presented stimuli images from the display means. The computing controller means is adapted to process captured images from the camera of the optic eye tracking means into first set-controlled messages and also to process captured responses from the response means into second set-controlled messages. The test control means is operationally adapted to determine the order, timing, intensity, size and position of stimuli to be presented through the displaying means to the eyes, based on the signals received from the computing controller means, and the transmission means connects operationally with the test control means and also with the remote portion of the system. The system's second portion is a remote portion having at least an external display means operationally associated with transmission means for receiving data from the test control means of the head portion.
In another aspect the transmission means includes a secure wireless connection means to transfer live data transmission between the said external display means and the head portion.
In another aspect the computing controller means is adapted to transmit the controlled messages through the transmission means over a network to a remote destination for storage.
In another aspect the system further includes a test control means operationally adapted for switching on or off operation of the head portion and the remote portion and for monitoring the operation of the head portion and the remote portion.
In another aspect the test control means is remotely operationally coupled with the head portion of the head mounted system.
In another aspect the test control means is proximately operating coupled with the head portion of the head mounted system.
In another aspect the display means of the head portion is configured such that the distance between the display and lens in the display means is so calculated that a magnified virtual image is created at a distance which is greater than 10 times the distance between the display and lens.
In another aspect the display means of the head portion is configured such that the virtual image magnification and the image distance is so calculated that the resulting image covers a field of 50 degrees when the eye is focusing at the center of said image.
In another aspect the display means of the head portion includes lens means with magnetic lens holder means for holding a single lens and a magnetic lens sleeve means for holding plurality of lens, the said holder means and said sleeve means both removably attachable to the display means.
In another aspect the eye tracking means of the head portion includes an eye illumination means with an array of IR LEDS coupled with polymer diffuser caps.
In another aspect the eye tracking means of the head portion includes an IR light re-directing means using a set of hot mirrors.
In another aspect the eye tracking means of the head portion includes an eye
In another aspect the eye tracking means of the head portion includes a pair of infrared CMOS cameras, one for each eye, which is adapted to capture images at 30 frames per second rate.
In another aspect the response means and the computing controller means of the head portion are connected through Bluetooth or through a wired connection.
All variations and modifications obvious to the skilled persons are within the scope of the invention.

The complete specification also relies upon the provisional specification and the drawings referred herein.
CLAIMS:

1. A system for measuring and characterizing to determine visual functions of human eyes with a virtual reality headset comprising of two portions, a head portion (A) and a remote portion (B);
the said head portion (A) having at least :-
a. a head mounted display unit means comprising a pair of display means, including lens means one adapted for presenting an image to user’s one eye, and the other adapted to present an image to user’s other eye,
b. a magnetic attachment means to attach user-specific refractive power correction lenses to the display means for each of the user’s eye,
c. an optic eye tracking means with a pair of full-field cameras including lens means one adapted for capturing a user’s one eye and the other adapted for capturing a user’s other eye,
d. a mechanical linkage means to adjust the lens means of optic eye tracking means and the lens means of display means to align with a user eye’s optic axis,
e. a lock means to mechanically lock through the mechanical linkage means the optic eye tracking means and display means of each eye when the optic axes of corresponding lens means are aligned,
f. a computing means to compute and adjust the display means to align with centre of display means with user eye’s optic axis such that lens means of display means, lens means of optic tracking means, centre of display means align along a common axis with a user eye optic axis.
g. a response means adapted for capturing a user’s response to the presented stimuli images from the display means,
h. a computing controller means adapted to process captured images from camera of optic eye tracking means into first set-controlled messages and also to process captured responses from response means into second set-controlled messages,
i. a test control means operationally adapted to determine the order, timing, intensity, size and position of stimuli to be presented through the displaying means to the eyes, based on the signals received from computing controller means, and
j. a transmission means connecting operationally with the test control means.
the said remote portion (B) having at least an external display means operationally associated with transmission means for receiving data from test control means of the head portion.
2. The system as claimed in claim 1 wherein the transmission means includes a secure wireless connection means to transfer live data transmission between the said external display means and the head portion.
3. The system as claimed in claim 1 wherein the computing controller means is adapted to transmit the controlled messages through the transmission means over a network to a remote destination for storage.
4. The system as claimed in claim 1 wherein the system further includes a test control means operationally adapted for switching on or off operation of the head portion and the remote portion and for monitoring the operation of the head portion and the remote portion.
5. The system as claimed in claim 3, wherein the test control means is remotely operationally coupled with the head portion of the head mounted system.
6. The system as claimed in claim 3, wherein the test control means is proximately operationally coupled with the head portion of the head mounted system.
7. The system as claimed in claim 1, wherein the display means of the head portion is configured such that the distance between the display and lens in the display means is so calculated that a magnified virtual image is created at a distance which is greater than 10 times the distance between the display and lens.
8. The system as claimed in claim 1, wherein the display means of the head portion is configured such that the virtual image magnification and the image distance is so calculated that the resulting image covers a field of at least 50 degrees when the eye is focusing at the center of said image.
9. The system as claimed in claim 1, wherein the display means of the head portion includes lens means with magnetic lens holder means for holding a single lens and a magnetic lens sleeve means for holding plurality of lens, the said holder means and said sleeve means both removably attachable to the display means.
10. The system as claimed in claim 1, wherein the eye tracking means of the head portion includes an eye illumination means with an array of IR LEDS coupled with polymer diffuser caps.
11. The system as claimed in claim 1, wherein the eye tracking means of the head portion includes an IR light re-directing means using a set of Hot Mirrors.
12. The system as claimed in claim 1, wherein the eye tracking means of the head portion includes a pair of infrared CMOS cameras, one for each eye, which is adapted to capture images at 30 frames per second rate.
13. The system as claimed in claim 1, wherein the response means and the computing controller means of the head portion are connected through Bluetooth or through a wired connection.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 PROOF OF RIGHT [02-05-2017(online)].pdf 2017-05-02
2 Power of Attorney [02-05-2017(online)].pdf 2017-05-02
3 Form 1 [02-05-2017(online)].pdf 2017-05-02
4 Drawing [02-05-2017(online)].pdf 2017-05-02
5 Description(Provisional) [02-05-2017(online)].pdf 2017-05-02
6 Correspondence by Agent_Form1.PA_08-05-2017.pdf 2017-05-08
7 201741015479-FORM 3 [26-04-2018(online)].pdf 2018-04-26
8 201741015479-FORM 3 [26-04-2018(online)]-1.pdf 2018-04-26
9 201741015479-FORM 3 [26-04-2018(online)]-2.pdf 2018-04-26
10 201741015479-FORM 3 [26-04-2018(online)]-3.pdf 2018-04-26
11 201741015479-ENDORSEMENT BY INVENTORS [26-04-2018(online)].pdf 2018-04-26
12 201741015479-DRAWING [26-04-2018(online)].pdf 2018-04-26
13 201741015479-DRAWING [26-04-2018(online)]-1.pdf 2018-04-26
14 201741015479-COMPLETE SPECIFICATION [26-04-2018(online)].pdf 2018-04-26
15 201741015479-COMPLETE SPECIFICATION [26-04-2018(online)]-1.pdf 2018-04-26
16 201741015479-CORRESPONDENCE-OTHERS [26-04-2018(online)].pdf 2018-04-26
17 201741015479-CORRESPONDENCE-OTHERS [26-04-2018(online)]-1.pdf 2018-04-26
18 Correspondence by Agent_Form 1,Form 3,Form 5_27-04-2018.pdf 2018-04-27
19 201741015479-FORM 18 [01-04-2021(online)].pdf 2021-04-01
20 201741015479-FER.pdf 2022-03-08
21 201741015479-Response to office action [11-03-2022(online)].pdf 2022-03-11
22 201741015479-FER_SER_REPLY [18-08-2022(online)].pdf 2022-08-18
23 201741015479-US(14)-HearingNotice-(HearingDate-02-07-2024).pdf 2024-06-19
24 201741015479-Correspondence to notify the Controller [20-06-2024(online)].pdf 2024-06-20
25 201741015479-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [25-06-2024(online)].pdf 2024-06-25
26 201741015479-US(14)-ExtendedHearingNotice-(HearingDate-29-07-2024).pdf 2024-06-27
27 201741015479-Correspondence to notify the Controller [27-06-2024(online)].pdf 2024-06-27
28 201741015479-Written submissions and relevant documents [01-08-2024(online)].pdf 2024-08-01
29 201741015479-IntimationOfGrant19-12-2024.pdf 2024-12-19
30 201741015479-PatentCertificate19-12-2024.pdf 2024-12-19

Search Strategy

1 SS_201741015479E_04-03-2022.pdf

ERegister / Renewals

3rd: 07 Mar 2025 (From 02/05/2019 to 02/05/2020)
4th: 07 Mar 2025 (From 02/05/2020 to 02/05/2021)
5th: 07 Mar 2025 (From 02/05/2021 to 02/05/2022)
6th: 07 Mar 2025 (From 02/05/2022 to 02/05/2023)
7th: 07 Mar 2025 (From 02/05/2023 to 02/05/2024)
8th: 07 Mar 2025 (From 02/05/2024 to 02/05/2025)
9th: 07 Mar 2025 (From 02/05/2025 to 02/05/2026)