Abstract: The invention relates to providing a feedback for modifying user behaviour in real world based on control of at least one parameter of at least one object in a virtual world. In one embodiment, a plurality of image frames of a user indicating a first action in real world is received. A second action corresponding to the first action is displayed in the virtual world; the second action resulting in manipulation of said at least one object in the virtual world. Based on the control of said at least one parameter of said at least one object in the virtual world, the feedback for modifying user behaviour in real world is provided.
DESCRIPTION
TECHNICAL FIELD OF INVENTION
The invention generally relates to modifying user behaviour. More particularly, the invention relates to providing a feedback for modifying user behaviour in real world based on control of a parameter of at least one object in a virtual world.
BACKGROUND OF INVENTION
In the past, many computing applications, such as computer games, multimedia applications, and therapeutic interactive applications, have used sensors, controllers, keyboards, remotes, computer mice, or the like to allow users to manipulate virtual objects. More recently, some of these applications have begun employing cameras and motion recognition for the same purpose. However, virtual reality based therapy or training is one application where this concept has not been used to its potential. Virtual reality based therapy or training utilizes specially programmed computers, visual immersion devices, and artificially created environments to give users a simulated experience that can be used to diagnose and treat psychological conditions that cause any physical difficulty.
One exemplary area where virtual reality based therapy and training has been utilized to some extent is neck pain, which is a common musculoskeletal complaint in the modern world with significant ramifications for the injured individuals and for society at large. Neck pain constitutes a major cause of disabilities, such as range of motion (ROM) limitation, repositioning disability, and reduced isometric strength and endurance of the cervical muscles. The assessment and rehabilitation of neck pain typically requires multiple visits to clinics or hospitals. With their busy schedules, people have less time on their hands to spend at clinics or hospitals.
The assessment of cervical range of motion (CROM) is frequently used to quantify the level of impairment associated with neck pain, and to assess the effectiveness of therapeutic interventions. A variety of assessment devices and methodologies thereof for assessment of CROM are known in the art. Some of these assessment devices involve simple visual ROM estimation, inclinometers, and potentiometers assessing static ROM, while more sophisticated assessment devices involve optic, ultrasonic, and electromagnetic 3-dimensional dynamic tracking systems.
One particular solution for assessment of CROM is sensor based kinematic measurement of various movements of the neck, assisted with a virtual or interactive guide to be followed during the assessment. This solution tries to stimulate users to increase or decrease their range of motion. Instead of the virtual or interactive guide, a video game is used in some cases. Such solutions emphasize tracking the 3D motion of a user using a sensor based tracker located on the user's body.
Other known solutions include the use of a motion tracker and simulating apparatus for kinematic assessment of CROM, markers placed on the shoulders of a user to perform a relative shoulder and head movement analysis for neck movement measurement, a virtual trainer for relative mark up of an exercise pattern, or a combination of a haptic device and a computer-assisted medical system for interactive haptic positioning of a medical device coupled to the haptic device.
All these known solutions provide a system which uses a virtual scenario to help users in training or doctors in operating. Accordingly, these known solutions are helpful in the assessment and rehabilitation of musculoskeletal complaints. However, all such known solutions are costly and generally not meant for use at home. Furthermore, these known solutions are insufficient in terms of user engagement and usually require a user to wear some sensor based devices.
SUMMARY OF INVENTION
In accordance with the purposes of the invention, as embodied and broadly described herein, the invention provides a solution for proportional modification of user behaviour in real world based on feedback from virtual world. The solution includes providing a feedback for modifying user behaviour in real world based on control of a parameter of at least one object in a virtual world. For this purpose, a plurality of image frames of a user indicating a first action in real world is received. A second action corresponding to the first action is displayed in the virtual world, which results in manipulation of said at least one object in the virtual world. Based on the control of the parameter of said at least one object in the virtual world, the feedback for modifying user behaviour in real world is provided.
In one example, the invention could be used for assessment and rehabilitation of musculoskeletal complaints, such as neck pain. There is a growing use of smart devices, such as smartphones and other hand-held devices, in everyday life. The invention can be integrated with such smart devices to form a virtual reality based intelligent system for assessment and rehabilitation for people suffering from musculoskeletal complaints, which can fit into people's busy schedules. This virtual reality intelligent system infuses learning, which can ultimately be translated into the real world. To this end, the invention includes varying a parameter, such as a spring constant (K) of a virtual object, such as a spring object, to affect user behaviour in the real world. It will be appreciated by those skilled in the art that any virtual object whose one or more parameters can be controlled could be used as the virtual object within the spirit of the invention. In this virtual reality based intelligent system, a user or a doctor can choose an exercise profile by dynamically shifting the spring constant of the spring object down/up. Further, an optimum spring constant can be assessed for the user, thereby determining the severity of the musculoskeletal complaints. Further, weight objects are added to the spring objects in the virtual world for an imposed adaptation of the real world to impart training for muscle strength. Adding weights (W) or increasing the value of the spring constant enables strength training for individuals and would relieve pain in some cases. The reason is that when one lifts a weight and tries to hold it in that position for some time, the muscles get used to that weight, hence increasing the strength of the muscles.
To this end, the invention enables users to move their head against a virtual spring weight system. The virtual spring weight system has a threshold set for exercise, requiring the users to hold their head at a current position for some time. Holding the head at the current position compresses the spring in the virtual spring weight system, i.e., the longer the users hold their head at the current position, the shorter the length of the spring becomes, allowing the system to be moved further.
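By way of illustration only, the spring behaviour described above may be sketched along the lines of Hooke's law: a longer hold yields a shorter spring, and a larger spring constant K demands a longer hold for the same shortening. The function name and the compression rate below are hypothetical and form no part of any claim:

```python
def compressed_length(rest_length, k, hold_time, rate=0.5):
    """Length of the virtual spring after the user holds the head
    position for hold_time seconds. Compression grows with hold time
    and shrinks with a larger spring constant K (a stiffer spring
    demands a longer hold for the same shortening)."""
    compression = rate * hold_time / k
    return max(rest_length - compression, 0.0)
```

For example, under this sketch, with a rest length of 10 units and K = 1, a four-second hold would shorten the spring to 8 units, whereas with K = 2 the same hold would only shorten it to 9 units.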
This invention could be used for objective evaluation of impairments and disabilities associated with musculoskeletal complaints, such as neck pain, for both diagnostic as well as prognostic purposes. The invention could also be used as an interactive, immersive, engaging, feedback oriented virtual reality technique for improved assessment and better rehabilitation of musculoskeletal complaints.
The advantages of the invention include, but are not limited to, the following: the system can be implemented in a smartphone or any other hand-held device using its camera for the assessment and rehabilitation of musculoskeletal complaints. Therefore, it is a user friendly, easy to use, low cost, home based system for which the user does not need to visit hospitals for rehabilitation. Further, it is more intuitive, interactive, engaging, and rewarding in terms of the recovery process. Further, it enables an interactive comparative analysis of pre/post assessment.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
To further clarify advantages and features of the invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings, in which:
Figure 1 illustrates an exemplary welcome screen of the virtual reality based musculoskeletal complaint assessment and rehabilitation system, in accordance with an embodiment of the invention.
Figure 2 illustrates an exemplary settings screen for the assessment and rehabilitation system, in accordance with an embodiment of the invention.
Figure 3 illustrates an exemplary screen for procedures at the start of the application, in accordance with an embodiment of the invention.
Figure 4 illustrates an exemplary screen for hinge location criterion, in accordance with an embodiment of the invention.
Figure 5 illustrates an exemplary screen where patients can select the hinge by themselves, in accordance with an embodiment of the invention.
Figure 6 illustrates an exemplary hinge selection for the down/up motion, in accordance with an embodiment of the invention.
Figure 7 illustrates an exemplary screen for stick figure representation, in accordance with an embodiment of the invention.
Figure 8 illustrates an exemplary screen for selecting a type of head movement exercise, in accordance with an embodiment of the invention.
Figure 9 illustrates an exemplary overview of the virtual reality based musculoskeletal complaint assessment and rehabilitation system, in accordance with an embodiment of the invention.
Figure 10 illustrates an exemplary patient perspective of the virtual world as seen in the real world, in accordance with an embodiment of the invention.
Figure 11 illustrates a situation with negligible K values and increased head motion, in accordance with an embodiment of the invention.
Figure 12 illustrates a situation with increased K value and effect in the real world, in accordance with an embodiment of the invention.
Figure 13 illustrates an exemplary screen for analyzing patient’s rehabilitation data by the doctor against K, in accordance with an embodiment of the invention.
Figure 14 illustrates an exemplary screen for analyzing patient’s rehabilitation data by the doctor against W, in accordance with an embodiment of the invention.
Figure 15 illustrates an exemplary screen for manually setting up time or changing other assessment/rehabilitation parameters, in accordance with an embodiment of the invention.
Figure 16 illustrates a situation with increasing/decreasing W values, in accordance with an embodiment of the invention.
Figures 17(a) and (b) illustrate exemplary flow charts for dynamic assessment using the virtual spring and weight respectively, in accordance with an embodiment of the invention.
Figure 18 illustrates an exemplary system for spring constant/weight based dynamic assessment and rehabilitation, in accordance with an embodiment of the invention.
Figure 19 illustrates an exemplary method for providing a feedback for modifying user behaviour in real world based on control of a parameter of at least one object in a virtual world, in accordance with an embodiment of the invention.
Figure 20 illustrates an exemplary interface for choosing the face size/shape, in accordance with an embodiment of the invention.
Figure 21 illustrates an exemplary interface for user’s range of motion (ROM), in accordance with an embodiment of the invention.
Figure 22 illustrates user’s range of motion as a function of time and length contraction of the spring, in accordance with an embodiment of the invention.
Figure 23 illustrates length compression during user’s task of moving the block attached to the spring, in accordance with an embodiment of the invention.
Figure 24 illustrates dynamic change in spring constant and related compression for strength training, in accordance with an embodiment of the invention.
Figure 25 illustrates an exemplary virtual spring weight system for strength training, in accordance with an embodiment of the invention.
Figure 26 illustrates an exemplary virtual spring weight system for assessment of range of motion for an injured hand of a user on a laptop having internal camera, in accordance with an embodiment of the invention.
Figure 27 illustrates an exemplary virtual spring weight system for assessment of range of motion for an injured leg of a user on a computing system and an external camera, in accordance with an embodiment of the invention.
Figure 28 illustrates an exemplary virtual spring weight system for exercise of an eye ball of a subject on a touchscreen phone having a front camera, in accordance with an embodiment of the invention.
Figure 29 illustrates an exemplary virtual aerobic ball system for exercise of a cervical neck of a user on a computing device having touch screen surface, in accordance with an embodiment of the invention.
Figure 30 illustrates an exemplary balloon buster system for exercise of a hand of a user on a computer tablet having touch screen surface and a front camera, in accordance with an embodiment of the invention.
It may be noted that to the extent possible, like reference numerals have been used to represent like elements in the drawings. Further, those of ordinary skill in the art will appreciate that elements in the drawings are illustrated for simplicity and may not have been necessarily drawn to scale. For example, the dimensions of some of the elements in the drawings may be exaggerated relative to other elements to help to improve understanding of aspects of the invention. Furthermore, the one or more elements may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.
DETAILED DESCRIPTION
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof. Throughout the patent specification, a convention employed is that in the appended drawings, like numerals denote like components.
Reference throughout this specification to “an embodiment”, “another embodiment” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices or other sub-systems.
Some key embodiments of the invention are listed below:
In one embodiment, the invention includes a method for providing a feedback for modifying user behaviour in real world based on control of at least one parameter of at least one object in a virtual world, said method comprising the steps of: receiving a plurality of image frames of a user indicating a first action in real world; displaying, in the virtual world, a second action corresponding to the first action, the second action resulting in manipulation of said at least one object in the virtual world; allowing for the control of said at least one parameter of said at least one object in the virtual world; and based on the control of said at least one parameter of said at least one object in the virtual world, providing the feedback for modifying user behaviour in real world.
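By way of illustration only, the claimed steps may be sketched as follows, with the real-world first action reduced to a scalar "push" derived from the image frames and the virtual object modelled as a spring. All names and the displacement rule are hypothetical and form no part of any claim:

```python
from dataclasses import dataclass

@dataclass
class VirtualSpring:
    k: float               # the controllable parameter (spring constant)
    displacement: float = 0.0

    def apply(self, push: float) -> None:
        # A larger K yields less displacement for the same push, so the
        # user must exert more effort in the real world.
        self.displacement += push / self.k

def feedback_loop(pushes, spring):
    """For each real-world action (push), mirror a second action in the
    virtual world by manipulating the spring, and collect the resulting
    state as the feedback provided to the user."""
    feedback = []
    for push in pushes:
        spring.apply(push)
        feedback.append(spring.displacement)
    return feedback
```

In this sketch, doubling K halves the displacement reported back for each push, mirroring how a stiffer virtual spring modifies the user's behaviour in the real world.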
In one embodiment, the at least one object is a combination of a virtual spring and a virtual weight.
In one embodiment, said at least one parameter is a physical property associated with said at least one object or the virtual world.
In one embodiment, the physical property is spring constant (K) of a virtual spring.
In one embodiment, the physical property is a weight (W) of a virtual weight.
In one embodiment, the physical property is the elasticity of said at least one object.
In one embodiment, the physical property is density of medium in the virtual world.
In one embodiment, the first action is Extension (E), Flexion (F), Right rotation (RR), Left Rotation (LR), Left lateral flexion (LLF), or Right lateral flexion (RLF) of the cervical neck.
In one embodiment, the second action corresponding to the first action is generated on a real time basis.
In one embodiment, the method further includes receiving a value of said at least one parameter from a user interface.
In one embodiment, the method further includes recording data corresponding to a user using a unique user-id.
In one embodiment, the invention includes a device for providing a feedback for modifying user behaviour in real world based on control of at least one parameter of at least one object in a virtual world, said device comprising: an input unit for receiving a plurality of image frames of a user indicating a first action in real world; a processing unit coupled with a display unit for displaying, in the virtual world, a second action corresponding to the first action, the second action resulting in manipulation of said at least one object in the virtual world; the processing unit being adapted to allow for the control of said at least one parameter of said at least one object in the virtual world; and the processing unit and the display unit being further adapted to, based on the control of said at least one parameter of said at least one object in the virtual world, provide the feedback for modifying user behaviour in real world.
In one embodiment, the at least one object is a combination of a virtual spring and a virtual weight.
In one embodiment, said at least one parameter is a physical property associated with said at least one object or the virtual world.
In one embodiment, the physical property is spring constant (K) of a virtual spring.
In one embodiment, the physical property is a weight (W) of a virtual weight.
In one embodiment, the physical property is the elasticity of said at least one object.
In one embodiment, the physical property is density of medium in the virtual world.
In one embodiment, the device further includes a face detector coupled with the input unit for detecting the face of a user based on the received plurality of image frames.
In one embodiment, the device further includes a stick figure model for generating the second action corresponding to the first action on a real time basis.
In one embodiment, the invention includes a computer readable data storage medium storing a computer program for providing a feedback for modifying user behaviour in real world based on control of at least one parameter of at least one object in a virtual world, the computer program, when executed on a computer, executing the steps of: receiving a plurality of image frames of a user indicating a first action in real world; displaying, in the virtual world, a second action corresponding to the first action, the second action resulting in manipulation of said at least one object in the virtual world; allowing for the control of said at least one parameter of said at least one object in the virtual world; and based on the control of said at least one parameter of said at least one object in the virtual world, providing the feedback for modifying user behaviour in real world.
These and other embodiments of the invention will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates an exemplary welcome screen 100 of the virtual reality based musculoskeletal complaint, such as cervical/neck pain, assessment and rehabilitation system, in accordance with an embodiment of the invention. When a doctor/patient starts the cervical/neck pain assessment and rehabilitation system, he/she is greeted with this welcome screen 100. The doctor/patient is asked to press a button 101 on the welcome screen 100 to proceed further. Additionally, the doctor/patient is also asked to fill in personal details, such as age, sex, etc., in a user interface 102 for records before he/she goes on to a next screen.
Figure 2 illustrates an exemplary settings screen 200 for the assessment and rehabilitation system, in accordance with an embodiment of the invention. The settings screen 200 includes a user interface 201 that displays a few settings that the doctor/patient needs to set. Examples of the settings include, but are not limited to: a unique user id to uniquely identify a user; whether the user is a first time user (Yes/No); what the purpose is – Assessment or Rehabilitation (A/R); selection of total assessment/rehabilitation time through a pop-up window 202; selection of virtual reality exercise type and its difficulty level through another pop-up window 203; and whether to record an exercise video (Yes/No). Throughout this detailed description section, Yes/No indicates that the user has to choose between the 'Yes' option and the 'No' option. All these settings are stored in a database (not shown) of the virtual reality based cervical/neck pain assessment and rehabilitation system. If needed, a patient's data can be backtracked from the database at a later stage using their unique user-id, say, for the purpose of examination or advice. In one implementation, the unique user ID is automatically assigned to the patient once the doctor/patient is done with entering the other settings. In the end, a user guide 204 suggests swiping across a user interface to go to a next screen. Once the settings screen 200 disappears, the doctor/patient can launch the main application to begin with the assessment/rehabilitation.
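By way of illustration only, the settings captured on this screen may be represented as a record suitable for storage in the database, with the unique user-id assigned automatically. The field names and the id scheme below are hypothetical and form no part of any claim:

```python
from dataclasses import dataclass, field
import uuid

@dataclass
class ExerciseSettings:
    first_time_user: bool     # Yes/No
    purpose: str              # 'A' (assessment) or 'R' (rehabilitation)
    total_time_minutes: int   # selected via pop-up window 202
    exercise_type: str        # selected via pop-up window 203
    difficulty: str
    record_video: bool        # Yes/No
    # Unique user-id assigned automatically once other settings are entered.
    user_id: str = field(default_factory=lambda: uuid.uuid4().hex)

    def __post_init__(self):
        if self.purpose not in ('A', 'R'):
            raise ValueError("purpose must be 'A' or 'R'")
```

Such a record can later be retrieved by its user_id to backtrack the patient's data for examination or advice.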
Figure 3 illustrates an exemplary screen 300 for procedures at the start of the application, in accordance with an embodiment of the invention. If the patient is inside the clinic, the doctor may ask the patient to position his/her head within the dotted circle 301 displayed in the screen 300. Once the patient positions his/her head inside the dotted circle 301, the doctor makes a selection 302 of the circumference of the patient's head, for example through a touchscreen. The circumference could be one of the differentiating factors and one of the variables while doing the post-processing of the patient's assessment/rehabilitation data. Here, the system also provides a method as well as the user interface 303 to fix the face once it is properly positioned inside the dotted circle 301. If the doctor sees any discrepancy or inaccuracy in the process, he can reset the whole head positioning and circumference selection process through a resetting user interface 304 and start over again. Next comes the hinge 306, which is displayed in the small square below the chin of the patient in the neck region, and a user interface 305 for fixing the hinge 306. An on-screen voice or text based guide is made available to a user through a user interface 307.
In one implementation, the face detection is performed using a Haar Cascade Classifier. In operation, the processing is done over each frame. For each frame, the position of the head is calculated with reference to the previous frame, which gives the instantaneous speed. The distance travelled is also summed over the frames to calculate the average speed. The rotation angles are measured with reference to the hinge as marked by the user himself/herself. The angle is calculated between the vertical neck line and the extremities of the face, i.e., the line joining the hinge and the midpoint of the left/right edge of the face rectangle.
Figure 4 illustrates an exemplary screen for the hinge location criterion, in accordance with an embodiment of the invention. The screen 400 elaborates on the hinge selection procedure. The hinge is one of the important aspects, as it is the reference with respect to which the angles are measured. The location of the hinge is pivotal for the rest of the assessment/rehabilitation steps. As shown in Figure 4, the doctor has designated a virtual location of the hinge once the patient's head location and circumference selection is over. This becomes a reference area for hinge selection by the patient when he/she is using the system on their own after the preliminary assessment. A particular hinge location is selected in order to measure the left/right rotation angles of the head during the head motion. Once the hinge location is fixed by the doctor, the patient can also similarly select the hinge for their self-assessment/rehabilitation afterwards. Figure 5 illustrates an exemplary screen 500 for hinge selection by the patient himself/herself for left/right rotation, while Figure 6 illustrates an exemplary screen 600 for hinge selection for the down/up motion. For this purpose, the patients move their head up and down, and the angular measurement is done on that basis. As shown in Figure 6, it can be observed that the hinge is located at a position which is at the level of the ear of the patient, thus measuring the angles accordingly.
Figure 7 illustrates an exemplary screen 700 for stick figure representation, in accordance with an embodiment of the invention, where the doctor/patient can see a display of the stick figure model of the patient's head movements, whether left/right or down/up, on a real time basis. The screen 700 includes a user interface 701 for selecting the stick figure model (Yes/No). If the user chooses the 'Yes' option, then the stick figure model may be shown through a pop-up window 702 as shown in Figure 7. The screen 700 also includes another user interface 703 for displaying on-screen graphs on a real time basis (Yes/No). If the user chooses the 'Yes' option, then the graphs may be shown through another pop-up window 704 as shown in Figure 7. In one implementation, there could also be a voice/text based interactive guide 705 (Yes/No) which can ask patients to move their head in the correct direction while doing the assessment/rehabilitation for accurate measurement and improved performance.
Figure 8 illustrates an exemplary screen 800 for selecting a type of head movement exercise, in accordance with an embodiment of the invention. The screen 800 includes a sliding user interface 801 for selecting the number of days, while the types of movements which can be performed by the patient can be selected using another sliding user interface 802 by the doctor/patient. There are two types of movements which can predominantly be performed - the left/right rotation of the head and the down/up rotation. These two movements are depicted through a user interface 803. Further, these two movements can be mapped onto different types of interactive virtual reality objects, preferably the spring and weight virtual objects, as shown in some of the subsequent figures.
In operation, the system measures two types of angles (in degrees) from the same reference point, i.e., from the 'Hinge': (1) the angle made by the right extremity of the face, i.e., the angle between the line connecting the midpoint of the far right edge of the face and the hinge, and the neck line, which is the line connecting the midpoint of the bottom edge of the face and the hinge; and (2) the angle made by the left extremity of the face, i.e., the angle between the line connecting the midpoint of the far left edge of the face and the hinge, and the neck line.
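By way of illustration only, the angle measurement described above may be sketched as the angle at the hinge between the neck line and the line to a face-edge midpoint. The helper name below is hypothetical and forms no part of any claim:

```python
import math

def hinge_angle(hinge, bottom_mid, edge_mid):
    """Angle (degrees) at the hinge between the neck line (hinge to the
    midpoint of the bottom edge of the face rectangle) and the line from
    the hinge to the midpoint of the left/right edge of the face.
    Points are (x, y) tuples in image coordinates."""
    ax, ay = bottom_mid[0] - hinge[0], bottom_mid[1] - hinge[1]
    bx, by = edge_mid[0] - hinge[0], edge_mid[1] - hinge[1]
    dot = ax * bx + ay * by
    na = math.hypot(ax, ay)
    nb = math.hypot(bx, by)
    return math.degrees(math.acos(dot / (na * nb)))
```

The same helper serves for both the left and the right extremity, called once with each edge midpoint.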
The system also measures the average and instantaneous speed of the movement of the face in pixels/sec: the instantaneous speed of the head movement is computed using the displacement measured between two consecutive frames during the exercise procedure, where the fps (frames per second) is, say, 24; and the average speed of the overall head motion during the exercise is computed using the total distance travelled over the whole exercise procedure.
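By way of illustration only, the two speed measures described above may be sketched as follows from a sequence of per-frame face-centre positions; the helper names are hypothetical and form no part of any claim:

```python
import math

def instantaneous_speed(prev_center, curr_center, fps=24):
    """Pixels/sec from the displacement between two consecutive frames."""
    dist = math.hypot(curr_center[0] - prev_center[0],
                      curr_center[1] - prev_center[1])
    return dist * fps

def average_speed(centers, fps=24):
    """Pixels/sec over the whole exercise: total distance travelled
    divided by the total elapsed time."""
    total = sum(math.hypot(b[0] - a[0], b[1] - a[1])
                for a, b in zip(centers, centers[1:]))
    duration = (len(centers) - 1) / fps
    return total / duration
```

At a steady motion the two measures coincide; they diverge when the head speeds up or slows down within the exercise.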
Figure 9 illustrates an exemplary screen 900 of the virtual reality based musculoskeletal complaint, such as cervical/neck pain, assessment and rehabilitation system, in accordance with an embodiment of the invention. Here, each module of the system is shown as it looks when the patient is presented with this system for their neck pain assessment/rehabilitation and other related ailments of their cervical neck. As shown, the screen 900 includes a stick figure simulation 901; an image/video of the user 902; various virtual objects and markers 903; user interfaces 904 for fixing the face, resetting, and fixing the hinge; and various graphs 905 depicting real-time angular movements and instantaneous speed.
Figures 10 and 17 illustrate the relation between the virtual world and the real world and how both of them are intertwined in the present system for the transfer of learning and adaptation according to the change of spring constant (K) or weight (W) in the virtual world.
More specifically, Figure 10 illustrates an exemplary patient perspective of the virtual world as seen in the real world, in accordance with an embodiment of the invention. Further, this figure demonstrates a comparative scenario 1000 showing how the patient feels when a parameter, such as the spring constant, of a virtual world object, such as the spring, is changed, and their self-adjustments or adaptation accordingly for their neck pain assessment and rehabilitation. It also demonstrates how they see the virtual world and how they adjust their head movements in the real world upon the change of the spring constant in the virtual world.
Focusing on the interaction of the rectangle, which is associated with the face detection of the patient, and the weights, which are attached to the springs at one end, it can be observed from Figure 10 that the K values are negligible and the patient is performing left/right rotation of his head. The rectangle associated with the face is colliding with the weights attached to the springs. Therefore, the weights attached to the springs would be pushed to either side of the face during the head movements. This would give the range of motion of the left/right rotation in terms of degrees of movement.
Figure 11 demonstrates a scenario 1100 where the doctor/patient does not make any changes to the K value; in case the patient moves his head faster, the rectangle detecting the face would pass by the weights attached to the springs undetected (without any collision being detected) and without pushing the weights attached to the springs. A visual alarm would be raised in this case, in the form of a cross 1101 popping in from the corner of the screen, thereby advising the patient to slow down his/her speed.
When the value of K is increased in the virtual world, the patient in the real world has to move his/her head as shown in the scenario 1200 of Figure 12. The patient has to hold each current position (head turned either left or right) for the collision to be detected in the virtual world (movement of the weights attached to the springs). Holding the left/right head rotation imposes a muscular strain on the user's neck in the real world, training muscle strength. It also provides a subjective measure of the patient's posture for that particular interval of the assessment/rehabilitation. In this way, changes in the virtual reality medium affect the patient's range of motion in the real world during the assessment/rehabilitation of cervical neck motions.
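One simple way to model the holding requirement imposed by a larger K is to make the dwell time needed to register a collision grow with K. The following Python sketch uses illustrative constants (`base`, `gain`) that are not specified in the disclosure:

```python
def required_hold_seconds(k: float, base: float = 0.2, gain: float = 0.1) -> float:
    """A stiffer virtual spring (larger K) demands a longer hold before the
    collision registers, straining the neck muscles for longer."""
    return base + gain * k

def collision_registered(hold_time: float, k: float) -> bool:
    """True only if the head position was held long enough for the current K."""
    return hold_time >= required_hold_seconds(k)
```

Under these assumed constants, raising K from 2 to 5 raises the required hold from 0.4 s to 0.7 s, which is the mechanism by which a virtual-world parameter change produces a real-world muscular effect.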
Once the doctor has performed the assessment of the patient, he may advise a rehabilitation program, for example, a one-week program. The doctor advises on the type of exercise to do, the type of interactive virtual systems to choose from, and the duration and target range of motion to be achieved in that one-week program. The doctor also advises on the K value corresponding to the interactive virtual systems for a particular exercise type. Once these data have been logged into the patient's system, the patient can be released. When the patient comes back after the week-long virtual reality assisted self-rehabilitation program, the doctor can check the patient's progress and other parameters through a personalized user interface, which is also interactive in nature.
Figure 13 illustrates such an exemplary screen 1300 for the doctor to analyze the patient's rehabilitation data against K values. The screen 1300 includes a user interface 1301 for selecting a parameter, such as K or W, of the virtual objects, such as the spring and weight. As shown in Figure 13, the doctor has chosen K. The screen 1300 also includes a user interface 1302 for selecting the number of days for backtracking the patient's data, a user interface 1303 for selecting the type of exercise, and a sliding user interface 1304 for selecting K values and identifying the difference between the achieved ROM and the expected ROM. The screen 1300 further includes a button 1305 for backtracking the patient's data once the various user interfaces of the screen 1300 have been set as required.
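The backtracking selected on screen 1300 (parameter, look-back window in days, exercise type) amounts to a filtered query over the logged sessions. A minimal Python sketch over an assumed record layout (the field names are hypothetical, not taken from the disclosure):

```python
def backtrack(records, parameter, days, exercise):
    """Filter logged sessions by parameter ('K' or 'W'), look-back window,
    and exercise type; report (parameter value, achieved ROM - expected ROM)."""
    recent = [r for r in records
              if r["parameter"] == parameter
              and r["days_ago"] <= days
              and r["exercise"] == exercise]
    return [(r["value"], r["achieved_rom"] - r["expected_rom"]) for r in recent]
```

A negative ROM difference in the output indicates the patient fell short of the expected range of motion for that session, which is what the sliding interface 1304 lets the doctor inspect.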
Figure 14 illustrates a similar exemplary screen 1400 for the doctor to analyze the patient's rehabilitation data against W. As shown, here the doctor has chosen the W parameter for analysis.
Figure 15 illustrates an exemplary screen 1500 for manually setting up time or changing other assessment/rehabilitation parameters, in accordance with an embodiment of the invention. As shown, the screen 1500 includes a user interface 1501 for setting time, changing the K value, changing the W value, and opening the data logger. The user interface 1501 may be opened through a button 1502. The screen 1500 presents an overview of the system in which the doctor makes a few selections, comparing the range of motion he advised for a particular exercise type with the range of motion achieved by the patient, on an interactive scale. These ranges of motion are displayed against either K or W values. The screen 1500 presents the range of motion for a particular exercise type on a given day against the K values.
Figure 16 illustrates a scenario 1600 with increasing/decreasing W values. The screen 1600 presents the range of motion for a particular exercise type on a given day against the W values. If the doctor/patient wants to manually select the K/W values at the start of the assessment/rehabilitation, or during the analysis after the post-processing of data, they can do so using the interface provided in Figure 15. Here, there is a circular dot which the doctor/patient touches to bring up a selection interface for selecting the K values, accessing the databases, or setting up the assessment/rehabilitation time. If, as with changing/selecting the K values, the doctor/patient wants to vary the weight for a different virtual experience, Figure 16 depicts an interface for the same.
Figures 17(a) and (b) illustrate exemplary flow charts for dynamic assessment of the user using a virtual spring and a virtual weight respectively, in accordance with an embodiment of the invention. As shown, common reference numerals are indicated for both figures.
Figure 17(a) describes a process for dynamic assessment using the virtual spring object. At step 1701a, an application for virtual reality based cervical/neck pain assessment and rehabilitation is started. At step 1702a, the face of a user is detected. At step 1703a, data recording is enabled. At step 1704a, the circumference of the face is selected through a user interface. At step 1705a, a value for a parameter, such as the spring constant K, of the virtual spring object is fixed. At step 1706a, it is checked whether the spring constant K is equal to a threshold value; if not, the value of the spring constant K is revised through the 'No' route; otherwise, the dynamic assessment is started at step 1707a through the 'Yes' route.
Similarly, Figure 17(b) describes a process for dynamic assessment using the virtual weight object. At step 1701b, an application for virtual reality based cervical/neck pain assessment and rehabilitation is started. At step 1702b, the face of a user is detected. At step 1703b, data recording is enabled. At step 1704b, the circumference of the face is selected through a user interface. At step 1705b, a value for a parameter, such as the weight W, of the virtual weight object is fixed. At step 1706b, it is checked whether the weight W is equal to a threshold value; if not, the value of the weight W is revised through the 'No' route; otherwise, the dynamic assessment is started at step 1707b through the 'Yes' route.
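The flow common to Figures 17(a) and (b) can be sketched as a single routine. This is a minimal Python sketch in which the callback names (`get_parameter`, `detect_face`, and so on) are hypothetical stand-ins for the flow chart steps, not names used in the disclosure:

```python
def run_dynamic_assessment(get_parameter, threshold, detect_face, record, assess):
    """Mirror of the Figure 17 flow: detect the face (1702), enable recording
    (1703), fix the parameter value (1705), revise it until it equals the
    threshold (1706), then start the dynamic assessment (1707)."""
    detect_face()
    record()
    value = get_parameter()
    while value != threshold:   # 'No' route: ask for a revised value
        value = get_parameter()
    assess(value)               # 'Yes' route: start the dynamic assessment
```

In the real system the revision loop would be driven by doctor/patient input through the user interface rather than by a fixed sequence of values.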
Figure 18 illustrates an exemplary system 1800 for spring constant/weight based dynamic assessment and rehabilitation, in accordance with an embodiment of the invention. The various components of the system 1800 and their functionalities are described below. The system includes a face detector 1801 that receives a plurality of image frames, such as a set of images taken at certain intervals of time or a video, as an input through an input unit 1802. The plurality of image frames is provided by a camera 1803, such as the camera of a smart phone or hand-held device. The camera 1803 may be internal or external to the system 1800, and acts as an input device to detect the face of a user using the feature points of the face. The system also includes an interactive physics based assessment and rehabilitation module 1804 that implements the concepts of: using a varying spring constant in the virtual reality medium to affect the user behaviour in the real world; building an exercise profile by dynamically shifting the spring constant K down/up; assessing the optimum K for the patient, thereby determining the severity of the problem; and adding weights to the spring in the virtual world for an imposed adaptation in the physical world, which implies robust training of muscle strength. The system further includes a stick figure model 1805 that represents the real time movement precision during the head movement, following the feature point detection, for the module 1804. The system also includes a graphical user interface (GUI) display unit 1806 that is coupled to the system's database (not shown). The GUI display unit 1806 can be intuitively operated during and after the assessment and rehabilitation, and presents a comparative analysis of the spring length contraction relative to the range of motion, the time taken to achieve the target spring contraction, and so on.
The system also includes one or more processing units 1807 that handle the real-time processing for the face detector 1801, the stick figure model 1805, the GUI display unit 1806, and on-screen visual graphs 1808.
In one embodiment, the system 1800 is used for providing a feedback for modifying user behaviour in the real world based on control of a parameter, such as the spring constant K or the weight W, of at least one object, such as a virtual spring object, a virtual weight object, or a combination thereof, in a virtual world. In one example, the elasticity E of said at least one object or the density D of the medium in the virtual world could be used as the parameter. In fact, any such physical property associated with the at least one object of the virtual world could be used as a parameter. In said embodiment, the input unit 1802 receives a plurality of image frames, such as a set of images taken at certain intervals of time, or a video, of a user indicating a first action in the real world. Examples of the first action include any type of movement or rotation of the neck, say left/right/up/down rotation or movement. In said embodiment, the one or more processing units 1807 coupled with the GUI display unit 1806 display, in the virtual world, a second action corresponding to the first action, i.e., the movement or rotation of the neck is also reflected in the virtual world on a real time basis. Further, this second action in the virtual world results in manipulation of said at least one object in the virtual world, i.e., the movement or rotation of the neck causes the shifting of the virtual weight object and the contraction of the virtual spring object. Here, manipulation means any type of change in the position or shape of said at least one object. Examples of manipulation of said at least one object include, but are not limited to, movements, acceleration, and contraction of said at least one object. In said embodiment, the processing unit (1807) is adapted to allow for the control of at least one parameter of said at least one object in the virtual world. For example, the value of the spring constant K or the weight W can be increased or decreased at any time during operation of the system 1800.
In said embodiment, the processing unit (1807) and the display unit (1806) are further adapted to provide, based on the control of at least one parameter of said at least one object in the virtual world, the feedback for modifying user behaviour in the real world. For example, if the value of the spring constant K or the weight W is increased, the user has to hold his/her neck at a tilted position for a longer time in order to manipulate the virtual spring and virtual weight. In another example, if the value of the spring constant K or the weight W is decreased, the user can manipulate the virtual spring and virtual weight easily or freely. Further, the feedback for modifying user behaviour in the real world is proportional to activities in the virtual world. For example, a feedback is provided to proportionally modify the user's posture in the real world depending upon how the user has performed an exercise in the virtual world.
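The proportionality between virtual-world performance and real-world feedback can be expressed as a simple score. The following Python sketch is illustrative only; the scoring rule and the advice strings are assumptions, not part of the disclosure:

```python
def posture_feedback(achieved_rom: float, target_rom: float):
    """Feedback proportional to virtual-world performance: the fraction of the
    target range of motion actually achieved, capped at 1.0, plus advice."""
    score = min(achieved_rom / target_rom, 1.0)
    advice = "target reached" if score >= 1.0 else "hold longer and push further"
    return score, advice
```

A user who reached half the target ROM would thus receive a score of 0.5, and the feedback presented on the display unit would scale accordingly.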
Figure 19 illustrates an exemplary method 1900 for providing a feedback for modifying user behaviour in real world based on control of a parameter of at least one object in a virtual world, in accordance with one embodiment of the invention. Here, the real world is the state of things as they actually exist, rather than as they may appear or might be imagined. The real world means the actual physical world where the user is physically present, whereas the virtual world means a computer-based simulated environment which is based on augmented reality or virtual reality. In a preferred embodiment, the at least one object is a combination of a virtual spring and a virtual weight, and the parameter is a spring constant (K) of the virtual spring or a weight (W) of the virtual weight. In one example, the elasticity of said at least one object or the density of the medium in the virtual world could be used as the parameter. In fact, any such physical property associated with the at least one object of the virtual world could be used as a parameter.
In said method, a value of the parameter is received from a user interface at step 1901. For example, the value of spring constant (K) of the virtual spring or a weight (W) of the virtual weight or elasticity (E) of the virtual spring or density (D) of medium in the virtual world is received through a user interface from a user or doctor.
At step 1902, a plurality of image frames, such as a set of images taken at certain intervals of time, or a video, of a user indicating a first action in the real world is received through an input unit. Examples of the first action include any type of movement or rotation of the cervical neck of a user. Examples of the movement or rotation include, but are not limited to, Extension (E), Flexion (F), Right rotation (RR), Left Rotation (LR), Left lateral flexion (LLF), or Right lateral flexion (RLF) of the cervical neck.
At step 1903, a second action, which corresponds to the first action, is displayed in the virtual world on a real time basis, i.e., the movement or rotation of the neck is also reflected in the virtual world on a real time basis. The second action results in manipulation of said at least one object in the virtual world, i.e., the movement or rotation of the neck causes the shifting of the virtual weight object and the contraction of the virtual spring object.
At step 1904, the control of the parameter of said at least one object in the virtual world is allowed. For example, the value of the spring constant K, the weight W, the elasticity E, or the density D can be changed.
At step 1905, based on the control of the parameter of said at least one object in the virtual world, the feedback for modifying user behaviour in the real world is provided. For example, if the value of the spring constant K or the weight W is increased, the user has to hold his/her neck at a tilted position for a longer time in order to manipulate the virtual spring and virtual weight. In another example, if the value of the spring constant K or the weight W is decreased, the user can manipulate the virtual spring and virtual weight easily or freely.
At step 1906, data corresponding to a user is recorded using a unique user-id. In one implementation, this data is stored in a database for any future use, such as analysis of data or tracking patient’s history.
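Steps 1901 to 1906 can be strung together as a single session routine. The following Python sketch uses hypothetical callback names and an in-memory dictionary in place of the database; none of these names appear in the disclosure:

```python
def run_session(user_id, param_value, frames, map_action, apply_param, make_feedback, db):
    """Sketch of method 1900: a parameter value is received (1901), image
    frames are received (1902), the real-world action is mirrored in the
    virtual world and manipulates the object (1903-1904), feedback is derived
    (1905), and the session is logged under a unique user-id (1906)."""
    actions = [map_action(f) for f in frames]            # first -> second action
    states = [apply_param(a, param_value) for a in actions]  # object manipulation
    feedback = make_feedback(states)
    db.setdefault(user_id, []).append({"param": param_value, "feedback": feedback})
    return feedback
```

The per-user log keyed by user-id is what later enables the doctor's backtracking and history analysis described for screens 1300 and 1400.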
In one implementation, the method is implemented by the system 1800 or a smart device, such as a smart phone or hand-held device. Once the preliminary set up of the system 1800 is done, at the clinic or at home, the doctor or user locates his/her face on the camera 1803 and performs a gesture to record the circumference of the face for systematic analysis thereafter, as shown in Figure 20. The circumference of the face appears between virtual springs having spring constants K1 and K2, and lengths L1 and L2 respectively. This circumference measurement is translated into the stick figure model 1805 of Figure 18, in order to enable resizing of the stick figure model and the real-time animation of the activity being performed by the user's neck. Upon the sizing of the user's face as shown in Figure 20, the measurements of a current state are taken. For this purpose, a scale is set with the normal movement range of a healthy person as reference. Suppose the normal movement range of a healthy person is between -300 and +300, and in the initial assessment (current state) the user is able to move from -150 to +150, as shown in Figure 21. These can be further drawn on a separate interface as a function of time and length contraction, as shown in Figure 22. Similarly, Figure 23 and Figure 24 represent the length contraction variation with respect to the spring constant as well as with respect to time, for the current state as well as the target state. Here, the current state is the state when the patient/user is asked to perform the assessment at the clinic by a therapist/doctor, or performs the same themselves; the target state is the state which the therapist/doctor sets in terms of the range of motion to achieve over a period of specified days, keeping in mind the dynamic spring constant shifting. More specifically, Figure 23 illustrates length compression from L1 to L11 and from L2 to L21 for each of the virtual springs.
Figure 24 illustrates the dynamic change in spring constant (K'1 and K'2) and the related compression for strength training. The improvement or degradation of the user's range of motion can be analyzed through the data logged in the system and the interactive data display interface, where the doctor can slide through the changes in quick succession.
Figure 25 illustrates an exemplary virtual spring weight system for strength training, in accordance with an embodiment of the invention. This spring weight system includes a left spring object having a spring constant K1 and length L1; a right spring object having a spring constant K2 and length L2; a left weight W1 attached to one end of the left spring; and a right weight W2 attached to one end of the right spring. The other ends of the left and right springs are fixed to a virtual wall.
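The left/right spring-weight pairs of Figure 25 map naturally onto a small data structure. The following Python sketch assumes the virtual springs obey Hooke's law; the disclosure itself does not fix a particular force law, so this is an illustrative choice:

```python
from dataclasses import dataclass

@dataclass
class SpringWeightPair:
    k: float       # spring constant of the spring
    length: float  # rest length of the spring
    weight: float  # weight attached to the spring's free end

    def compression_force(self, dx: float) -> float:
        # Hooke's law (an assumed model): force to compress the spring by dx
        return self.k * dx

# Illustrative instances corresponding to (K1, L1, W1) and (K2, L2, W2)
left = SpringWeightPair(k=2.0, length=1.0, weight=0.5)
right = SpringWeightPair(k=3.0, length=1.0, weight=0.5)
```

Under this model, doubling K doubles the effort the user's head movement must exert to produce the same spring contraction, which is the strength-training effect described above.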
Figure 26 illustrates an exemplary virtual spring weight system for assessment of range of motion for an injured hand 2601 of a user on a laptop 2602 having internal camera 2603 and a display screen 2604. The laptop 2602 receives via its camera 2603 a plurality of image frames, such as a video, of movements of the injured hand 2601 in real world. The display screen 2604 of the laptop 2602 displays the plurality of image frames resulting in manipulation of at least one of the virtual objects, such as virtual spring objects and the virtual weight objects. The laptop 2602 allows control of at least one parameter of at least one of the virtual objects in a virtual world shown on the display screen 2604. For example, the spring constants K1 and K2 of the virtual spring objects could be controlled using a computer mouse. In another example, the weights W1 and W2 of the virtual weight objects could be controlled using the computer mouse. Based on the control of said at least one parameter of virtual objects, the laptop 2602 provides a feedback to the user for modifying user behaviour in real world. In one implementation, the laptop 2602 may be configured to store all these activities in its memory (not shown). In one implementation, the user may anytime, i.e., before/during/after operation, change the value of the at least one parameter of any of the virtual objects.
Figure 27 illustrates an exemplary virtual spring weight system for assessment of range of motion for an injured leg 2701 of a user on a computing system 2702 having an external camera 2703 and a computer monitor 2704. The computing system 2702 receives via the external camera 2703 a plurality of image frames, such as a video, of movements of the injured leg 2701 in the real world. The computer monitor 2704 of the computing system 2702 displays the plurality of image frames, resulting in manipulation of at least one of the virtual objects, such as the virtual spring objects and the virtual weight objects. The computing system 2702 allows control of at least one parameter of at least one of the virtual objects in a virtual world shown on the computer monitor 2704. For example, the spring constants K1 and K2 of the virtual spring objects could be controlled using a computer mouse. In another example, the weights W1 and W2 of the virtual weight objects could be controlled using the computer mouse. Based on the control of said at least one parameter of the virtual objects, the computing system 2702 provides a feedback to the user for modifying user behaviour in the real world. In one implementation, the computing system 2702 may be configured to store all these activities on a server (not shown) or in its internal/external memory (not shown). In one implementation, the user may at any time, i.e., before/during/after operation, change the value of the at least one parameter of any of the virtual objects.
Figure 28 illustrates an exemplary virtual spring weight system for exercise of an eye ball 2801 of a subject on a smart device 2802, such as a smartphone or hand-held device, having a front camera 2803 and a touchscreen display 2804. The smart device 2802 receives via its front camera 2803 a plurality of image frames, such as a video, of movements of the eye ball 2801 in the real world. The touchscreen display 2804 of the smart device 2802 displays the plurality of image frames, resulting in manipulation of at least one of the virtual objects, such as the virtual spring objects and the virtual weight objects. The smart device 2802 allows control of at least one parameter of at least one of the virtual objects in a virtual world shown on the touchscreen display 2804. For example, the spring constants K1 and K2 of the virtual spring objects could be controlled through user input on the touchscreen display 2804. In another example, the weights W1 and W2 of the virtual weight objects could be controlled through user input on the touchscreen display 2804. Based on the control of said at least one parameter of the virtual objects, the smart device 2802 provides a feedback to the user for modifying user behaviour in the real world. In one implementation, the smart device 2802 may be configured to store all these activities in its memory (not shown). In one implementation, the user may at any time, i.e., before/during/after operation, change the value of the at least one parameter of any of the virtual objects.
In one embodiment, instead of using the spring and weight system as a virtual object, any other virtual object that enables a change of shape or position may be incorporated to implement the invention. In one example, a user may crush a virtual balloon object through movements of his/her cervical neck in the real world, which are reflected in the virtual world on a real time basis. In another example, a user may deform the shape of an aerobic ball through movements of his/her cervical neck in the real world, which are reflected in the virtual world on a real time basis. It will be appreciated by those skilled in the art that any virtual object, whose position or size could be controlled, could be used as the virtual object within the spirit of the invention.
Figure 29 illustrates an exemplary virtual aerobic ball system 2901 for exercise of a cervical neck 2902 of a user on a computing device 2903 having a touchscreen display 2904. The computing device 2903 receives via an internal or external camera (not shown) a plurality of image frames, such as a video, of movements of the cervical neck 2902 in real world. The touchscreen display 2904 of the computing device 2903 displays the plurality of image frames resulting in manipulation of at least one of the virtual objects, for example, deformation of a left virtual aerobic ball as shown in the figure. The computing device 2903 allows control of at least one parameter of at least one of the virtual objects in a virtual world shown on the touchscreen display 2904. For example, the elasticity E1 and E2 of the virtual aerobic balls could be controlled through user input on the touchscreen display 2904. Based on the control of said at least one parameter of virtual objects, the computing device 2903 provides a feedback to the user for modifying user behaviour in real world. In one implementation, the computing device 2903 may be configured to store all these activities on a server (not shown) or in its internal/external memory (not shown). In one implementation, the user may anytime, i.e., before/during/after operation, change the value of the at least one parameter of any of the virtual objects.
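The elasticity-controlled deformation of the virtual aerobic ball can be approximated by shrinking the radius in proportion to the push and in inverse proportion to the elasticity. The formula and the 10% floor below are illustrative assumptions, not values from the disclosure:

```python
def deformed_radius(radius: float, push: float, elasticity: float) -> float:
    """The less elastic the virtual ball, the more a given push flattens it;
    the deformation is floored at 10% of the original radius."""
    squash = push / max(elasticity, 1e-9)  # guard against zero elasticity
    return max(radius - squash, 0.1 * radius)
```

Raising the elasticity parameter E1 or E2 through the touchscreen thus makes the ball harder to deform, requiring larger neck movements in the real world for the same visual effect.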
Figure 30 illustrates an exemplary virtual balloon buster system for exercise of a hand 3001 of a user on a computer tablet 3002 having a front camera 3003 and a touchscreen display 3004. For the exercise, the user needs to move his/her hand in such a way that a virtual racket having nails busts the virtual balloons, as shown in the figure. To this end, the computer tablet 3002 receives via its front camera 3003 a plurality of image frames, such as a video, of movements of the hand 3001 in the real world. The touchscreen display 3004 of the computer tablet 3002 displays the plurality of image frames, resulting in manipulation of at least one of the virtual objects. The computer tablet 3002 allows control of at least one parameter of at least one of the virtual objects or the virtual world shown on the touchscreen display 3004. For example, the speed S1 and S2 of the virtual rackets shown in the figure could be controlled through user input on the touchscreen display 3004. In another example, the density of the medium of the virtual world could be controlled through user input on the touchscreen display 3004. Based on the control of said at least one parameter, the computer tablet 3002 provides a feedback to the user for modifying user behaviour in the real world. In one implementation, the computer tablet 3002 may be configured to store all these activities in its memory (not shown). In one implementation, the user may at any time, i.e., before/during/after operation, change the value of the at least one parameter of any of the virtual objects.
The systems and methods, implemented as per the invention, could be used to monitor cervical/neck movement so that users could correct their posture to avoid problems related to the cervical/neck pain. The invention could also be used for the assessment of cervical range of motion as well as an interactive rehabilitation. The invention could be used to assist doctors in assessment or diagnostics of cervical/neck pain to help them achieve improved results alongside their conventional techniques or manual physical assessment. The invention could be used to do an interactive comparative analysis with minor corrections on deviations which the user would have had following the assessment.
While certain present preferred embodiments of the invention have been illustrated and described herein, it is to be understood that the invention is not limited thereto, but may be otherwise variously embodied and practiced within the scope of the following claims.
CLAIMS:
We claim:
1. A method for providing a feedback for modifying user behaviour in real world based on control of at least one parameter of at least one object in a virtual world, said method comprising the steps of:
receiving a plurality of image frames of a user indicating a first action in real world;
displaying, in the virtual world, a second action corresponding to the first action, the second action resulting in manipulation of said at least one object in the virtual world;
allowing for the control of said at least one parameter of said at least one object in the virtual world; and
based on the control of said at least one parameter of said at least one object in the virtual world, providing the feedback for modifying user behaviour in real world.
2. The method as claimed in claim 1, wherein the at least one object is a combination of a virtual spring and a virtual weight.
3. The method as claimed in claim 1, wherein said at least one parameter is a physical property associated with said at least one object or the virtual world.
4. The method as claimed in claim 3, wherein the physical property is spring constant (K) of a virtual spring.
5. The method as claimed in claim 3, wherein the physical property is a weight (W) of a virtual weight.
6. The method as claimed in claim 3, wherein the physical property is the elasticity of said at least one object.
7. The method as claimed in claim 3, wherein the physical property is density of medium in the virtual world.
8. The method as claimed in claim 1, wherein the first action is Extension (E), Flexion (F), Right rotation (RR), Left Rotation (LR), Left lateral flexion (LLF), or Right lateral flexion (RLF) of cervical neck.
9. The method as claimed in claim 1, wherein the second action corresponding to the first action is generated on a real time basis.
10. The method as claimed in claim 1 further comprising:
receiving a value of said at least one parameter from a user interface.
11. The method as claimed in claim 1 further comprising:
recording data corresponding to a user using a unique user-id.
12. A system (1800) for providing a feedback for modifying user behaviour in real world based on control of at least one parameter of at least one object in a virtual world, said system (1800) comprising:
an input unit (1802) for receiving a plurality of image frames of a user indicating a first action in real world;
a processing unit (1807) coupled with a display unit (1806) for displaying, in the virtual world, a second action corresponding to the first action, the second action resulting in manipulation of said at least one object in the virtual world;
the processing unit (1807) being adapted to allow for the control of said at least one parameter of said at least one object in the virtual world; and
the processing unit (1807) and the display unit (1806) being further adapted to, based on the control of said at least one parameter of said at least one object in the virtual world, providing the feedback for modifying user behaviour in real world.
13. The system (1800) as claimed in claim 12, wherein the at least one object is a combination of a virtual spring and a virtual weight.
14. The system (1800) as claimed in claim 12, wherein said at least one parameter is a physical property associated with said at least one object or the virtual world.
15. The system (1800) as claimed in claim 14, wherein the physical property is spring constant (K) of a virtual spring.
16. The system (1800) as claimed in claim 14, wherein the physical property is a weight (W) of a virtual weight.
17. The system (1800) as claimed in claim 14, wherein the physical property is the elasticity of said at least one object.
18. The system (1800) as claimed in claim 14, wherein the physical property is density of medium in the virtual world.
19. The system (1800) as claimed in claim 12, further comprising:
a face detector (1801) coupled with the input unit (1802) for detecting face of a user based on the received plurality of image frames.
20. The system (1800) as claimed in claim 12, further comprising:
a stick figure model (1805) for generating the second action corresponding to the first action on a real time basis.
21. A computer readable data storage medium storing a computer program for providing a feedback for modifying user behaviour in real world based on control of at least one parameter of at least one object in a virtual world, the computer program, when executed on a computer, executing the steps of:
receiving a plurality of image frames of a user indicating a first action in real world;
displaying, in the virtual world, a second action corresponding to the first action, the second action resulting in manipulation of said at least one object in the virtual world;
allowing for the control of said at least one parameter of said at least one object in the virtual world; and
based on the control of said at least one parameter of said at least one object in the virtual world, providing the feedback for modifying user behaviour in real world.
| 22 | 2640-DEL-2014-IntimationOfGrant24-08-2022.pdf | 2022-08-24 |
| 22 | 2640-DEL-2014-GPA-(25-09-2014).pdf | 2014-09-25 |
| 1 | search_29-07-2019.pdf |