Abstract: An interactive humanoid robot (200) that assists in teaching in a classroom learning environment is disclosed herein. The robot (200) receives a signal indicating a question being raised by a student. The robot (200) processes the signal using a processor (210) that is configured to provide contextual based interaction in a collaborative classroom environment. The collaborative classroom environment comprises one or more students, one or more teachers, and the robot (200). Further, the robot (200) performs speech-to-text conversion. The robot determines the identity of the student. Further, the robot (200) determines the answer to the question raised by the student and delivers the answer. To be published with FIG. 2
FIELD OF THE INVENTION
[001] The present disclosure generally relates to robots, and more specifically, relates to a humanoid robot that can interact with students and teachers to assist students in collaborative classroom learning.
BACKGROUND OF THE INVENTION
[002] Access to quality education is a fundamental right, and in many parts of the world there is a scarcity of quality teachers. Quality education involves providing academic education along with the overall intellectual, emotional, social, physical, and creative development of the student. At times, even teachers who are efficient in academic teaching tend to shift focus as they are burdened with other work, such as administrative tasks, taking attendance, conducting assessments, etc.
[003] Currently, there are many online web-based platforms and electronic devices available for assisting students in their academic learning pursuit. However, it has been found that students learn better when there is an embodied social agent teaching them, as compared to an online web-based platform or an electronic device. Humanoid robots can function as the embodied social agent and teach like a teacher in a classroom environment. Humanoid robots are robots that are built to resemble a human body and configured to achieve a specific task or function, for example, assisting in teaching. These robots can relieve the teacher of many duties, including lesson planning, completing the syllabus, etc., and allow the teacher to provide personalized learning to students by focusing on competency building and other individual needs of the student.
[004] Existing humanoid robots are limited in their cognitive capability of understanding a classroom learning environment. Existing humanoid robots are not scalable to the various tasks that happen in a classroom learning environment, such as answering students' questions, conducting assessments, and delivering educational content according to the academic syllabus, and they are unable to assist students in collaborative classroom learning.
BRIEF SUMMARY OF THE INVENTION
[005] This summary is provided to introduce a selection of concepts in a simple manner that is further described in the detailed description of the disclosure. This summary is not intended to identify key or essential inventive concepts of the subject matter nor is it intended for determining the scope of the disclosure.
[006] The present disclosure discloses an interactive humanoid robot to facilitate collaborative classroom learning. The interactive humanoid robot includes a processor configured to provide contextual based interaction in a collaborative classroom environment. The collaborative classroom environment comprises one or more students, one or more teachers and the robot.
[007] The present disclosure also discloses a method of collaborative classroom learning. The method includes recognizing a query from a student by the humanoid robot, determining the identity of the student by the humanoid robot, analyzing the context of the query by the humanoid robot, and providing an answer to the query by the humanoid robot. Providing the answer to the query includes searching within a database, searching the internet, or passing on the query to a teacher by the humanoid robot if the robot is unable to answer the query.
[008] To further clarify advantages and features of the present disclosure, a more particular description of the disclosure will be rendered by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting of its scope. The disclosure will be described and explained with additional specificity and detail with the accompanying figures.
BRIEF DESCRIPTION OF THE FIGURES
[009] These and other features, aspects, and advantages of the exemplary embodiments can be better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0010] FIG. 1 illustrates an anterior view of an interactive humanoid robot in accordance with an embodiment of the present disclosure;
[0011] FIG. 2 illustrates an anterior view of an interactive humanoid robot in accordance with another embodiment of the present disclosure;
[0012] FIG. 3 illustrates a flow diagram of how the interactive humanoid robot identifies a student and interacts with the student, in accordance with an embodiment of the present disclosure;
[0013] FIG. 4 illustrates a collaborative learning model, in accordance with an embodiment of the present disclosure;
[0014] FIG. 5 illustrates a flow diagram of an objective type assessment model, in accordance with an embodiment of the present disclosure;
[0015] FIG. 5A illustrates a teacher user-interface webpage, in accordance with an embodiment of the present disclosure;
[0016] FIG. 5B shows a student webpage, in accordance with an embodiment of the present disclosure;
[0017] FIG. 6 illustrates a control centre architecture for managing robots, in accordance with an embodiment of the present disclosure;
[0018] FIG. 6A and FIG. 6B illustrate dashboards, in accordance with an embodiment of the present disclosure; and
[0019] FIG. 6C, FIG. 6D, and FIG. 6E illustrate robot interface pages, in accordance with an embodiment of the present disclosure.
[0020] Further, skilled artisans will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the robot, one or more components of the robot may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the figures with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION OF THE INVENTION
[0021] For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the figures and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.
[0022] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
[0023] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion such that a process or method that comprises a list of steps does not comprise only those steps but may comprise other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components. Appearances of the phrases "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0024] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
[0025] In addition to the illustrative aspects, exemplary embodiments, and features described above, further aspects, exemplary embodiments of the present disclosure will become apparent by reference to the drawings and the following detailed description.
[0026] Embodiments of the present disclosure describe an interactive humanoid robot that assists in teaching in a classroom learning environment.
[0027] FIG. 1 illustrates an anterior view of an interactive humanoid robot in accordance with an embodiment of the present disclosure. A humanoid robot is a robot that is built to resemble a human body and configured to achieve a specific task or a function. The present disclosure discusses how the humanoid robot interacts with the students and teachers in a classroom learning environment.
[0028] The interactive humanoid robot 100 includes a head 105, an upper body 110, hands (115A, 115B), and a lower body 120.
[0029] The head 105 is manufactured by 3D printing and is made of a plastic material, for example, acrylonitrile butadiene styrene (ABS), which is a common thermoplastic polymer. The head 105 comprises static eyes and jaws. Further, it includes a neck that is movable in the left and right directions.
[0030] The upper body 110 is static and not movable. The upper body houses a camera for facial recognition of students. Further, the hands (115A, 115B) are robotic arms that are programmed to provide direct and inverse kinematics. The translation and rotation motion of the joints in the hands (115A, 115B) can be achieved with direct kinematics. The overall motion of the hands (115A, 115B) can be defined with inverse kinematics. The hands (115A, 115B) can be divided into multiple parts that are connected using joints. Parts can resemble human body parts, such as the shoulder, wrist, etc., while joints are locations where motion is allowed between parts. Each joint can have a certain degree of freedom. The lower body 120 is designed to look like a woman's skirt. The upper body 110 and the lower body 120 house the internal components, such as the battery and the NUC. One of the core system components is the Next Unit of Computing (NUC) system by Intel®. The NUC system has an internal processor and is used to power the entire robot and all its movements. In one embodiment, the motors used for the motion of the robot are the Dynamixel Model 64-T and Model AX-18. Further, the robot is programmed with intelligent functionalities such as speech-to-text and text-to-speech conversion, identification of the student, understanding the questions raised by a student, answering the questions raised by a student according to the grade of the student, conducting formative assessments of the students, etc.
[0031] FIG. 2 illustrates an anterior view of an interactive humanoid robot 200 in accordance with another embodiment of the present disclosure. The head 205 can be a touch screen used to input various commands to the robot. It is to be noted that the external appearance of the humanoid robot can vary and is not restricted to the embodiments of the present disclosure. The interactive humanoid robot facilitates collaborative classroom learning. The interactive humanoid robot includes a processor 210 configured to provide contextual based interaction in a collaborative classroom environment. The collaborative classroom environment comprises one or more students, one or more teachers, and the robot. The robot is configured to: recognize a query from a student, determine the identity of the student, analyze the context of the query, and provide an answer to the query. The robot provides the answer to the query by one of searching within a database or searching the internet; or the robot passes on the query to a teacher if the robot is unable to answer the query. Another aspect of the robot is its self-learning feature: it understands the context and, through reinforcement learning, updates itself in case of corrections or new context. Another aspect of the robot is that it can analyze the sentiments of a student through text and verbal inputs, and it can provide positive as well as negative reviews of the student's answers. Further, the robot is configured to conduct assessments of the students.
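By way of illustration only, the sentiment-analysis aspect could be sketched as follows in Python, assuming NLTK's VADER analyzer; the disclosure does not name the sentiment library actually used by the robot.

```python
# A minimal sketch of sentiment analysis on a student's text or transcribed
# verbal input; NLTK's VADER analyzer is an assumption, not the robot's
# actual component.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

def sentiment_of(text: str) -> str:
    """Classify an input as positive or negative, supporting the robot's
    positive/negative review of student answers."""
    score = SentimentIntensityAnalyzer().polarity_scores(text)["compound"]
    return "positive" if score >= 0 else "negative"

print(sentiment_of("This lesson was great"))                 # positive
print(sentiment_of("This lesson was terrible and confusing"))  # negative
```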
[0032] FIG. 3 illustrates a flow diagram of how the interactive humanoid robot identifies a student and interacts with the student, in accordance with an embodiment of the present disclosure.
[0033] The flow diagram starts at step 300.
[0034] At step 305, the robot receives a signal indicating a question being raised by a student. The student raises a hand or stands up to speak. The robot detects the motion and the origin of the sound, and turns towards the student to respond.
[0035] At step 310, the robot recognizes the question. Recognizing the question involves identifying the start point and the end point of the question. This is achieved through a microphone beep at the student's end, through which the robot determines the start point and the end point of the question. Further, the robot performs speech-to-text conversion.
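A minimal sketch of the speech-to-text step is given below, assuming the open-source SpeechRecognition Python package and its Google Web Speech backend; the disclosure does not specify the speech engine used.

```python
# A minimal sketch of the speech-to-text step, assuming the open-source
# SpeechRecognition package; the disclosure does not name the robot's
# actual speech engine.
import speech_recognition as sr

def capture_question() -> str:
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # calibrate for classroom noise
        audio = recognizer.listen(source)  # listen until silence (end point)
    try:
        return recognizer.recognize_google(audio)  # convert speech to text
    except sr.UnknownValueError:
        return ""  # speech was unintelligible
```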
[0036] At step 315, the robot determines the identity of the student. Determining the identity of the student involves multiple sub-steps. Firstly, the robot retrieves the name of the student and other identifying details of the student from a database. Next, the robot determines the level score of the student. Lastly, the robot identifies the grade in which the student is studying.
[0037] At step 320, the robot determines the answer to the question raised by the student. Determining the answer involves searching a predefined database that includes answers to various questions. In one embodiment, a single question can have multiple answers according to the grade of the student. For example, a 7th-grade student's grasp of a certain concept, for example black holes, will be different from a 12th-grade student's grasp of the same concept. The robot can identify the grade of the student and accordingly choose the appropriate answer from the set of answers to a question. In another embodiment, the robot can search the internet to find the answer to the question.
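The grade-aware answer selection can be illustrated with the following sketch; the dictionary schema, helper name, and fallback behaviour are assumptions for illustration only, not the patent's actual data model.

```python
# Illustrative sketch of grade-aware answer lookup; the schema and helper
# names (answers keyed by grade) are assumptions.
QA_DB = {
    "what is a black hole": {
        7:  "A black hole is a place in space where gravity is so strong "
            "that even light cannot escape.",
        12: "A black hole is a region of spacetime where gravity, described "
            "by general relativity, prevents anything from escaping past "
            "the event horizon.",
    },
}

def answer_for(question: str, grade: int) -> str | None:
    answers = QA_DB.get(question.lower().strip())
    if answers is None:
        return None  # not in the database; fall back to an internet search
    # Pick the answer written for the closest grade at or below the student's.
    eligible = [g for g in answers if g <= grade] or [min(answers)]
    return answers[max(eligible)]

print(answer_for("What is a black hole", grade=8))  # returns the grade-7 answer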
[0038] At step 325, the robot delivers the answer to the question. Answering the question involves multiple steps. Firstly, the answer needs to be converted from text to speech. Then, it needs to be delivered to the student. Lastly, the robot needs to determine whether the student is satisfied with the answer. If the student is unable to understand the answer, the robot determines the complexity and the grade level, and delivers the next, simpler relevant answer to the question.
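The text-to-speech delivery step could look like the following sketch, assuming the offline pyttsx3 library; the robot's actual TTS engine is not named in the disclosure.

```python
# A minimal sketch of text-to-speech delivery, assuming pyttsx3;
# the speech rate chosen here is an illustrative value.
import pyttsx3

def speak(answer_text: str) -> None:
    engine = pyttsx3.init()
    engine.setProperty("rate", 150)  # a slower rate for classroom clarity
    engine.say(answer_text)
    engine.runAndWait()  # block until the answer has been spoken

speak("A black hole is a place in space where gravity is very strong.")
```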
[0039] At step 330, the flow diagram stops.
[0040] In one embodiment, the robot conducts formative assessments in the class. During an assessment, if the robot finds that the majority of students are unable to answer because the complexity level of the questions is high, the robot can restructure the assessment to step down the difficulty level. It is to be noted that the step-down or step-up of the difficulty level is performed instantaneously, on the go, during the assessment. Once assessments are done, the robot stores the results in the database and can provide feedback to the students and teachers about the performance of the students. The stored results are further used to assess the student when the student raises a question, and to answer according to the student's grasping capacity. This process is iterative, and the robot assesses the students throughout their academic tenure on the campus. Formative assessments can be used to determine the level score of the students.
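The on-the-go difficulty adjustment can be sketched as follows; the majority criterion follows the description above, while the step-up threshold and the level range are illustrative assumptions.

```python
# A minimal sketch of on-the-go difficulty adjustment during a formative
# assessment; thresholds and the level range are illustrative assumptions.
def adjust_difficulty(current_level: int, wrong_fraction: float) -> int:
    """Step the assessment difficulty down when most students answer
    incorrectly, and up when almost everyone answers correctly."""
    if wrong_fraction > 0.5 and current_level > 1:
        return current_level - 1  # step down: questions were too hard
    if wrong_fraction < 0.1 and current_level < 5:
        return current_level + 1  # step up: questions were too easy
    return current_level

print(adjust_difficulty(current_level=3, wrong_fraction=0.7))  # -> 2
```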
[0041] The robot provides collaborative classroom learning. The method of collaborative classroom learning includes recognizing a query from a student by the humanoid robot, determining the identity of the student by the humanoid robot, analyzing the context of the query by the humanoid robot, and providing an answer to the query by the humanoid robot. Providing the answer to the query includes searching within a database, searching the internet, or passing on the query to a teacher by the humanoid robot if the robot is unable to answer the query.
[0042] FIG. 4 illustrates a collaborative learning model, in accordance with an embodiment of the present disclosure. The collaborative learning model is a classroom learning model in which the teacher, the students, and the robot interact to provide collaborative classroom learning. Classroom interactions between students and the teacher are usually limited to content-based questions that take up most of the teacher's time, not allowing him/her to delve into areas of the lesson that cannot be covered by the internet, such as the concept and relevance. In order to overcome the aforementioned problem, a collaborative learning model is developed. In collaborative learning, an automated conversational model is developed using the robot, which provides the students with accurate and detailed information in a human-like manner, allowing the teacher to focus on the concept and relevance of the lesson. The collaborative learning model, also interchangeably referred to as the conversational model, involves the student, the teacher, and the robot. The robot asks a question to a student and the student answers the robot; this works vice versa too, where the student asks the robot a question. If the robot is unable to provide appropriate answers, then the teacher resolves the student's queries. The system for the conversational model consists of three parts. They are given below:
1. Q & A Sources
2. Database
3. Reinforcement learning model
[0043] 1. Q&A Sources: Generally, teachers provide most of the frequently asked questions (FAQs), which are uploaded to the database. However, sometimes these questions are not sufficient. In that case, the robot looks for answers within the lesson content. If the answer is not found within the content, the robot searches the Internet to resolve the student's query. If the robot is unable to find an answer on the Internet, the question is passed on to the teacher. The lesson content is also used for question generation. For question generation, the robot uses a text-to-text transformer and the Generative Pre-trained Transformer 2 (GPT-2) model.
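For illustration, question generation from lesson content might be sketched with the Hugging Face transformers library as below; the specific fine-tuned checkpoint and the highlight-token input format are assumptions, since the disclosure names only GPT-2 and a text-to-text transformer.

```python
# An illustrative question-generation sketch using the Hugging Face
# transformers library; the fine-tuned checkpoint below and the <hl>
# highlight format are assumptions, not the robot's actual model.
from transformers import pipeline

generator = pipeline("text2text-generation", model="valhalla/t5-base-qg-hl")

# Highlight the desired answer span with <hl> tokens inside the lesson text.
prompt = ("generate question: <hl> A black hole <hl> is a region of "
          "spacetime where gravity is so strong that nothing can escape it.")
print(generator(prompt)[0]["generated_text"])
```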
[0044] 2. Database: In this model, MongoDB is used as the database because of its NoSQL nature, easy scalability, and JSON format of storage. MongoDB has the database at the top layer, collections at the second layer, and documents at the third and bottom layer. The same database is used for Q&A. A collection is a group of documents of the same type.
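The database-collection-document layout can be illustrated with pymongo as follows; the database, collection, and field names are assumptions for illustration.

```python
# A minimal sketch of the Q&A storage layout in MongoDB via pymongo;
# database, collection, and field names are illustrative assumptions.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed connection string
qa = client["classroom"]["faq"]  # database -> collection -> documents

# Store one FAQ document (JSON-like) with grade-specific answers.
qa.insert_one({
    "question": "what is a black hole",
    "answers": {"7": "A region in space where gravity traps even light."},
    "keywords": ["black hole", "gravity"],
})

doc = qa.find_one({"question": "what is a black hole"})
print(doc["answers"]["7"])
```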
[0045] 3. Reinforcement Learning Model: The reinforcement learning model has the capability to learn. The model starts with basic, hardcoded rules. In one example, the basic hardcoded rules are as given below (a sketch implementing rules I to III follows the list):
I. If the fed-in correct answer and the student's answer are more than 50% similar and the sentiment is the same, or at least one keyword is present in the answer, then the robot considers it a right answer.
II. If the student's question is similar to a database question, with the same meaning and more than a 50% match, the robot chooses that question and gives the student the answer to that question.
III. If the database does not have a similar question, the robot searches for the answer in the lesson script. In this case, the answer is considered right only if the answer accuracy is more than 70%.
IV. If the 2nd and 3rd rules are not satisfied, the robot queries the Internet for answers. Using the 3rd rule, the robot selects the best answer from among the Internet answers.
V. All questions generated from the transformer model should have a minimum accuracy of 80%.
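The sketch below illustrates how rules I to III might be encoded; difflib.SequenceMatcher and the sentiment flag are stand-ins for the robot's actual similarity and sentiment components, while the 50% threshold comes from the rules above.

```python
# An illustrative encoding of rules I-III using simple string similarity;
# SequenceMatcher and the same_sentiment flag are assumptions standing in
# for the robot's actual NLP components.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_right_answer(correct: str, student: str, keywords: list[str],
                    same_sentiment: bool) -> bool:
    # Rule I: >50% similarity with matching sentiment, or a keyword hit.
    if similarity(correct, student) > 0.5 and same_sentiment:
        return True
    return any(kw.lower() in student.lower() for kw in keywords)

def match_db_question(query: str, db_questions: list[str]) -> str | None:
    # Rule II: pick the stored question that matches the query by >50%.
    best = max(db_questions, key=lambda q: similarity(query, q), default=None)
    if best is not None and similarity(query, best) > 0.5:
        return best
    return None  # Rules III/IV: fall back to the lesson script, then the Internet
```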
[0046] All these rules serve as baselines for the reinforcement learning, and Q-learning, which is essentially a reward-and-penalty-based approach, is used as the learning approach. Reinforcement learning involves an agent, a set of states, and a set of actions per state. By performing an action, the agent transitions from state to state. Executing an action in a specific state provides the agent with a reward.
[0047] The goal of the agent is to maximize its total reward. It does this by adding the maximum reward attainable from future states to the reward for achieving its current state, effectively influencing the current action by the potential future reward. This potential reward is the weighted sum of the expected values of the rewards of all future steps starting from the current state.
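A minimal tabular Q-learning update reflecting this description is sketched below; the states, actions, learning rate, and discount factor are illustrative assumptions.

```python
# A minimal tabular Q-learning update, sketching the reward-and-penalty
# approach described above; states, actions, and parameters are illustrative.
from collections import defaultdict

ALPHA, GAMMA = 0.1, 0.9  # learning rate and discount factor (assumed values)
Q = defaultdict(lambda: defaultdict(float))  # Q[state][action] -> value

def q_update(state, action, reward, next_state, next_actions):
    """One Q-learning step: move Q(s, a) toward the reward plus the
    discounted best value attainable from the next state."""
    best_next = max((Q[next_state][a] for a in next_actions), default=0.0)
    Q[state][action] += ALPHA * (reward + GAMMA * best_next - Q[state][action])

# Example: the robot is rewarded when its chosen answer satisfies the student.
q_update("question_asked", "answer_from_db", reward=1.0,
         next_state="student_satisfied", next_actions=["end_interaction"])
```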
[0048] FIG. 5 illustrates a flow diagram of an objective-type assessment model, in accordance with an embodiment of the present disclosure. Using the objective-type assessment model, one can assess whether the student has understood the concepts of a lesson taught in class and resolve doubts that may arise during the course of learning. The functioning of the assessment model consists of the following:
[0049] 1. Teacher user-interface webpage,
[0050] 2. Student webpage for quiz, results, further reading material, and feedback,
[0051] 3. Server database,
[0052] 4. Doubt clearing based on results through the robot.
[0053] 1. Teacher user-interface webpage: The teacher user-interface webpage is shown in FIG. 5A. Every assessment consists of two concepts, and each concept has two questions. The teacher needs to go to the webpage and upload the assessment. All lessons are designed such that a lesson can have only two concepts, and questions are chosen such that they clearly assess conceptual understanding.
[0054] 2. Student webpage for quiz, results, further reading material, and feedback: FIG. 5B shows the student webpage with the 'Give Test' and 'Results' links. Through the 'Give Test' icon, the student can attempt the assessment, and through the 'Results' icon, the student can see the assessment results along with further reading materials.
[0055] 3. Server database: The server database stores all the assessment information, results, further reading material, and students' feedback. The server database is further explained in detail in conjunction with FIG. 6.
[0056] 4. Doubt clearing based on results through the robot: Apart from individual results, the robot also analyses the overall class performance and checks whether more than 50% of students gave the wrong answer to either of the questions; if so, the robot further clarifies the concept.
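The class-level check might be implemented as in the following sketch; the per-question result structure is an assumption made for illustration.

```python
# A small sketch of the class-level doubt-clearing check; the result
# structure (one list of per-student booleans per question) is an assumption.
def concepts_to_reclarify(results: dict[str, list[bool]]) -> list[str]:
    """Return the questions where more than 50% of the class answered
    incorrectly, so the robot can re-explain the underlying concept."""
    flagged = []
    for question, answers in results.items():
        wrong = sum(1 for correct in answers if not correct)
        if answers and wrong / len(answers) > 0.5:
            flagged.append(question)
    return flagged

# Example: question "Q2" is flagged because 3 of 4 students got it wrong.
print(concepts_to_reclarify({"Q1": [True, True, False, True],
                             "Q2": [False, False, True, False]}))
```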
[0057] FIG. 6 illustrates a control centre architecture for managing robots, in accordance with an embodiment of the present disclosure. In general, managing robots, including controlling, updating, handling, and monitoring multiple robots, is a troublesome job. Apart from this, due to the cost and memory (RAM) requirements of advanced NLP programs, implementing individual programs on each robot to make it intelligent is not efficient. It is an objective of the present disclosure to build a Control Centre that is efficient and scalable, that can control, debug, and transfer data with minimal lag to all robots, and where all the advanced NLP programs can be implemented in one place and used by all the robots. The Control Centre consists of the following modules:
1. APIs for data transfer between robots and server,
2. Dashboard for graphical representation,
3. File system for lesson upload, view, and transfer,
4. Different roles of users according to functions,
5. Robot interface page giving control and access for monitoring robots.
[0058] 1. APIs for data transfer between robots and server: Flask is used as the API library between the robots and the server because of its microframework structure, and Flask's security extensions are used for security. Every function has a specific Python function for data transfer.
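One such data-transfer API might be sketched with Flask as follows; the route, payload, and in-memory storage are illustrative assumptions rather than the Control Centre's actual endpoints.

```python
# A minimal sketch of one data-transfer API between a robot and the server
# using Flask; the route and payload fields are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)
RESULTS = []  # in-memory stand-in for the server database

@app.route("/api/assessment-results", methods=["POST"])
def upload_results():
    """Receive assessment results posted by a robot after a quiz."""
    payload = request.get_json(force=True)
    RESULTS.append(payload)
    return jsonify({"status": "stored", "count": len(RESULTS)}), 201

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```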
[0059] 2. Dashboard for graphical representation: The dashboards shown in FIG. 6A and FIG. 6B show the data for the total number of robots, students, classes, and lessons, as well as overall feedback and module-based feedback.
[0060] 3. File system for lesson upload, view, and transfer: The file system is common to all schools. Every school can have multiple curricula, and every curriculum can have multiple languages. The file system contains all subjects, topics, and sub-topics, grade-wise and curriculum-wise.
[0061] 4. Different roles of users according to functions: The Control Centre has multiple user roles for seamless functioning and to avoid overlap. Through the Control Centre, all roles can be assigned, edited, or modified. Student, teacher, manager (one per school), and admin-manager roles can be assigned.
[0062] 5. Robot interface page giving control and access for monitoring robots: Through the robot interface page, the admin manager can monitor school activity and grades, as well as check the present working status of all robots in a school. FIG. 6C, FIG. 6D, and FIG. 6E show robot information details.
[0063] It is to be noted that the implementation of the present disclosure is not limited to determining the question raised by a student and delivering the answer, or to conducting formative assessments of students; it also comprises various other functions that the robot can perform, including interacting with teachers to understand their workload and syllabus coverage, getting feedback from the teachers regarding the delivery of the content, guiding students in their overall competency level and in extra-curricular activities, etc. The robot can deliver content through graphics, audio, video, or any multimedia. The robot can also ask questions to students and evaluate their answers, apart from answering student queries. The collaborative learning model enables the teacher to focus on better learning outcomes for the students and removes the mundane task of delivering content, which can otherwise be delivered by the robot. Because classroom interactions between students and the teacher are usually limited to content-based questions that take up most of the teacher's time, not allowing him/her to delve into areas of the lesson that cannot be covered by the internet, such as the concept and relevance, the humanoid robot aids the teachers in achieving better results for their students.
[0064] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0065] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible.
CLAIMS:
1. An interactive humanoid robot (200) to facilitate collaborative classroom learning, the interactive humanoid robot comprising:
a processor (210) configured to provide contextual based interaction in a collaborative
classroom environment, wherein the collaborative classroom environment comprises one or more students, one or more teachers and the robot.
2. The interactive humanoid robot (200) as claimed in claim 1, wherein the robot is configured to:
recognize a query from a student;
determine identity of the student;
analyse context of the query; and
provide answer to the query.
3. The interactive humanoid robot (200) as claimed in claim 2, wherein the robot provides answer to the query by one of:
searching within a database;
searching from the internet; and
passing on the query to a teacher if the robot is unable to answer the query.
4. The interactive humanoid robot (200) as claimed in claim 1, wherein the robot performs self-learning through reinforcement learning.
5. The interactive humanoid robot (200) as claimed in claim 1, wherein the robot is configured to analyze sentiments of a student through text and verbal inputs.
6. The interactive humanoid robot (200) as claimed in claim 1, wherein the robot is configured to conduct assessment of the students.
7. A method of collaborative classroom learning, the method comprising:
recognizing a query from a student by a humanoid robot;
determining identity of the student by the humanoid robot;
analyzing context of the query by the humanoid robot; and
providing answer to the query by the humanoid robot, wherein providing answer to the query comprises:
searching within a database;
searching from the internet; and
passing on the query to a teacher by the humanoid robot if the robot is unable to answer the query.
8. The method as claimed in claim 7, comprising self-learning through reinforcement learning.
9. The method as claimed in claim 7, comprising analyzing sentiments of a student through text and verbal inputs.
10. The method as claimed in claim 7, comprising conducting assessment of the students by the humanoid robot.