
Shopping Assistive Device

Abstract: A shopping assistive device comprises a U-shaped frame 101 having an elongated handle 102 to acquire a grip, an L-shaped telescopic gripper 108 for gripping a shopping bag, a force sensor for applying a force on the bag to check its strength, a user interface to input items to be purchased, a touch-enabled display unit 103 for the inputted items and the specific stores at which offers are to be availed, an imaging unit 104 to record an item to be purchased, a tactile sensor to detect the hardness of the item, a sensing unit 105 to determine defects in the item, an ultrasonic sensor for detecting cracks, a moisture sensor for detecting the moisture content of the item, a laser sensor for detecting the dimensions of the item, a NIR (near infrared) sensor for detecting the chemical composition of the item, a projection unit 106 for highlighting a detected defect, and a microphone 109 for actuation of features of the device.


Patent Information

Application #:
Filing Date: 18 March 2025
Publication Number: 13/2025
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Parent Application:

Applicants

Marwadi University
Rajkot – Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. Ibrahim Khalil
Department of Computer Engineering-Artificial Intelligence, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat, India.
2. Dr. Biswaranjan Acharya
Department of Computer Engineering-Artificial Intelligence, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat, India.
3. Prof. (Dr.) Madhu Shukla
Department of Computer Engineering-Artificial Intelligence, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to a shopping assistive device that assists the user in shopping by detecting the hardness of an item to determine the quality of its material, and by detecting defects in an item being purchased to prevent the user from buying a defective item.

BACKGROUND OF THE INVENTION

[0002] Ensuring the quality and durability of purchased items is essential for making informed buying decisions. Without accurate assessment, buyers may end up with substandard or damaged products, leading to financial loss and dissatisfaction. A shopping assistant minimizes the risk of buying low-quality items, ensures better value for money, and enhances the overall shopping experience by providing real-time, reliable quality assessment. It is also important for preventing misjudgments and ensuring that purchased products meet the required standards.

[0003] Traditional methods include visual inspection, manual touch assessments, seller claims, basic scratch tests, and comparison with similar products. Visual inspection helps identify surface flaws but cannot detect internal defects. Manual assessment relies on experience but lacks consistency. Seller claims often influence purchasing decisions, yet they may be exaggerated. Basic scratch tests can damage the item. These methods, while commonly used, often lead to inaccurate assessments and misjudgments.

[0004] WO9818094A1 discloses a personal shopping system including personal shopping devices carried by customers during purchase that selectively communicate with the store computer and a point-of-sale device. The system further includes a first storage device assigned to each customer, holding information regarding the customer's shopping profile, and a second storage device assigned to the customer, holding information on special offers of reduced prices on selected items, the special offers being generated by the store computer in accordance with the contents of the first storage device.

[0005] US10280054B2 discloses a shopping facility personal assistance system comprising: a plurality of motorized transport units located in and configured to move through a shopping facility space; a plurality of user interface units, each corresponding to a respective motorized transport unit during its use; and a central computer system having a network interface such that the central computer system wirelessly communicates with one or both of the plurality of motorized transport units and the plurality of user interface units, wherein the central computer system is configured to control movement of the plurality of motorized transport units through the shopping facility space based at least on inputs from the plurality of user interface units.

[0006] Conventionally, many devices are available in the market that help the user during shopping. However, the devices mentioned in the prior art lack the ability to assist the user in shopping by detecting hardness to determine the quality of a product. In addition, the mentioned devices are incapable of detecting defects in a purchase and of projecting a model of the purchased item to improve the decision-making efficiency of the user.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device that assists during shopping and detects hardness to determine the quality of a product, and that is further capable of detecting defects in a purchase and projecting a model of the purchased item to improve the decision-making efficiency of the user.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a device that is capable of assisting the user in shopping by detecting the hardness of an item to determine the quality of the material for a better choice.

[0010] Another object of the present invention is to develop a device that is capable of detecting defects in an item being purchased to prevent the user from buying a defective item.

[0011] Yet another object of the present invention is to develop a device that is capable of projecting the model of the purchased item on the installation site for a preview for enhancing the decision-making efficiency of the user.

[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0013] The present invention relates to a shopping assistive device that is capable of projecting the model of the purchased item on the installation site for a preview for enhancing the decision-making efficiency of the user.

[0014] According to an embodiment of the present invention, a shopping assistive device comprises a U-shaped frame having an elongated handle at a rear portion of the frame to acquire a grip; an L-shaped telescopic gripper installed on the frame for gripping a shopping bag; a force sensor embedded in the gripper for applying a force on the bag to check the strength of the bag; a user interface that enables a user to input items to be purchased and to input an occasion for which apparel is to be purchased, in order to generate suggestions of available items, along with details, suitable for the occasion; a touch-enabled display unit mounted on the handle for the inputted items and the specific stores at which the offers are to be availed; and an artificial intelligence-based imaging unit installed in the frame for recording and processing images to record an item to be purchased.

[0015] According to another embodiment of the present invention, the device comprises a tactile sensor embedded in the frame to detect the hardness of the item and determine its type of material; a sensing unit installed in the frame to determine defects in the item and alert the user, comprising an ultrasonic sensor for detecting cracks, a colour sensor for detecting the colour of the item, a moisture sensor for detecting the moisture content of the item, a laser sensor for detecting the dimensions of the item, and a NIR (near infrared) sensor for detecting the chemical composition of the item; a projection unit mounted on the frame by means of a ball and socket joint to project visuals highlighting a detected defect and to project a model of a purchased item onto an installation site for a preview by the user; and a microphone provided on the handle to provide voice commands for actuation of features of the device.

[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a shopping assistive device.

DETAILED DESCRIPTION OF THE INVENTION

[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0021] The present invention relates to a shopping assistive device that is capable of detecting defects in an item being purchased to prevent the user from buying a defective item, and of projecting a model of the purchased item on the installation site for a preview, enhancing the decision-making efficiency of the user.

[0022] Referring to Figure 1, an isometric view of a shopping assistive device is illustrated, comprising a U-shaped frame 101 having an elongated handle 102, a touch-enabled display unit 103 mounted on the handle 102, an artificial intelligence-based imaging unit 104 installed in the frame 101, a sensing unit 105 installed in the frame 101, a projection unit 106 mounted on the frame 101 by means of a ball and socket joint 107, an L-shaped telescopic gripper 108 installed on the frame 101, and a microphone 109 provided on the handle 102.

[0023] The device disclosed herein employs a U-shaped frame 101 having an elongated handle 102 at a rear portion of the frame 101 to enable a user to acquire a grip over the frame 101. This frame 101 is typically constructed from materials that include, but are not limited to, high-strength materials such as reinforced steel or durable aluminum alloys, which provide a robust and resilient enclosure capable of withstanding physical impacts and environmental stressors. The handle 102 assists the user in handling the device.

[0024] For activating the device, the user needs to press a push button which is arranged on the frame 101 which in turn activates all the related components for performing the desired task. After pressing the button, a closed electrical circuit is formed and current starts to flow that powers an inbuilt microcontroller to allow all the linked components to perform their respective task upon actuation.

[0025] On the handle 102, a microphone 109 is provided that is linked with the microcontroller to enable the user to provide voice commands for actuation of the features of the device. The microphone 109 processes the voice command from the user for actuation of the features of the device by converting sound waves into electrical signals. The signals are analog in nature. These analog signals are then digitized using an analog-to-digital converter (ADC) for further processing. The digital data undergoes pre-processing, including noise reduction and filtering, to improve clarity by eliminating background noise. The cleaned signal is passed for speech recognition powered by artificial intelligence, which analyzes the input to detect keywords or phrases. Once recognized, the microcontroller maps the command and triggers the actuation of the features of the device.
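The command pipeline described above (digitize, denoise, recognize, actuate) can be sketched as follows. This is an illustrative sketch only, not part of the specification; the ADC resolution, noise floor, keyword table, and action names are all hypothetical:

```python
# Sketch of the voice-command flow: quantize analog microphone levels,
# gate out low-level background noise, and map a recognized keyword to
# a device feature. Thresholds and command names are hypothetical.

def digitize(analog_levels, vref=3.3, bits=10):
    """Quantize analog voltages (V) to ADC codes with a uniform quantizer."""
    full_scale = (1 << bits) - 1
    return [round(min(max(v, 0.0), vref) / vref * full_scale) for v in analog_levels]

def denoise(samples, noise_floor=50):
    """Zero out samples below a noise floor (a crude noise gate)."""
    return [s if abs(s) >= noise_floor else 0 for s in samples]

COMMANDS = {  # hypothetical keyword -> feature mapping
    "scan": "activate_imaging_unit",
    "offers": "show_best_offers",
    "preview": "project_installation_preview",
}

def map_command(recognized_text):
    """Map recognized speech to a device actuation, or None if unrecognized."""
    for keyword, action in COMMANDS.items():
        if keyword in recognized_text.lower():
            return action
    return None
```

In a real device the recognition step would be an AI speech model rather than substring matching; the sketch only shows how a recognized phrase triggers an actuation.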

[0026] For enabling a user to input the items to be purchased, a user interface is installed with a computing unit. A communication unit (not shown) is linked with a microcontroller provided in the handle 102, to receive the inputted items from the computing unit. The user interface allows the user to input items to be purchased. The interface communicates with the computing unit to process inputs and provides visual feedback, ensuring a seamless shopping experience. The computing unit processes user inputs received from the interface, organizes the data, and formats it for transmission. The processed information is then sent to the communication unit, ensuring that the microcontroller in the handle 102 receives structured data for further processing and display. The communication unit facilitates data exchange between the computing unit and the microcontroller. The communication unit transmits the list of inputted items using protocols such as Wi-Fi. By ensuring real-time data transfer, it enables the microcontroller to process the user input.

[0027] On the handle 102, a touch-enabled display unit 103 is mounted, where a suggestion module linked with the microcontroller fetches and displays the best offers available for the inputted items and the specific stores at which the offers are to be availed, for reference of the user. The touch-enabled display unit 103 operates by receiving processed data from the microcontroller, which analyzes input from the user interface. This data is converted into a digital format and transmitted to the display via an integrated driver circuit. The panel, typically an LCD, uses pixels controlled by electrical signals to visually represent the best offers available for the inputted items and the specific stores at which the offers are to be availed. The suggestion module assists in retrieving and displaying the best available offers for items inputted by the user. When the user scans an item, the microcontroller processes this input and communicates with the suggestion module. The module then fetches relevant offers, taking into account factors such as store-specific discounts, ongoing deals, and user preferences. Once the best offers are identified, they are relayed to the touch-enabled display unit 103, allowing the user to view and compare deals conveniently.
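The offer lookup performed by the suggestion module might look like the following sketch, which keeps the best discount per inputted item together with the store at which it is availed. The offer table, field names, and stores are invented for illustration and are not part of the specification:

```python
# Hypothetical offer lookup: for each item the user inputs, keep the
# store offering the largest discount. Data below is illustrative only.

OFFERS = [  # (item, store, discount_percent)
    ("kettle", "Store A", 10),
    ("kettle", "Store B", 25),
    ("lamp",   "Store A", 15),
]

def best_offers(items, offers=OFFERS):
    """Return {item: (store, discount)} keeping the best discount per item."""
    best = {}
    for item, store, discount in offers:
        if item in items and (item not in best or discount > best[item][1]):
            best[item] = (store, discount)
    return best
```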

[0028] An artificial intelligence-based imaging unit 104 is configured in the frame 101 and integrated with a processor for recording and processing images in a vicinity of the frame 101, to record an item to be purchased in front of the frame 101. The imaging unit 104 comprises an image capturing arrangement including a set of lenses that captures multiple images in the vicinity of the frame 101, and the captured images are stored within a memory of the imaging unit 104 in the form of optical data. The imaging unit 104 also comprises the processor, integrated with artificial intelligence protocols, such that the processor processes the optical data and extracts the required data from the captured images. The extracted data is further converted into digital pulses and bits and transmitted to the microcontroller. The microcontroller processes the received data and evaluates the item to be purchased in front of the frame 101. The microcontroller actuates the display unit 103 to display details of the item for reference of the user.

[0029] In the frame 101, a tactile sensor (not shown) is embedded to detect the hardness of the item and determine a type of material of the item by working in synchronisation with the imaging unit 104. Further, the material is displayed on the display unit 103. The tactile sensor detects the hardness of the item by measuring the force and deformation characteristics when the item comes into contact with the sensor surface. Internally, the sensor consists of pressure-sensitive elements such as piezoresistive components that respond to applied force. When an item is pressed against the sensor, the material's resistance to deformation is recorded as a force-displacement relationship. Softer materials cause greater deformation under lower force, while harder materials show minimal deformation even under higher force. The sensor transmits this data to the microcontroller, which processes it in synchronization with the imaging unit 104 to analyze the texture and structural properties of the item. Hence, the type of material of the item is determined based on the detected hardness of the item.
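The force-displacement relationship above can be illustrated with a short sketch: the ratio of applied force to measured deformation serves as a stiffness estimate, which is then binned into a coarse material class. The thresholds and class labels are hypothetical, chosen only to show the logic:

```python
# Sketch of hardness classification from a force-displacement reading.
# Stiffness thresholds (N/mm) and material classes are hypothetical.

def stiffness(force_n, displacement_mm):
    """Force per unit deformation: harder materials deform less per newton."""
    return force_n / displacement_mm

def classify_material(force_n, displacement_mm, soft_max=5.0, medium_max=50.0):
    """Map a stiffness estimate to a coarse material class."""
    k = stiffness(force_n, displacement_mm)
    if k < soft_max:
        return "soft (e.g. fabric, foam)"
    if k < medium_max:
        return "medium (e.g. plastic, wood)"
    return "hard (e.g. metal, ceramic)"
```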

[0030] The imaging unit 104 works in synchronisation with a sensing unit 105 installed in the frame 101, to determine defects in the item and actuate the display unit 103 to alert the user regarding the defect. The sensing unit 105 comprises an ultrasonic sensor for detecting cracks, a colour sensor for detecting the colour of the item, a moisture sensor for detecting the moisture content of the item, a laser sensor for detecting the dimensions of the item, and a NIR (near infrared) sensor for detecting the chemical composition of the item. The ultrasonic sensor detects cracks by emitting high-frequency sound waves and analyzing their reflections from the surface of the item. The sensor consists of a transmitter that generates ultrasonic pulses and a receiver that captures the echoes. When the ultrasonic waves travel through the material, they typically pass through uninterrupted if the surface is intact. However, if there is a crack or defect, part of the wave gets reflected back earlier than expected or scatters differently. By measuring the time taken for the echoes to return and analyzing variations in wave intensity, the presence of cracks is determined.
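The pulse-echo timing logic can be sketched as follows, assuming the item's nominal thickness and the speed of sound in its material are known: an echo that returns from shallower than the back wall suggests an internal crack. The speed of sound and tolerance values are illustrative assumptions, not specification values:

```python
# Sketch of pulse-echo crack detection: convert the round-trip echo time
# to a reflector depth and compare it against the expected back-wall depth.
# Speed of sound (here, roughly that of steel) and tolerance are assumptions.

def echo_distance_mm(round_trip_us, speed_m_s=5900):
    """Depth of the reflecting surface from the round-trip time (microseconds)."""
    return speed_m_s * (round_trip_us * 1e-6) / 2 * 1000

def detect_crack(round_trip_us, thickness_mm, tolerance_mm=0.5, speed_m_s=5900):
    """An echo from shallower than the back wall indicates a likely crack."""
    depth = echo_distance_mm(round_trip_us, speed_m_s)
    return depth < thickness_mm - tolerance_mm
```

For a 10 mm item, a back-wall echo arrives after about 3.39 microseconds; an echo arriving noticeably earlier maps to a shallower reflector, i.e. a crack.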

[0031] The color sensor detects the color of the item by analyzing the light reflected from its surface. The color sensor typically consists of a light source, photodetectors, and optical filters. The sensor illuminates the item using an LED, and the reflected light is captured by photodetectors that measure the intensity of different wavelengths corresponding to red, green, and blue (RGB) components. By comparing the reflected light levels, the sensor determines the exact color of the item. Based on the identified color, the user determines whether a food item is spoiled.
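One simple way to act on RGB readings, as a sketch only, is to find the dominant channel and flag items whose color deviates strongly from a calibrated fresh reference. The reference values and deviation tolerance below are hypothetical:

```python
# Sketch of RGB-based spoilage flagging. Reference colors and the
# deviation tolerance are hypothetical calibration values.

def dominant_channel(r, g, b):
    """Name the strongest of the three measured RGB intensities."""
    return max((("red", r), ("green", g), ("blue", b)), key=lambda t: t[1])[0]

def looks_spoiled(r, g, b, fresh_rgb, tolerance=60):
    """Flag the item if its color deviates strongly from a fresh reference."""
    deviation = sum(abs(m - ref) for m, ref in zip((r, g, b), fresh_rgb))
    return deviation > tolerance
```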

[0032] The moisture sensor detects the moisture content of the item by measuring changes in electrical properties such as resistance or capacitance. The moisture sensor typically consists of two probes that come into contact with the item’s surface. When moisture is present, it affects the conductivity between the probes, as water enhances electrical flow. In resistance-based sensors, higher moisture levels result in lower resistance. The microcontroller processes these readings and compares them to predefined moisture levels to determine the exact moisture content.
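For a resistance-based probe, the conversion from a raw reading to a moisture percentage can be sketched as a linear interpolation between calibrated dry and fully wet resistances. The calibration values are hypothetical:

```python
# Sketch of resistance-to-moisture conversion: lower resistance means
# wetter material. Dry/wet calibration resistances are hypothetical.

def moisture_percent(resistance_ohm, dry_ohm=100000.0, wet_ohm=1000.0):
    """Map probe resistance to 0-100% moisture by linear interpolation
    between calibrated dry (0%) and wet (100%) readings."""
    r = min(max(resistance_ohm, wet_ohm), dry_ohm)  # clamp to calibrated range
    return (dry_ohm - r) / (dry_ohm - wet_ohm) * 100.0
```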

[0033] The laser sensor detects the dimensions of the item by emitting a laser beam and measuring the time or angle of the reflected light. The laser sensor typically operates using either the time-of-flight (ToF) principle or triangulation. In ToF-based sensors, the sensor emits a laser pulse toward the item, and the time taken for the light to reflect back is used to calculate the distance. By scanning multiple points, the sensor determines the length, width, and height of the item. Hence, the dimension of the item is determined.
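The time-of-flight arithmetic above reduces to distance = (speed of light x round-trip time) / 2, and a dimension follows from the difference between ranges at two scan points. A minimal sketch, with illustrative function names:

```python
# Sketch of ToF ranging: halve the out-and-back light path to get range,
# and difference two ranges to get an item dimension along the beam.

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_ns):
    """Range from a round-trip time in nanoseconds."""
    return C * (round_trip_ns * 1e-9) / 2

def dimension_m(near_ns, far_ns):
    """Depth of an item from ToF readings at its near and far edges."""
    return tof_distance_m(far_ns) - tof_distance_m(near_ns)
```

A 2 ns round trip corresponds to roughly 0.30 m of range, which gives a feel for the timing resolution such sensors need.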

[0034] The Near-Infrared (NIR) sensor detects the chemical composition of the item by analyzing how it absorbs and reflects near-infrared light. The sensor emits NIR light, typically in wavelengths ranging from 700 nm to 2500 nm, onto the item's surface. Different chemical compounds absorb and reflect specific wavelengths of NIR light in unique patterns based on their molecular structure. The reflected light is captured by a detector, and the absorption spectrum is analyzed to identify the presence and concentration of specific chemical components. So, the chemical composition of the item is analyzed.
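One common way to turn a measured absorption spectrum into a composition estimate is nearest-reference matching against a library of known signatures. The wavelength grid, signature values, and material names below are invented purely for illustration:

```python
# Sketch of NIR composition identification by nearest-signature matching.
# Library signatures (absorbance at a few wavelengths) are hypothetical.

def spectral_distance(spectrum, reference):
    """Sum of squared differences between two absorbance spectra."""
    return sum((a - b) ** 2 for a, b in zip(spectrum, reference))

LIBRARY = {
    "cotton":    [0.20, 0.55, 0.30],
    "polyester": [0.60, 0.25, 0.45],
}

def identify_composition(spectrum, library=LIBRARY):
    """Pick the library material whose signature is closest to the measurement."""
    return min(library, key=lambda name: spectral_distance(spectrum, library[name]))
```

Real NIR analysis uses many more wavelengths and chemometric models, but the matching principle is the same.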

[0035] Upon detection of defects in the item, the projection unit 106 mounted on the frame 101 projects visuals onto the item, thereby highlighting the detected defect. The projection unit 106 functions by receiving defect detection data from the sensing unit 105 and visually marking the detected defects directly on the item's surface. Internally, the projection unit 106 consists of a light source, typically a laser, and an optical arrangement that precisely controls the projection. When the sensing unit 105 identifies defects such as cracks, color inconsistencies, or surface irregularities, the microcontroller processes this data and generates a corresponding visual highlight, such as a colored symbol. The projection unit 106 then aligns the projected visuals with the defect locations in real time, ensuring accurate marking and thereby highlighting the detected defect.

[0036] The projection unit 106 is mounted on the frame 101 by means of a ball and socket joint 107. The motorized ball and socket joint 107 enables precise rotational movement in multiple directions by integrating an electric motor. The ball, typically attached to a shaft, fits into the socket, allowing it to rotate freely around several axes. The motor is responsible for rotating the ball within the socket, providing controlled movement along different planes to ease the projection of the visuals on the defective item.

[0037] An L-shaped telescopic gripper 108 is mounted on the frame 101 for gripping a shopping bag. The L-shaped telescopic gripper 108 is designed to securely grip the shopping bag by extending and contracting the telescopic body based on the bag's position. The gripper 108 consists of a telescopic mechanism that allows controlled extension and retraction, and utilizes a pneumatic unit for its operation. The pneumatic unit operates using compressed air to drive a piston inside a cylinder. When air is supplied to one side of the piston, it creates pressure that pushes the piston rod outward, causing extension. To retract, air is supplied to the opposite side while the initial chamber is vented, pulling the piston rod back. The bag is securely gripped at the end of the telescopic body of the gripper 108.

[0038] In the gripper 108, a force sensor is embedded which enables a regulated actuation of the gripper 108 for applying a force on the bag. The force is applied to check a strength of the bag. The imaging unit 104 detects a damage to the bag upon applying the regulated force. The force sensor in the gripper 108 enables controlled actuation by measuring the force applied to the shopping bag. The sensor operates based on strain gauge principles, where the sensor detects minute deformations caused by applied force. When the gripper 108 engages with the bag, the force sensor continuously monitors the pressure exerted to avoid excessive force that could damage the bag. As the force increases, the microcontroller processes the sensor data and determines the bag’s strength based on predefined thresholds. Simultaneously, the imaging unit 104 observes any visible signs of damage, such as tears or stretching, caused by the applied force.
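The regulated bag-strength test above can be sketched as a force ramp with a strain cutoff: readings are taken as the applied force increases, and the test stops either when the force cap is reached (bag is strong) or when measured strain exceeds a damage limit (bag is weak). Force cap and strain limit are hypothetical calibration values:

```python
# Sketch of the regulated bag-strength check: ramp force while monitoring
# strain, stopping at either the force cap or the damage-strain limit.
# The cap (N) and strain limit are hypothetical thresholds.

def check_bag_strength(readings, max_force_n=20.0, damage_strain=0.15):
    """readings: (applied_force_N, measured_strain) pairs in ramp order.
    Returns (verdict, force_at_decision)."""
    for force, strain in readings:
        if strain >= damage_strain:
            return ("weak", force)    # bag deformed before reaching the cap
        if force >= max_force_n:
            return ("strong", force)  # survived the full test force
    return ("inconclusive", None)     # ramp ended before either condition
```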

[0039] With the microcontroller, an installation module is configured to actuate the projection unit 106 to project a model of a purchased item onto an installation site selected by the user, for a preview of the user. The installation module works in coordination with the microcontroller to project a virtual model of a purchased item onto a user-selected installation site, allowing for a realistic preview. When the user selects a location for installation, the module processes spatial data to analyze the environment. Based on this analysis, the module generates a scaled 3D projection of the item, aligning it with real-world dimensions and perspectives. The projection unit 106 then displays the model onto the installation site, to ensure an accurate representation. This allows users to visualize how the item will fit and appear in the desired space before final placement, enhancing decision-making efficiency. The user inputs the purchased item to be previewed and the installation site via the user interface.

[0040] The user inputs an occasion for which apparel is to be purchased into the user interface to actuate the suggestion module to generate suggestions of available items along with details, suitable for the occasion, for reference of the user. The suggestion module operates by generating recommendations for apparel based on the occasion inputted by the user. When the user selects or enters an occasion (such as a wedding, formal event, or casual outing) into the user interface, the suggestion module analyzes factors such as event formality and trending styles to filter the most suitable clothing items. The selected apparel suggestions are displayed on the user interface, enabling the user to make an informed purchase decision based on occasion-appropriate attire.
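The occasion-based filtering described above can be sketched as a tag match over a catalog, returning suitable items with details for the user's reference. The catalog entries, tags, and prices are invented for illustration:

```python
# Sketch of occasion-based apparel suggestion: filter a catalog by
# occasion tag and present matches cheapest first. Data is illustrative.

CATALOG = [  # (name, occasions, price)
    ("silk sherwani", {"wedding", "formal"}, 120),
    ("linen shirt",   {"casual"},            25),
    ("dark suit",     {"formal", "wedding"}, 90),
]

def suggest_for_occasion(occasion, catalog=CATALOG):
    """Return (name, price) pairs tagged for the occasion, cheapest first."""
    matches = [(name, price) for name, tags, price in catalog if occasion in tags]
    return sorted(matches, key=lambda t: t[1])
```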

[0041] The present invention works best in the following manner. The U-shaped frame 101 has the elongated handle 102 to enable the user to acquire a grip over the frame 101. The user interface is adapted to be installed with the computing unit to enable the user to input items to be purchased. The communication unit receives the inputted items from the computing unit. The microphone 109 enables the user to provide voice commands for actuation of features of the device. On the touch-enabled display unit 103, the suggestion module linked with the microcontroller fetches and displays the best offers available for the inputted items and the specific stores at which the offers are to be availed, for reference of the user. The artificial intelligence-based imaging unit 104 records and processes images in the vicinity of the frame 101 to record an item to be purchased in front of the frame 101. The tactile sensor detects the hardness of the item and determines the type of material of the item in synchronisation with the imaging unit 104, to display the material on the display unit 103. The sensing unit 105 is installed in the frame 101 and works in synchronisation with the imaging unit 104 to determine defects in the item and actuate the display unit 103 to alert the user regarding the defect.

[0042] In continuation, the sensing unit 105 comprises an ultrasonic sensor for detecting cracks, the colour sensor for detecting the colour of the item, the moisture sensor for detecting the moisture content of the item, the laser sensor for detecting the dimensions of the item, and the NIR (near infrared) sensor for detecting the chemical composition of the item. The projection unit 106 projects visuals onto the item highlighting the detected defect, and is mounted on the frame 101 by means of the ball and socket joint 107. The L-shaped telescopic gripper 108 grips the shopping bag; the force sensor enables the regulated actuation of the gripper 108 for applying the force on the bag to check its strength, and the imaging unit 104 detects damage to the bag upon applying the regulated force. The installation module actuates the projection unit 106 to project the model of the purchased item onto the installation site selected by the user for a preview by the user. The user inputs the purchased item to be previewed and the installation site via the user interface. The user inputs the occasion for which apparel is to be purchased into the user interface to actuate the suggestion module to generate suggestions of available items, along with details, suitable for the occasion, for reference of the user.

[0043] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A shopping assistive device, comprising:

i) a U-shaped frame 101 having an elongated handle 102 at a rear portion of said frame 101 to enable a user to acquire a grip over said frame 101;

ii) a user interface is adapted to be installed with a computing unit to enable a user to input items to be purchased, wherein a communication unit is linked with a microcontroller provided in said handle 102, to receive said inputted items from said computing unit;

iii) a touch-enabled display unit 103 is mounted on said handle 102 and connected with said microcontroller provided in said frame 101, wherein a suggestion module is linked with said microcontroller, which fetches and displays best offers available for said inputted items and specific stores at which said offers are to be availed, to display on said display unit 103 for reference of said user;

iv) an artificial intelligence-based imaging unit 104, is installed in said frame 101 and is integrated with a processor for recording and processing images in a vicinity of said frame 101, to record an item to be purchased in front of said frame 101, to trigger said microcontroller to actuate said display unit 103 to display details of said item for reference of said user;

v) a tactile sensor is embedded in said frame 101 to detect hardness of said item and determine a type of material of said item in synchronisation with said imaging unit 104, to display said material on said display unit 103;

vi) a sensing unit 105 is installed in said frame 101, which is in synchronisation with said imaging unit 104, to determine defects in said item to actuate said display unit 103 to alert said user regarding said defect; and

vii) a projection unit 106 is mounted on said frame 101 to project visuals onto said items highlighting said detected defect.

2) The device as claimed in claim 1, wherein said projection unit 106 is mounted on said frame 101 by means of a ball and socket joint 107.

3) The device as claimed in claim 1, wherein said sensing unit 105 comprises a plurality of sensors including an ultrasonic sensor for detecting cracks, a colour sensor for detecting colour of said item, a moisture sensor for detecting moisture content of said item, a laser sensor for detecting dimensions of said item, and a NIR (near infrared) sensor for detecting chemical composition of said item.

4) The device as claimed in claim 1, wherein an L-shaped telescopic gripper 108 is installed on said frame 101 for gripping a shopping bag, wherein a force sensor embedded in said gripper 108 enables a regulated actuation of said gripper 108 for applying a force on said bag to check a strength of said bag, and said imaging unit 104 detects damage to said bag upon applying said regulated force.

5) The device as claimed in claim 1, wherein an installation module is configured with said microcontroller, to actuate said projection unit 106 to project a model of a purchased item onto an installation site selected by said user, for a preview of said user.

6) The device as claimed in claim 1, wherein said user inputs said purchased item to be previewed and said installation site via said user interface.

7) The device as claimed in claim 1, wherein a microphone 109 is provided on said handle 102, which is linked with said microcontroller to enable said user to provide voice commands for actuation of features of said device.

8) The device as claimed in claim 1, wherein said user inputs an occasion for which apparel is to be purchased, and said user interface further actuates said suggestion module to generate suggestions of available items along with details, suitable for said occasion, for reference of said user.

Documents

Application Documents

# Name Date
1 202521024289-STATEMENT OF UNDERTAKING (FORM 3) [18-03-2025(online)].pdf 2025-03-18
2 202521024289-REQUEST FOR EXAMINATION (FORM-18) [18-03-2025(online)].pdf 2025-03-18
3 202521024289-REQUEST FOR EARLY PUBLICATION(FORM-9) [18-03-2025(online)].pdf 2025-03-18
4 202521024289-PROOF OF RIGHT [18-03-2025(online)].pdf 2025-03-18
5 202521024289-POWER OF AUTHORITY [18-03-2025(online)].pdf 2025-03-18
6 202521024289-FORM-9 [18-03-2025(online)].pdf 2025-03-18
7 202521024289-FORM FOR SMALL ENTITY(FORM-28) [18-03-2025(online)].pdf 2025-03-18
8 202521024289-FORM 18 [18-03-2025(online)].pdf 2025-03-18
9 202521024289-FORM 1 [18-03-2025(online)].pdf 2025-03-18
10 202521024289-FIGURE OF ABSTRACT [18-03-2025(online)].pdf 2025-03-18
11 202521024289-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [18-03-2025(online)].pdf 2025-03-18
12 202521024289-EVIDENCE FOR REGISTRATION UNDER SSI [18-03-2025(online)].pdf 2025-03-18
13 202521024289-EDUCATIONAL INSTITUTION(S) [18-03-2025(online)].pdf 2025-03-18
14 202521024289-DRAWINGS [18-03-2025(online)].pdf 2025-03-18
15 202521024289-DECLARATION OF INVENTORSHIP (FORM 5) [18-03-2025(online)].pdf 2025-03-18
16 202521024289-COMPLETE SPECIFICATION [18-03-2025(online)].pdf 2025-03-18
17 Abstract.jpg 2025-03-25
18 202521024289-FORM-26 [03-06-2025(online)].pdf 2025-06-03