
Mobile Brain Tumor Detection System

Abstract: A mobile brain tumor detection system comprises a computing unit with a processor and a Neural Processing Unit (NPU); a user interface for allowing a user to upload an MRI scan and view the tumor location and boundary mask with a single tap; a Yolo11n module for detecting a brain tumor in the MRI scan by analyzing image data and identifying tumor locations, where the NPU accelerates the analysis for real-time tumor detection; a report generation module for displaying the identified tumor on the user interface with a boundary mask and allowing the user to create a report with tumor details; and a Lightweight MobileSAM module, optimized using pruning to reduce model size and improve segmentation speed on the system.


Patent Information

Application #
Filing Date
13 August 2025
Publication Number
35/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR University
Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.

Inventors

1. P. Krupa Chary
SR University, Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.
2. Dr. Ch. Rajendra Prasad
SR University, Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.
3. Dr. K. Raj Kumar
SR University, Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.

Specification

Description:FIELD OF THE INVENTION

[0001] The present invention relates to a mobile brain tumor detection system that is capable of detecting and segmenting a tumor in real time, maintaining high precision while simultaneously decreasing computational demands, response time, and power usage.

BACKGROUND OF THE INVENTION

[0002] A mobile brain tumor refers to a tumor within the brain that exhibits significant movement or shifting position, often due to factors like cerebrospinal fluid dynamics or surgical manipulation, which can complicate detection and treatment. Detection of a brain tumor involves advanced imaging techniques such as magnetic resonance imaging (MRI) or computed tomography (CT) scans that provide detailed cross-sectional images of brain tissue. These imaging modalities help identify the tumor's location, size, and characteristics, allowing clinicians to distinguish it from surrounding healthy tissue. In cases where the tumor is mobile or its position changes over time, repeated imaging or real-time monitoring may be necessary to accurately track its movement, ensuring precise diagnosis, targeted treatment planning, and effective intervention.

[0003] Traditionally, the detection of mobile brain tumors relies primarily on repeated imaging techniques such as magnetic resonance imaging (MRI) and computed tomography (CT) scans taken at different times to monitor any changes in tumor location or size. These imaging modalities provide detailed snapshots of the brain's internal structures, allowing clinicians to identify the presence of tumors and observe any movement or shifts over time. Because of the potential mobility of some tumors, multiple scans are often necessary to accurately track their position, aiding in diagnosis, treatment planning, and monitoring response to therapy. This conventional approach depends on periodic, high-resolution imaging and expert interpretation to detect and assess tumor mobility within the brain.

[0004] US11227387B2 discloses methods, systems, and computer-readable media to detect and model a brain tumor in an electronic image and to predict features of the brain tumor based on the model. The method can include classifying one or more magnetic resonance imaging (MRI) images of a brain into one or more tumorous images containing an image of a tumor or one or more non-tumorous images, wherein the classification is performed using a deep learning CNN system. The method can also include segmenting a tumor region from one of the one or more tumorous images. The segmenting can include a neighboring Fuzzy C-Means (FCM) process. The method can further include classifying the segmented tumor region into one of four classes of brain tumor types. The segmented tumor region is classified as a particular brain tumor type using the deep learning CNN system. The method can also include reconstructing a 3D model of the tumor region and measuring one or more of a location of the tumor, a shape of the tumor, or a volume of the tumor.

[0005] CN104834943A discloses a method for classifying brain tumors based on deep learning. When extracting features of brain tumors, a Gabor wavelet transform is first used to extract the texture features of the tumors, and a deep learning network is constructed based on stacked denoising autoencoders, which extract higher-level features from these texture features. Secondly, the concentric circle method is used to extract the shape features of the tumors, which are combined with the high-level features extracted by deep learning to form an augmented feature vector; these features are used as input to a support vector machine, and a classifier is obtained by training. Finally, the same method is used to extract the feature vector for a test sample, and the trained classifier is used to classify it. The invention improves the doctor's diagnostic accuracy and provides useful information for the formulation of brain tumor surgery plans.

[0006] Conventionally, many systems have been developed to detect brain tumors, but these devices require high computational resources, making them impractical for use in remote healthcare facilities. This necessitates the use of separate computational hardware, making direct mobile application unfeasible. Additionally, the existing devices often lack the ability to accurately segment tumors.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that is capable of real-time detection and segmentation of brain tumors on portable mobile and edge computing devices maintaining high precision while simultaneously decreasing computational demands, response time, and power usage.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a system that is capable of detecting a brain tumor by analyzing the image and identifying the tumor location, with hardware-accelerated analysis for real-time tumor detection.

[0010] Another object of the present invention is to develop a system that is capable of displaying the identified tumor on the user interface and allowing the user to create a report with tumor details.

[0011] Another object of the present invention is to develop a system that is capable of reducing the model size and improving the segmentation speed on the system.

[0012] Yet another object of the present invention is to develop a system that is capable of viewing the tumor location and boundary mask in a single tap.

[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0014] The present invention relates to a mobile brain tumor detection system that is capable of detecting the tumor in real time providing high precision segmentation of the tumor that is optimized for mobile. Additionally, the system is lightweight with low power consumption model.

[0015] According to an embodiment of the present invention, a mobile brain tumor detection system comprises a computing unit with a processor and a Neural Processing Unit (NPU); a user interface, installed on the computing unit and connected to the processor, for allowing a user to upload an MRI scan; and a Yolo11n module, stored on the computing unit and connected to the processor and NPU, for detecting a brain tumor in the MRI scan by analyzing image data and identifying tumor locations, where the NPU accelerates the analysis for real-time tumor detection. The Yolo11n module is optimized using quantization, connected to the processor, to reduce model size and speed up tumor detection on the system.

[0016] According to another embodiment of the present invention, the system further comprises a report generation module, connected to the user interface and the Yolo11n module, for displaying the identified tumor on the user interface with a boundary mask and allowing the user to create a report with tumor details, and a Lightweight MobileSAM module, optimized using pruning and connected to the processor, to reduce model size and improve segmentation speed on the system. The user interface allows the user to upload an MRI scan and view the tumor location and boundary mask with a single tap.

[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates a schematic diagram depicting the workflow of mobile brain tumor detection system.

DETAILED DESCRIPTION OF THE INVENTION

[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0022] The present invention relates to a mobile brain tumor detection system that is capable of detecting and segmenting the tumor in real time with high precision, supporting multiple modalities and ensuring data privacy by processing on the device.

[0023] Referring to Figure 1, a schematic diagram depicts the workflow of the mobile brain tumor detection system. The system disclosed herein includes a computing unit with a processor and a Neural Processing Unit (NPU), such as the Apple Neural Engine or Qualcomm Hexagon. This approach is engineered for swift, real-time detection and segmentation of brain tumors on portable mobile and edge computing devices. The model maintains high precision while simultaneously decreasing computational demands, response time, and power usage.

[0024] A user interface is installed on the computing unit and connected to the processor for allowing the user to upload an MRI scan. The user interface allows the user to upload an MRI scan and view the tumor location and boundary mask with a single tap. When the user taps to upload an MRI scan, the interface presents a graphical file picker that allows the user to select and load the image file. The processor then runs image analysis, such as segmentation and boundary detection, on the MRI data to identify tumor regions. Once processed, the interface displays the original MRI alongside overlaid visualizations of the tumor location and boundary mask, enabling the user to view these details with a single tap. This seamless operation integrates image handling, processing routines, and visualization modules, ensuring real-time or near-real-time feedback and providing an intuitive and efficient user experience.
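The mask overlay described above can be sketched in a few lines. This is an illustrative example only; the function name and the red-tint blending scheme are assumptions for demonstration, not part of the disclosed system:

```python
import numpy as np

def overlay_mask(mri: np.ndarray, mask: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Blend a binary tumor mask (1 = tumor) onto a grayscale MRI slice.

    Returns an RGB image in which masked pixels are tinted red so the
    tumor boundary is visible at a glance.
    """
    rgb = np.stack([mri, mri, mri], axis=-1).astype(np.float32)  # grayscale -> RGB
    red = np.zeros_like(rgb)
    red[..., 0] = 255.0
    blended = np.where(mask[..., None] == 1,
                       (1 - alpha) * rgb + alpha * red,  # tint inside the mask
                       rgb)                              # leave the rest untouched
    return blended.astype(np.uint8)

# Toy 4x4 "scan" with a 2x2 tumor mask in one corner.
scan = np.full((4, 4), 100, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=np.uint8)
mask[:2, :2] = 1
out = overlay_mask(scan, mask)
```

In a real app the blended array would be handed to the platform's image view; the single-tap behavior is purely a UI concern layered on top of this overlay step.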

[0025] After the user uploads the MRI scan, a Yolo11n module stored on the computing unit and connected to the processor detects a brain tumor in the MRI scan by analyzing image data and identifying tumor locations. When an MRI image is uploaded, the image data is fed into the model, which has been trained on labeled datasets to recognize tumor features. The model processes the image through a deep convolutional neural network that divides the MRI into grid cells, simultaneously predicting bounding boxes and class probabilities for potential tumor regions within each cell. Using learned patterns, the model quickly analyzes the entire image in a single pass, identifying and localizing tumor locations with bounding boxes and confidence scores. The detected regions are then communicated back to the user interface, highlighting tumor locations directly on the MRI scan, enabling efficient and accurate detection in real time.
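A single-pass detector of this kind emits many overlapping candidate boxes, which are reduced to final detections by non-maximum suppression (NMS). The sketch below shows the standard IoU test and greedy suppression step; the function names, box values, and 0.5 threshold are illustrative and not taken from the Yolo11n implementation:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: keep the highest-scoring box in each overlapping cluster."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        # Drop every remaining box that overlaps the kept one too much.
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep

# Two near-duplicate candidates plus one separate detection.
boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
kept = nms(boxes, scores)
```

Here the two overlapping candidates collapse to the single higher-scoring box, while the distant box survives untouched.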

[0026] The NPU accelerates the analysis for real-time tumor detection. The Neural Processing Unit (NPU) provides specially optimized deep learning computations, such as matrix multiplications and convolutions, which are fundamental to neural network inference. It offloads these intensive calculations from the main CPU, enabling faster processing speeds and lower latency. The NPU leverages parallel processing architectures and optimized data pathways to efficiently handle large volumes of image data, such as MRI scans, and to rapidly execute complex models like tumor detection networks. This acceleration allows for real-time analysis by significantly reducing inference time, improving throughput, and maintaining high accuracy, thus enabling prompt diagnosis and decision-making in clinical settings.

[0027] The Yolo11n module is optimized using quantization, connected to the processor, to reduce model size and speed up tumor detection on the system. The Yolo11n module, optimized through quantization, reduces its numerical precision from 32-bit floating point to lower-bit representations, such as 8-bit integers, which significantly decreases the model's size and computational requirements. This quantization process involves calibrating the model to maintain accuracy while converting the weights and activations to lower-precision formats. When connected to the processor, the optimized, quantized model allows for faster inference speeds and lower memory usage, enabling real-time tumor detection on the system. The processor is designed for low-precision arithmetic, facilitating rapid analysis of MRI scans to identify tumor locations efficiently, all while conserving system resources and maintaining high detection accuracy.
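The float-to-int8 conversion described above can be sketched as a per-tensor affine quantization. This is a minimal illustration of the general scheme, not the specific calibration used for the Yolo11n module; the function names are assumptions:

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Affine (asymmetric) post-training quantization of a weight tensor.

    Maps float32 values to uint8 with a per-tensor scale and zero point,
    the scheme commonly used by 8-bit inference runtimes.
    """
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    zero_point = round(-lo / scale)
    q = np.clip(np.round(w / scale) + zero_point, 0, 255).astype(np.uint8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover an approximation of the original float weights."""
    return (q.astype(np.float32) - zero_point) * scale

w = np.array([-1.0, -0.5, 0.0, 0.5, 1.0], dtype=np.float32)
q, s, z = quantize_int8(w)
w_hat = dequantize(q, s, z)
```

Storage drops 4x (float32 to uint8), and the per-weight reconstruction error is bounded by roughly half the scale, which is why a well-calibrated 8-bit model loses little accuracy.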

[0028] To speed up tumor segmentation and reduce the size of the model, a Lightweight MobileSAM module is optimized using pruning, connected to the processor, to reduce model size and improve segmentation speed on the system. The lightweight MobileSAM module, optimized through pruning, works by removing redundant or less important parameters and connections within the model's neural network, effectively reducing its complexity and size without significantly impacting accuracy. This pruning process simplifies the model architecture, enabling it to run more efficiently on the processor. When connected to the system's processor, the optimized MobileSAM leverages hardware acceleration and streamlined computations to perform faster image segmentation tasks, such as delineating tumor boundaries in MRI scans. The reduced model size not only accelerates inference speed but also lowers memory usage, facilitating real-time segmentation with improved efficiency and minimal computational overhead.
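The "remove the less important parameters" step above is most simply realized as magnitude pruning. The sketch below zeroes the smallest-magnitude fraction of a weight tensor; it illustrates unstructured pruning in general, not the specific criterion applied to MobileSAM, and the function name is an assumption:

```python
import numpy as np

def magnitude_prune(w: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction of weights (unstructured pruning).

    sparsity = 0.5 removes the 50% of weights closest to zero; the surviving
    large weights carry most of the layer's signal, so accuracy loss is small.
    """
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    # The k-th smallest absolute value becomes the pruning threshold.
    threshold = np.sort(np.abs(w).ravel())[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

w = np.array([0.05, -0.9, 0.3, -0.02, 0.7, 0.1], dtype=np.float32)
pruned = magnitude_prune(w, sparsity=0.5)
```

The zeroed entries can then be skipped during inference or stored in a sparse format, which is where the size and speed gains come from.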

[0029] After the tumor is detected, a report generation module, connected to the user interface and the Yolo11n module, displays the identified tumor on the user interface with a boundary mask and allows the user to create a report with tumor details. The report generation module works by receiving the detected tumor data, including boundary masks, from the Yolo11n module, which identifies and localizes tumors within medical images. The module overlays the boundary masks onto the original images in the interface, visually highlighting the tumor regions for the user. It then gathers relevant tumor details, such as size, location, and confidence scores, and compiles this information into a structured report. Users interact with the interface to review these visualizations, add notes, or customize report content. Once finalized, the module generates a comprehensive report in PDF format, consolidating the visual evidence and tumor metrics, enabling clinicians to save, share, or print the findings for further diagnosis or record-keeping.
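The report-compilation step can be sketched as gathering the per-detection fields into a structured body. The record layout, field names, and sample values below are illustrative assumptions; a PDF writer would render the same fields for the final document:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str           # predicted tumor class
    confidence: float    # detector confidence score, 0..1
    bbox: tuple          # (x1, y1, x2, y2) in pixels
    mask_area_px: int    # pixels inside the segmentation boundary mask

def build_report(scan_id: str, detections: list) -> str:
    """Compile detections into a plain-text report body, one finding per line."""
    lines = [f"Scan: {scan_id}", f"Findings: {len(detections)}"]
    for i, d in enumerate(detections, 1):
        x1, y1, x2, y2 = d.bbox
        lines.append(
            f"{i}. {d.label} (confidence {d.confidence:.0%}) "
            f"at ({x1},{y1})-({x2},{y2}), mask area {d.mask_area_px} px"
        )
    return "\n".join(lines)

report = build_report("MRI-0042", [Detection("glioma", 0.91, (34, 50, 96, 120), 2310)])
```

User-added notes would simply be appended as further lines before the report is handed to the PDF renderer.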

[0030] The present invention works best in the following manner. The computing unit comprises a processor and the Neural Processing Unit (NPU). The user interface is installed on the computing unit and connected to the processor, allowing the user to upload an MRI scan and view the tumor location and boundary mask with a single tap. The Yolo11n module, stored on the computing unit and connected to the processor, detects the brain tumor in the MRI scan by analyzing image data and identifying tumor locations. The Yolo11n module is optimized using quantization, connected to the processor, to reduce model size and speed up tumor detection on the system. The Lightweight MobileSAM module is optimized using pruning, connected to the processor, to reduce model size and improve segmentation speed on the system. The NPU accelerates the analysis for real-time tumor detection. The report generation module, connected to the user interface and the Yolo11n module, displays the identified tumor on the user interface with a boundary mask and allows the user to create a report with tumor details.

[0031] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A mobile brain tumor detection system comprising:

i) a computing unit with a processor and a Neural Processing Unit (NPU);
ii) a user interface, installed on the computing unit, connected to the processor, for allowing a user to upload an MRI scan;
iii) a Yolo11n module, stored on the computing unit, connected to the processor and NPU, for detecting a brain tumor in the MRI scan by analyzing image data and identifying tumor locations, where the NPU accelerates the analysis for real-time tumor detection; and
iv) a report generation module, connected to the user interface and Yolo11n module, for displaying the identified tumor on the user interface with a boundary mask, and allowing the user to create a report with tumor details.

2) The system as claimed in claim 1, wherein the Yolo11n module is optimized using quantization, connected to the processor, to reduce model size and speed up tumor detection on the system.

3) The system as claimed in claim 1, wherein a Lightweight MobileSAM module is optimized using pruning, connected to the processor, to reduce model size and improve segmentation speed on the system.

4) The system as claimed in claim 1, wherein user-interface allows the user to upload an MRI scan and view the tumor location and boundary mask with a single tap.

Documents

Application Documents

# Name Date
1 202541077308-STATEMENT OF UNDERTAKING (FORM 3) [13-08-2025(online)].pdf 2025-08-13
2 202541077308-REQUEST FOR EARLY PUBLICATION(FORM-9) [13-08-2025(online)].pdf 2025-08-13
3 202541077308-PROOF OF RIGHT [13-08-2025(online)].pdf 2025-08-13
4 202541077308-POWER OF AUTHORITY [13-08-2025(online)].pdf 2025-08-13
5 202541077308-FORM-9 [13-08-2025(online)].pdf 2025-08-13
6 202541077308-FORM FOR SMALL ENTITY(FORM-28) [13-08-2025(online)].pdf 2025-08-13
7 202541077308-FORM 1 [13-08-2025(online)].pdf 2025-08-13
8 202541077308-FIGURE OF ABSTRACT [13-08-2025(online)].pdf 2025-08-13
9 202541077308-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [13-08-2025(online)].pdf 2025-08-13
10 202541077308-EVIDENCE FOR REGISTRATION UNDER SSI [13-08-2025(online)].pdf 2025-08-13
11 202541077308-EDUCATIONAL INSTITUTION(S) [13-08-2025(online)].pdf 2025-08-13
12 202541077308-DRAWINGS [13-08-2025(online)].pdf 2025-08-13
13 202541077308-DECLARATION OF INVENTORSHIP (FORM 5) [13-08-2025(online)].pdf 2025-08-13
14 202541077308-COMPLETE SPECIFICATION [13-08-2025(online)].pdf 2025-08-13