
System for AI-Powered File Compression

Abstract: A system for AI-powered file compression comprising: an AI module embedded within a file manager that selects an optimal compression method based on file attributes, using a machine learning model trained to predict effective compression protocols for minimizing storage while preserving data quality; a decision engine that evaluates file type, frequency of access, storage conditions, user preferences, and available system resources, incorporating a rule-based system to handle real-time trade-offs; a real-time adaptive compression arrangement that balances file size reduction and accessibility by adjusting compression parameters and supports incremental compression for large files; and a self-learning protocol that refines compression efficiency over time through reinforcement learning, adapting to user behavior and new file formats based on feedback from compression and decompression performance.


Patent Information

Filing Date: 13 August 2025
Publication Number: 35/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

SR University
Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.

Inventors

1. Damarla Ramesh Babu
Professor, SR University, Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.
2. Aravind Reddy Gudi
School of Computer Science & Artificial Intelligence, SR University, Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.
3. Gande Nagaraju
School of Computer Science & Artificial Intelligence, SR University, Ananthasagar, Hasanparthy (PO), Warangal-506371, Telangana, India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to a system for AI-powered file compression integrated within a file manager that selects and applies optimal compression features based on file attributes and user preferences, thereby minimizing storage usage while preserving data quality and enhancing accessibility across local and cloud storage environments.

BACKGROUND OF THE INVENTION

[0002] In the current digital era, the volume of data generated and stored is growing exponentially across various platforms, including local storage systems and cloud environments. Efficient management of this data, minimizing storage space while maintaining accessibility and data safety, has become increasingly critical.

[0003] Traditional file compression methods often apply fixed compression methods without adapting to the unique attributes of files, leading to suboptimal compression results or unnecessary delays during data access. Existing compression solutions lack the capability to balance the trade-off between compression ratio and speed when handling diverse file types and varying user needs. Furthermore, manual configuration of compression settings in many conventional systems imposes additional complexity on users who do not possess the technical expertise to optimize such parameters effectively. This limitation often leads to either excessive data loss or inadequate compression.

[0004] US11601136B2 discloses a system for electronic data compression using an automated time-dependent compression algorithm. In particular, the system may track instances in which a particular dataset is used, copied, or accessed over time. For certain datasets (e.g., datasets that have not been accessed for a threshold amount of time), the system may use a time-based compression algorithm that progressively removes the least significant bits of such datasets as time passes. The compression of the datasets may continue until the system detects that further compression would cause the dataset to be unreadable or unrecoverable. In this way, the system minimizes the computing resources allocated to storing datasets that are not frequently accessed.

[0005] US12248509B2 discloses a system and method for managing raw files. A plurality of image files associated with one or more raw files that include respective metadata are evaluated. A set of one or more of the image files is grouped together based on matching the respective metadata. An asset file corresponding to the raw file and the grouped set of the image files is generated. The asset file is retrieved in response to a request from a user device. At least one of the image files in the grouped set is selected, and a display of the selected image file is generated for rendering.

[0006] Conventionally, many system programs provide basic compression and decompression functionalities, but these existing systems fail to offer adaptive compression tailored to file characteristics and usage context, thus limiting overall system performance.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system capable of automatically selecting and adjusting compression methods, thereby improving storage efficiency, maintaining data quality, and enhancing user accessibility with minimal manual intervention.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a system capable of automatically selecting an optimal compression method for files to achieve maximum compression efficiency.

[0010] Another object of the present invention is to develop a system capable of providing a decision engine that prioritizes compression parameters, thus balancing compression speed and ratio effectively.

[0011] Another object of the present invention is to provide automatic adjustment of compression based on storage availability and network conditions, while supporting synchronization across platforms.

[0012] Yet another object of the present invention is to implement predictive decompression by maintaining a cache for rapid file access with minimal latency.

[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0014] The present invention relates to a system for AI-powered file compression that automatically selects and applies optimal compression methods based on file attributes and system conditions, balancing storage efficiency and data accessibility.

[0015] According to an embodiment of the present invention, a system for AI-powered file compression comprises an AI module embedded within a file manager that selects an optimal compression method based on file attributes, a decision engine incorporating a rule-based system to handle real-time trade-offs, a real-time adaptive compression arrangement that supports incremental compression for large files, and a self-learning protocol that refines compression efficiency over time through reinforcement learning.

[0016] According to another embodiment of the present invention, the AI module dynamically switches between lossless and lossy compression methods based on file characteristics; the system integrates with local and cloud-based storage environments, automatically adjusting compression; user-configurable settings allow customization of compression policies for different file types; and a predictive model identifies high-priority files based on access patterns, maintaining a cache for rapid retrieval.

[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates a flowchart of the system for AI-powered file compression.

DETAILED DESCRIPTION OF THE INVENTION

[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0022] The present invention relates to a system for AI-powered file compression that monitors and dynamically adjusts compression and decompression processes, thereby optimizing data management, minimizing latency, and providing real-time feedback for improved system efficiency and user experience.

[0023] Referring to Figure 1, a flowchart of the system for AI-powered file compression is illustrated.

[0024] The system disclosed herein comprises an AI module integrated within a file manager, designed to select the most suitable compression method by analyzing specific attributes of each file. The file attributes considered include file size, file type, and usage history, which provide essential data points for determining the best compression approach.

[0025] The AI module functions by receiving input data representing these file characteristics. Based on this information, the module references a trained machine learning model that has been developed through extensive exposure to various file types and compression techniques. The machine learning model evaluates the input attributes and predicts the compression protocol that will achieve the highest storage reduction while maintaining the original quality and safety of the data.

[0026] The machine learning model operates by mapping patterns in the training data to corresponding compression outcomes, enabling it to identify effective compression methods for new files. By leveraging this predictive capability, the module recommends or directly applies the optimal compression method tailored to each file’s unique features.
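By way of illustration only, the attribute-to-protocol prediction described above can be sketched as a tiny nearest-neighbor classifier; the feature encoding, training examples, and method names below are all hypothetical stand-ins for a trained model, not part of the disclosed system.

```python
# Hypothetical training examples mapping file features to the best observed
# compression method. Features: (log2 of size in bytes, estimated entropy in
# bits/byte, media flag). All values are illustrative.
TRAINING = [
    ((12.0, 4.5, 0), "deflate"),
    ((20.0, 7.8, 1), "store"),   # already-compressed media: recompression wasteful
    ((16.0, 2.1, 0), "lzma"),    # highly redundant text: heavy compression pays off
    ((24.0, 5.0, 0), "zstd"),
]

def predict_method(features, k=3):
    """k-NN stand-in for the trained model: vote among the k training
    examples closest to the file's feature vector."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(TRAINING, key=lambda ex: dist(ex[0], features))[:k]
    votes = {}
    for _, method in nearest:
        votes[method] = votes.get(method, 0) + 1
    return max(votes, key=votes.get)
```

In a full implementation this lookup would be replaced by the trained machine learning model; the sketch only shows the input/output shape of the prediction step.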

[0027] A decision engine is included in the system, tasked with evaluating various factors to determine the priority between compression speed and compression ratio for each file. This component receives inputs such as the file type, frequency of access, current storage conditions, user preferences, and the availability of system resources, including processing power and memory.

[0028] By analyzing the file type, the decision engine understands the nature of the data, whether text, image, video, or another format, which influences the choice of compression parameters. The frequency of access provides insight into how often a file is retrieved or modified, guiding whether faster compression or a higher compression ratio should be prioritized. Storage conditions, such as remaining disk space, further impact this decision by indicating urgency for maximum size reduction.

[0029] User preferences offer customization options, allowing users to specify whether they value speed or storage savings more, ensuring the process aligns with individual requirements. Additionally, the assessment of available system resources ensures that compression does not overload the system’s processing capabilities or memory, maintaining smooth operation.

[0030] To manage these varying inputs and conflicting objectives, the decision engine incorporates a rule-based framework that applies predefined conditions and priorities. The rule-based framework balances real-time trade-offs, selecting the compression parameters that best fit the current situation. For example, when storage is limited but access speed is less critical, the engine prioritizes maximum compression. Conversely, if a file is accessed frequently and resources are abundant, faster compression with moderate size reduction is preferred.
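The rule-based trade-off described above can be sketched as a small decision function; the thresholds, input names, and return labels below are hypothetical examples chosen for illustration, not values specified by the invention.

```python
def decide_priority(file_type, access_freq_per_day, free_disk_frac,
                    cpu_idle_frac, user_pref=None):
    """Rule-based sketch of the decision engine: returns 'ratio' to favor
    maximum compression or 'speed' to favor fast access. All thresholds
    are hypothetical."""
    if user_pref in ("ratio", "speed"):
        return user_pref                      # explicit user preference wins
    if free_disk_frac < 0.10:
        return "ratio"                        # storage nearly full: squeeze hard
    if access_freq_per_day > 10 and cpu_idle_frac > 0.5:
        return "speed"                        # hot file, resources to spare
    if file_type in ("log", "text", "csv"):
        return "ratio"                        # redundant formats compress well
    return "speed"
```

The ordering of the rules encodes the priorities mentioned in the text: user preference first, then storage urgency, then access frequency.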

[0031] A real-time adaptive compression arrangement is integrated into the system, serving to balance the trade-off between reducing file size and maintaining file accessibility during compression. This arrangement adjusts key compression parameters, such as compression level and chunk size, based on input from the decision engine and ongoing feedback during the compression process.

[0032] The compression level determines how aggressively data is compressed: higher levels lead to greater size reduction but require more processing time, while lower levels prioritize faster compression with less size reduction. By adjusting this parameter in real time, the arrangement ensures that the compression matches current priorities, whether that is speed or maximum storage savings.

[0033] Chunk size refers to how the file is divided during compression. Larger chunks improve compression efficiency but delay access to specific parts of the file, whereas smaller chunks enable faster retrieval but reduce compression effectiveness. The arrangement selects an appropriate chunk size based on file characteristics and user needs, optimizing both compression and accessibility.

[0034] For large files, the real-time adaptive compression arrangement supports incremental compression, allowing the file to be compressed in segments rather than all at once. This approach reduces memory load and processing spikes, enabling smooth operation even on systems with limited resources. Incremental compression also facilitates partial decompression and access, improving user experience when working with large datasets.
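Incremental compression of the kind described above is directly supported by standard streaming codecs; the following minimal sketch uses Python's `zlib` incremental interface to compress a file chunk by chunk so that only one chunk is ever held in memory (the chunk size and level are illustrative).

```python
import zlib

def compress_incremental(chunks, level=6):
    """Compress a large file piece by piece. `chunks` is any iterable of
    bytes objects (e.g. successive reads from a file); output pieces are
    yielded as they become available."""
    co = zlib.compressobj(level)
    for chunk in chunks:
        out = co.compress(chunk)
        if out:                 # the codec may buffer small inputs internally
            yield out
    yield co.flush()            # emit any remaining buffered data
```

A caller would typically feed this generator from `iter(lambda: f.read(chunk_size), b"")` and write each yielded piece straight to the destination, keeping peak memory proportional to the chunk size rather than the file size.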

[0035] A self-learning protocol is integrated into the system that operates to enhance compression efficiency continuously by analyzing past compression and decompression outcomes. Feedback includes metrics such as compression ratio achieved, processing time, resource consumption, and any quality loss experienced during decompression. By examining these performance indicators, the protocol identifies which compression methods and parameter settings yield the best balance for different file types and usage scenarios.

[0036] Over time, the self-learning protocol adapts to user behavior by recognizing patterns such as frequently accessed files, preferred compression settings, and common file formats. Moreover, the self-learning protocol accommodates new or evolving file formats by incorporating their unique characteristics into the learning process. The reinforcement learning approach enables the system to improve without manual intervention, making compression increasingly efficient and tailored to specific user needs and environmental conditions.
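One simple way to realize the feedback loop described above is an epsilon-greedy bandit that keeps a running reward estimate per (file category, method) pair; the class below is an illustrative sketch under that assumption, and the reward definition and method names are hypothetical.

```python
import random

class CompressionBandit:
    """Epsilon-greedy sketch of the self-learning protocol: per file
    category, track a running mean reward for each compression method
    and usually pick the best one, occasionally exploring."""
    def __init__(self, methods, epsilon=0.1):
        self.methods = list(methods)
        self.epsilon = epsilon
        self.stats = {}          # (category, method) -> [count, mean reward]

    def choose(self, category):
        if random.random() < self.epsilon:
            return random.choice(self.methods)          # explore
        return max(self.methods,                        # exploit best so far
                   key=lambda m: self.stats.get((category, m), [0, 0.0])[1])

    def feedback(self, category, method, reward):
        # reward might blend compression ratio, time, and quality loss
        n, mean = self.stats.get((category, method), [0, 0.0])
        n += 1
        self.stats[(category, method)] = [n, mean + (reward - mean) / n]
```

After each compression or decompression, the system would call `feedback` with an observed reward, so the `choose` step gradually converges on the best method for each category, including newly seen file formats.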

[0037] The AI module is designed to choose between lossless and lossy compression techniques depending on the specific characteristics of each file. For files like text documents or databases that require exact reproduction, the module applies lossless compression methods. These methods reduce file size without any loss of original data, ensuring safety during decompression.

[0038] Conversely, for media files such as images, audio, or video, where some loss of detail is often acceptable, the module selects lossy compression. This technique reduces file size more aggressively by selectively discarding less perceptible information, thereby preserving perceptual quality while saving storage space.

[0039] In addition, the module supports hybrid compression strategies, in which metadata is compressed losslessly while the main content undergoes lossy compression to maximize size reduction without degrading perceived quality.
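The lossless/lossy/hybrid selection in paragraphs [0037] to [0039] can be sketched as a small policy lookup; the extension tables below are hypothetical examples, and a real deployment would draw on the AI module's richer file analysis rather than extensions alone.

```python
import os

# Hypothetical policy tables: formats whose exact bytes must survive get
# lossless treatment; uncompressed perceptual media may tolerate lossy
# re-encoding; everything else falls back to a hybrid strategy.
LOSSLESS_TYPES = {".txt", ".csv", ".db", ".sql", ".json"}
LOSSY_OK_TYPES = {".wav", ".bmp", ".tiff", ".avi"}

def choose_mode(path):
    """Return 'lossless', 'lossy', or 'hybrid' for the given file path."""
    ext = os.path.splitext(path)[1].lower()
    if ext in LOSSLESS_TYPES:
        return "lossless"
    if ext in LOSSY_OK_TYPES:
        return "lossy"
    return "hybrid"   # e.g. lossless metadata plus lossy content
```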

[0040] Further, the system integrates with both local storage systems and cloud-based storage environments, providing flexible file management regardless of storage location and enabling the compression process to adapt based on current storage conditions and network parameters.

[0041] When operating with local storage, the system monitors the available disk space in real time. If storage space becomes limited, the compression arrangement intensifies the compression ratio to free up space while maintaining acceptable data quality. Conversely, when ample storage is available, compression may be relaxed to prioritize faster access and processing times.

[0042] In cloud storage scenarios, the system evaluates network conditions such as bandwidth and latency. Under conditions of limited bandwidth or unstable connectivity, the compression strategy adjusts to reduce data size before transmission, improving upload efficiency and minimizing delays. When network conditions improve, the system balances compression parameters to optimize between file size and transfer speed.
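The storage- and network-driven adaptation of paragraphs [0041] and [0042] amounts to mapping current conditions onto a compression level; the function below is an illustrative sketch with hypothetical thresholds, using a zlib-style 1 to 9 level scale as an assumption.

```python
def adapt_level(free_disk_frac=None, bandwidth_mbps=None):
    """Map current conditions to a zlib-style compression level (1..9).
    Scarce disk space or bandwidth pushes toward heavier compression;
    abundant resources relax it. All thresholds are hypothetical."""
    level = 6                                          # balanced default
    if free_disk_frac is not None and free_disk_frac < 0.10:
        level = 9                                      # disk almost full
    if bandwidth_mbps is not None and bandwidth_mbps < 5:
        level = max(level, 8)                          # slow uplink: shrink first
    if free_disk_frac is not None and free_disk_frac > 0.50 and \
       (bandwidth_mbps is None or bandwidth_mbps > 50):
        level = 3                                      # plenty of room: favor speed
    return level
```

On local storage the `free_disk_frac` input could come from a disk-usage query; in a cloud scenario `bandwidth_mbps` would come from periodic network measurements.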

[0043] Furthermore, the system supports application programming interface (API)-based synchronization, ensuring smooth compatibility and communication across diverse platforms and storage services. Through standardized APIs, the system synchronizes compressed files, compression settings, and user preferences, maintaining consistency whether files are accessed locally or remotely.

[0044] The system further comprises user-configurable settings that empower users to customize compression policies tailored to different file types, allowing fine control over how files are compressed, enabling the adjustment of compression strength to balance between file size reduction and preservation of data quality according to individual requirements.

[0045] Through a user-friendly interface included in the system, users select preferred compression protocols for each file category. The user interface presents clear options and settings, making the customization process convenient for non-technical users.

[0046] Additionally, the system supports predefined profiles, which bundle together sets of compression preferences for common use cases. These profiles allow users to quickly apply comprehensive compression strategies without needing to manually configure each parameter. By enabling personalized compression policies and providing accessible control over compression parameters, the system caters to diverse user needs and optimizes file management efficiency in various operating environments.

[0047] The system further includes an automatic decompression feature designed to improve file access speed by selectively decompressing frequently accessed files. The automatic decompression feature relies on a predictive model that analyzes historical access patterns to identify high-priority files, which are more likely to be requested repeatedly or within short intervals.

[0048] Once high-priority files are identified, the system proactively decompresses these files in advance, reducing wait times when the user requests access. Decompressed files are stored in a dedicated cache, which acts as a temporary storage area optimized for rapid retrieval. This cache minimizes latency by allowing immediate access to the decompressed content without requiring decompression at the time of access.

[0049] The predictive model continuously updates its understanding of file usage based on real-time data, enabling dynamic adjustments to which files are decompressed and cached, enhancing user experience by ensuring that frequently used files open swiftly, while still benefiting from compression to save storage space when files are not in active use.
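The predictive-decompression cache of paragraphs [0047] to [0049] can be sketched as an LRU store of decompressed content; the class below is illustrative, uses `zlib` as a stand-in for whatever codec compressed the file, and its capacity and naming are hypothetical.

```python
import zlib
from collections import OrderedDict

class DecompressionCache:
    """Sketch of the predictive-decompression cache: high-priority files
    are decompressed ahead of time and kept in an LRU cache so reads
    incur no decompression latency."""
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.cache = OrderedDict()            # name -> decompressed bytes

    def prefetch(self, name, compressed):
        """Called by the predictive model for files likely to be accessed."""
        self.cache[name] = zlib.decompress(compressed)
        self.cache.move_to_end(name)
        while len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict least recently used

    def read(self, name, compressed):
        if name in self.cache:                # hit: immediate access
            self.cache.move_to_end(name)
            return self.cache[name]
        self.prefetch(name, compressed)       # miss: decompress on demand
        return self.cache[name]
```

The predictive model would drive `prefetch` from observed access patterns, while `read` serves both cached and uncached requests transparently.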

[0050] The present invention works best in the following manner, wherein the file manager receives a new or existing file requiring compression. The AI module analyzes the file's attributes to select an optimal compression method from a set of possible methods, aiming to maximize compression efficiency without compromising file safety. Simultaneously, the decision engine evaluates system factors such as current storage availability, network conditions, user preferences, and processing resources to prioritize whether compression speed or ratio should take precedence. Based on these inputs, the system activates the real-time adaptive compression arrangement, which adjusts compression parameters such as compression level and chunk size during the compression process, supporting incremental compression for handling large files efficiently. Throughout compression, the self-learning protocol collects feedback to improve future compression decisions and adapt to new file formats. For files exhibiting specific characteristics, such as high redundancy or media content, the AI module switches between lossless and lossy compression methods, optionally applying a hybrid approach that compresses metadata losslessly while compressing content with lossy methods to optimize storage. The system integrates with both local and cloud storage environments, automatically modifying compression strategies to accommodate available storage space and network bandwidth constraints, ensuring seamless synchronization via API support. User-configurable settings allow customization of compression parameters and policies, enabling users to tailor compression strength, protocol preference, and automation levels for various file types, accessible through a user-friendly interface or preset profiles. The system monitors file access patterns in real time using a predictive model to identify frequently accessed files, which are automatically decompressed and cached locally to reduce access latency.
The entire process operates autonomously with minimal user intervention, continuously optimizing storage utilization and accessibility according to evolving data and usage conditions.

[0051] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A system for AI-powered file compression, comprising:

i) an AI module embedded within a file manager that selects an optimal compression method based on file attributes, such as file size, type, and usage history, using a machine learning model trained to predict effective compression protocols for minimizing storage while preserving data quality;
ii) a decision engine that evaluates file type, frequency of access, storage conditions, user preferences, and available system resources to prioritize compression speed or ratio, incorporating a rule-based system to handle real-time trade-offs;
iii) a real-time adaptive compression arrangement that balances file size reduction and accessibility by adjusting compression parameters, such as compression level or chunk size, and supports incremental compression for large files; and
iv) a self-learning protocol that refines compression efficiency over time through reinforcement learning, adapting to user behavior and new file formats based on feedback from compression and decompression performance.

2) The system as claimed in claim 1, wherein the AI module dynamically switches between lossless and lossy compression techniques based on file characteristics, such as data redundancy for text files or perceptual quality for media files, and applies hybrid compression to combine lossless metadata handling with lossy content compression where appropriate.

3) The system as claimed in claim 1, wherein the system integrates seamlessly with both local and cloud-based storage environments, automatically adjusting compression based on available storage space and network conditions, and supports API-based synchronization to ensure compatibility across platforms.

4) The system as claimed in claim 1, further comprising user-configurable settings that allow customization of compression policies for different file types, including options to set compression strength, select preferred protocols, and apply policies via a user-friendly interface or predefined profiles.

5) The system as claimed in claim 1, wherein the system automatically decompresses frequently accessed files to improve access speed, using a predictive model to identify high-priority files based on access patterns and maintaining a cache for rapid retrieval with minimal latency.

Documents

Application Documents

# Name Date
1 202541077330-STATEMENT OF UNDERTAKING (FORM 3) [13-08-2025(online)].pdf 2025-08-13
2 202541077330-REQUEST FOR EARLY PUBLICATION(FORM-9) [13-08-2025(online)].pdf 2025-08-13
3 202541077330-PROOF OF RIGHT [13-08-2025(online)].pdf 2025-08-13
4 202541077330-POWER OF AUTHORITY [13-08-2025(online)].pdf 2025-08-13
5 202541077330-FORM-9 [13-08-2025(online)].pdf 2025-08-13
6 202541077330-FORM FOR SMALL ENTITY(FORM-28) [13-08-2025(online)].pdf 2025-08-13
7 202541077330-FORM 1 [13-08-2025(online)].pdf 2025-08-13
8 202541077330-FIGURE OF ABSTRACT [13-08-2025(online)].pdf 2025-08-13
9 202541077330-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [13-08-2025(online)].pdf 2025-08-13
10 202541077330-EVIDENCE FOR REGISTRATION UNDER SSI [13-08-2025(online)].pdf 2025-08-13
11 202541077330-EDUCATIONAL INSTITUTION(S) [13-08-2025(online)].pdf 2025-08-13
12 202541077330-DRAWINGS [13-08-2025(online)].pdf 2025-08-13
13 202541077330-DECLARATION OF INVENTORSHIP (FORM 5) [13-08-2025(online)].pdf 2025-08-13
14 202541077330-COMPLETE SPECIFICATION [13-08-2025(online)].pdf 2025-08-13