Intelligent Waste Segregator ‘Binvent’ Bin

Abstract: The invention relates to an IoT-enabled and ML-driven intelligent waste segregator that automatically sorts waste into dynamically allocated categories. The working of the ‘IWS Binvent Bin’ is divided into 3 stages: Stage 1: Waste Receiver: This is the initial stage of the product’s usage and is accomplished by the automatic opening of the ‘solo waste input flap’ as soon as the user comes near the ‘IWS Binvent Bin’. The user’s action to dispose of the waste concludes at Stage 1. The opened flap closes automatically as soon as the waste is received. Stage 2: Waste Categorization: The waste from the entry flap enters the MLCV Box, where the waste is categorized using computer vision and its relevant data is pushed to the server. Stage 3: Waste Segregation: On the basis of the categorization, the electromechanical system actuates a total of 3 motors to drop the waste item into its correct compartment. There are three output bins in the IWS Binvent Bin.

Patent Information

Application #
202211035130
Filing Date
20 June 2022
Publication Number
01/2024
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Parent Application

Applicants

VITAL CARBON PRIVATE LIMITED
180/6 Sector 16-B Vasundhra Ghaziabad Uttar Pradesh

Inventors

1. Achal Kasturia
B-302 Chitrakoot Apartments Plot No 9 Sector 22 Dwarka
2. SHIVA BHASKAR BHAGAWATULA
180/6 Sector 16B Vasundhra Ghaziabad

Specification

Description: Stage 1: The user detection sensor, mounted on the front side of the ‘IWS Binvent Bin’ (Refer D1-P3), detects the user when the user is in the vicinity of the system to throw the waste. A threshold distance is set; any human closer than that distance will be considered a user of the bin. Once the user is detected, the entry flap on the front face (Refer D1-P2) opens inward through the use of a stepper motor and its accompanying linear mechanism. The flap remains open for a strategically allocated time interval, giving the user enough time to place their waste inside. The waste is received in the box (Refer D2-P3).
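
As a non-authoritative illustration of the Stage 1 logic, the following Python sketch models the proximity check and flap timing. The threshold distance, the open interval and the helper functions read_distance_cm and drive_flap are assumptions; the specification does not disclose concrete values or drivers.

import time

# Hypothetical parameters; the specification does not disclose the actual
# threshold distance or the flap-open interval.
USER_THRESHOLD_CM = 50      # any person closer than this is treated as a user
FLAP_OPEN_SECONDS = 5       # the "strategically allocated time interval"

def read_distance_cm():
    """Placeholder for the ultrasonic user detection sensor (Refer D1-P3)."""
    raise NotImplementedError("replace with the actual sensor driver")

def drive_flap(open_flap):
    """Placeholder for the stepper motor and linear mechanism on the entry flap (Refer D1-P2)."""
    raise NotImplementedError("replace with the actual motor driver")

def stage1_receive_waste():
    # Poll the sensor until a user is within the threshold distance.
    while read_distance_cm() > USER_THRESHOLD_CM:
        time.sleep(0.1)
    drive_flap(open_flap=True)      # open the entry flap inward
    time.sleep(FLAP_OPEN_SECONDS)   # give the user time to place the waste
    drive_flap(open_flap=False)     # close automatically once the waste is received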

Stage 2: As soon as the waste/trash is placed, the camera (Refer D2-P1) is triggered and an image of the trash is captured, which is then given as input to the machine learning model. This defines the category of the waste/trash. Waste segregation is finalized with the help of computer vision and machine learning.
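
A minimal sketch of the Stage 2 flow is given below, assuming a camera helper capture_image() and a classifier classify() that returns one score per category; both names and the three illustrative category labels are assumptions rather than details from the specification.

CATEGORIES = ["bin_1", "bin_2", "bin_3"]   # illustrative labels for the three output bins

def capture_image():
    """Placeholder for the camera trigger (Refer D2-P1)."""
    raise NotImplementedError("replace with the actual camera driver")

def classify(image):
    """Placeholder for CNN inference; returns one score per category."""
    raise NotImplementedError("replace with the trained machine learning model")

def stage2_categorize():
    image = capture_image()
    scores = classify(image)
    best = max(range(len(CATEGORIES)), key=lambda i: scores[i])
    category = CATEGORIES[best]
    # The category and related data would then be pushed to the server (see C-12).
    return category, image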

Stage 3: Once the category is defined, the waste automatically drops into the pre-defined bin for that specific type of waste. The algorithm is written so as to automatically drop the segregated waste into the pre-defined disposal bin (Refer D1-P4). This is achieved by the synchronous movement of 2 stepper motors, which move the waste identification tray or MLCV box (Refer D2-P4, P5 and P6), drop the waste into one of the 3 bins (Refer D2-P8), and then return to the original position to accept input again.
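
The Stage 3 motion could be sketched as below. The rotation and tilt angles and the move_steppers helper are placeholders; the specification only states that two stepper motors move the MLCV box synchronously, drop the waste into one of the three bins and return to the home position.

# Illustrative angle targets per category; the actual geometry is not disclosed.
BIN_ROTATION_DEG = {"bin_1": -90, "bin_2": 0, "bin_3": 90}
DROP_TILT_DEG = 45
HOME = (0, 0)

def move_steppers(rotation_deg, tilt_deg):
    """Placeholder for synchronously driving the two stepper motors (Refer D2-P4, P5 and P6)."""
    raise NotImplementedError("replace with the actual motor drivers")

def stage3_segregate(category):
    move_steppers(BIN_ROTATION_DEG[category], 0)              # carry the tray over the chosen bin
    move_steppers(BIN_ROTATION_DEG[category], DROP_TILT_DEG)  # tilt to drop the waste (Refer D2-P8)
    move_steppers(*HOME)                                      # return to accept input again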

Claims: What is claimed is:

C-1>>> The user detection sensor, mounted on the front side of the ‘IWS Binvent Bin’ (Refer D1-P3), detects the user when the user is in the vicinity of the system to throw the waste. This sensor is a proximity sensor of ultrasonic nature and is triggered when the user approaches.

C-2 >>> Automatic opening of the waste receiver bin (Refer D3-P2 and P3). To minimize contact with the bin, the opening of the IWS Binvent Bin is completely automated. This allows the user to dispose of the waste/trash without touching the bin and also acts as a physical barrier to disposal while the IWS Binvent Bin is completing its segregation cycle. Automating the bin opening also tackles the problem of overfilling, as the IWS Binvent Bin will stop accepting waste/trash if all bins are full (see the illustrative sketch after the claims).

C-3 >>> Automatic closure of the waste receiver bin after receiving the waste/trash (for the purpose of categorization and segregation).

C-4 >>> Triggering of a high-proficiency camera. The camera trigger function is optimized to minimize processing and latency. The camera is triggered only when waste/trash has been put inside the bin. This improves the user interaction and keeps processing time minimal.

C-5 >>> Automation of waste segregation facilitated by a Convolutional Neural Network based machine learning model implemented in Python (see the illustrative sketch after the claims).

C-6 >>> The process is algorithm based, and the run time (i.e. the time taken by the system to process one item of trash from input to disposal) is just 8.2 seconds.

C-7 >>> Integration of a robust electromechanical system to cater to all types of waste streams and avoid system failure at all points of the process.

C-8 >>> Self-cleaning (brush-supported) mechanism for the internal tray. The self-cleaning mechanism is designed for places where food waste is a common waste stream. The roller brush rotates along its axis as the plate moves, which causes the bristles to clean the surface. The surface is coated with hydrophobic and oleophobic agents to avoid greasy stains. Furthermore, the plate is perforated to let liquids and small crumbs fall directly into the designated bin. (Refer D6)

C-9 >>> Customizable computer vision model for use at varied collection points. Using computer vision for waste detection provides the versatility required for dealing with different waste streams. The IWS Binvent Bin can be trained on a local data set to aid it in better segregation. The customizable computer vision model has the flexibility to switch the bin category depending on the incoming waste streams.

C-10>>> Retention based system. A new type of waste is analyzed and recorded in the system for future reference. The algorithm keeps track of all the steps in the cycle and their live status throughout the process. In the case of a power failure mid-cycle, the retention-based feature allows the system to resume from the step at which it stopped (see the illustrative sketch after the claims). This feature increases the robustness of the whole system and effectively mitigates the possibility of algorithmic error during a running cycle.

C-11>>> Maintenance and cleaning scheduling enabled through IoT. The IWS Binvent assists housekeeping in streamlining their operations. Binvent can use GSM, WLAN or LoRa to relay useful information to the operations team. This feature helps housekeeping schedule cleaning when required (see the illustrative sketch after the claims).

C-12>>> Dynamic improvement of the ML model and second-stage verification of categorization enabled through IoT. Every input trash image classified by the machine learning model is sent to cloud storage using the same interface that is used for communicating with housekeeping. These images are later used by the machine learning algorithm to improve waste detection efficiency (see the illustrative sketch after the claims).

C-13 >>> Using Artificial Intelligence to determine the waste generation patterns of the facility. The IWS Binvent has the capability to understand waste generation patterns and provide insights for waste mitigation. Binvent maps the facility and automates the waste auditing process.

C-14 >>> The IWS Binvent is equipped with a smart display for outdoor media advertisements and awareness campaigns. Leveraging the data collected by the proximity sensors, the IWS Binvent can accurately provide quantitative data on outreach and interactions. The data collected is completely anonymous and provides real-time metrics for advertisers. (Refer D1-P1)
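
Regarding C-2, the gating condition that keeps the entry flap shut while a segregation cycle is running or when all bins are full could be sketched in Python as below; the fill-level inputs and the 95% threshold are assumptions, as the claim does not specify how fullness is measured.

def bins_full(fill_levels, threshold=0.95):
    """Return True when every output bin is at or above the assumed fill threshold."""
    return all(level >= threshold for level in fill_levels)

def may_open_flap(segregation_in_progress, fill_levels):
    # The flap stays shut while a cycle is running and when all bins are full (C-2).
    return not segregation_in_progress and not bins_full(fill_levels)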
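
Regarding C-5, a minimal Convolutional Neural Network classifier of the kind named in the claim could be built in Python with tf.keras as below; the layer sizes, 128x128 input shape and three-class output are illustrative assumptions, not the disclosed model.

import tensorflow as tf

def build_waste_classifier(num_classes=3, input_shape=(128, 128, 3)):
    # Small illustrative CNN: two convolution/pooling stages followed by dense layers.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_waste_classifier()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])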
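
Regarding C-10, one way to realize the retention behaviour is to persist the live step of each cycle to non-volatile storage and read it back on boot; the file name, step names and JSON format below are assumptions made for illustration.

import json
import os

STATE_FILE = "cycle_state.json"          # hypothetical persistence location
STEPS = ["receive", "categorize", "segregate", "home"]

def save_step(step, payload=None):
    """Record the live step so an interrupted cycle can be resumed (C-10)."""
    with open(STATE_FILE, "w") as f:
        json.dump({"step": step, "payload": payload or {}}, f)

def load_resume_point():
    """Return the step to resume from after a power failure, or the first step."""
    if not os.path.exists(STATE_FILE):
        return STEPS[0], {}
    with open(STATE_FILE) as f:
        state = json.load(f)
    return state["step"], state.get("payload", {})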
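
Regarding C-11, the claim names GSM, WLAN or LoRa as possible links but no message format; the sketch below composes an assumed status message and hands it to a placeholder transport function.

import json
import time

def send_over_iot_link(message):
    """Placeholder transport; the claim allows GSM, WLAN or LoRa as the physical link."""
    raise NotImplementedError("replace with the modem or radio driver in use")

def report_maintenance_status(fill_levels, cleaning_threshold=0.8):
    # Assumed message format; the claim only states that useful information is
    # relayed to the operations team for scheduling cleaning.
    message = json.dumps({
        "timestamp": time.time(),
        "fill_levels": fill_levels,
        "needs_cleaning": max(fill_levels) >= cleaning_threshold,
    })
    send_over_iot_link(message)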
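
Regarding C-12, the transfer of each classified image to cloud storage could be sketched as a plain HTTP upload using the requests library; the endpoint URL and field names are hypothetical, since the claim only states that images travel over the same interface used for housekeeping communication.

import requests

def upload_classified_image(image_path, predicted_category,
                            endpoint="https://example.com/binvent/images"):
    """Send an input image and its predicted category to cloud storage (C-12)."""
    with open(image_path, "rb") as f:
        response = requests.post(
            endpoint,
            files={"image": f},
            data={"category": predicted_category},
            timeout=10,
        )
    response.raise_for_status()   # the stored images later feed model retraining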

Documents

Application Documents

# Name Date
1 202211035130-STATEMENT OF UNDERTAKING (FORM 3) [20-06-2022(online)].pdf 2022-06-20
2 202211035130-POWER OF AUTHORITY [20-06-2022(online)].pdf 2022-06-20
3 202211035130-FORM FOR STARTUP [20-06-2022(online)].pdf 2022-06-20
4 202211035130-FORM FOR SMALL ENTITY(FORM-28) [20-06-2022(online)].pdf 2022-06-20
5 202211035130-FORM FOR SMALL ENTITY [20-06-2022(online)].pdf 2022-06-20
6 202211035130-FORM 1 [20-06-2022(online)].pdf 2022-06-20
7 202211035130-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-06-2022(online)].pdf 2022-06-20
8 202211035130-DRAWINGS [20-06-2022(online)].pdf 2022-06-20
9 202211035130-DECLARATION OF INVENTORSHIP (FORM 5) [20-06-2022(online)].pdf 2022-06-20
10 202211035130-COMPLETE SPECIFICATION [20-06-2022(online)].pdf 2022-06-20
11 202211035130-FORM 18 [17-08-2022(online)].pdf 2022-08-17