
Nursing Assistance System

Abstract: A nursing assistance system (100) is disclosed that includes a seating assembly (116), a detection unit (112), and a cleaning unit (114). A posture management system including the seating assembly (116) holds and moves a patient in a predefined position during excretion and cleaning of the patient. The detection unit (112) detects residual excreta remains attached to one or more body parts of the patient post the excretion, to one or more portions of the seating assembly (116), or to a combination thereof, using an Artificial Intelligence (AI) model, and determines corresponding positions of the body parts and portions of the seating assembly (116) having the excreta. Moreover, the cleaning unit (114) cleans the excreta stuck on the one or more body parts and the one or more portions of the seating assembly (116), based on the detected positions of the body parts and the portions of the seating assembly (116). An odour management system including a top cover manages odour in the system.


Patent Information

Application #
Filing Date
18 May 2023
Publication Number
21/2024
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Parent Application

Applicants

Naatscorp Private Limited
Naatscorp Private Limited, Green Garden Road, CSEZ PO, Kochi - 682037, Kerala, India
VISWANATHAN, Nidhinnath Chakkanath
Chakkanath House, Kaipamangalam PO, Thrissur - 680681, Kerala, India

Inventors

1. VISWANATHAN, Nidhinnath Chakkanath
Chakkanath House, Kaipamangalam PO, Thrissur - 680681, Kerala, India
2. LOHIDHAKSHAN, Prajod Thiruvambattil
Thiruvambattil(H), Thaikkad, Guruvayour - 680104, Kerala, India
3. ABHIRAJ, R
Kavungil House, Nenmini Post, Nenmini, Guruvayur, Thaikkad, Thrissur - 680104, Kerala, India

Specification

TECHNICAL FIELD

The present disclosure relates to aspects of hospital beds, and particularly, to a nursing assistance system.

BACKGROUND

Typically, a patient bed is a bed designed to allow a patient to be nursed while taking proper rest in a medical care facility. Conventionally, patient beds were straight beds with a metal structure, which only allowed the patient to lie down while resting. With significant advancements in the industry, patient beds, which may interchangeably be referred to as hospital beds, now have the capability of changing the posture of the patient by actuation of certain portions of the hospital bed.

Additionally, existing systems for providing nursing care to patients are ineffective due to their inability to identify and dispose of excreta. The existing systems are also incapable of detecting anorectal issues or notifying caretakers of abnormalities in excretion or excretion consistency. Further, the conventional systems are not capable of testing the excreta of the patient and cannot clean excreta stuck onto the intergluteal region of the patient and the nearby areas of the bed. The patient is usually dependent on a caretaker's assistance for excreting. This poses a problem for patients who are completely bedridden or have difficulty getting in and out of bed, whether due to surgery, a bad fall, ageing, or dementia. In addition, the patient's dependence on a caretaker for bare-minimum tasks such as excretion may adversely affect the patient's mental health. Moreover, the conventional systems fail to replace contaminated fabric material that comes into contact with patients, thereby increasing the patient's risk of infection.

Accordingly, there remains a need for a technique to provide nursing assistance for bedridden patients.

SUMMARY

This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention nor intended to determine the scope of the invention.

In an embodiment of the present disclosure, a nursing assistance system is disclosed that includes a seating assembly, a detection unit, a control unit, and a cleaning unit. The seating assembly is used to hold and move a patient in a predefined position during excretion and cleaning of the patient and to assist in posture management. Further, the detection unit is in communication with the seating assembly and is used to detect residual excreta remains attached to one or more body parts of the patient post the excretion, to one or more portions of the seating assembly, or to a combination thereof, using an Artificial Intelligence (AI) model, and to determine corresponding positions of the body parts and portions of the seating assembly having the excreta. Moreover, the cleaning unit is in communication with the detection unit and is used to clean the excreta stuck on the one or more body parts and the one or more portions of the seating assembly, based on the detected positions of the body parts and the portions of the seating assembly.

In another embodiment of the disclosure, a method for detection and cleaning of excreta using a nursing assistance system is disclosed. The method includes receiving one or more images of a pelvic region and a perineal region of a patient captured using a detection unit associated with the nursing assistance system. Further, the method includes detecting at least one of one or more parameters associated with the patient, the nursing assistance system, or a combination thereof. Moreover, the method includes determining a predefined event by analysing at least one of the detected parameters and the received images using a control unit communicatively coupled to the nursing assistance system. The method also includes generating an input signal based on the determined predefined event using the control unit. Further, the method includes receiving the input signal from the control unit at a seating assembly in the nursing assistance system. Moreover, the method includes actuating the seating assembly using the input signal from the control unit to help the patient assume a predefined position that facilitates excretion of the excreta onto a disposable fleece unit in the nursing assistance system. Furthermore, the method includes removing the excreta from the nursing assistance system using the disposable fleece unit. Additionally, the method includes cleaning the pelvic region and the perineal region of the patient and portions of the seating assembly using a cleaning bud along with a cleaning unit in the nursing assistance system to remove any excess excreta stuck on the pelvic region and the perineal region of the patient. Lastly, the method includes disposing of the excreta and the cleaning bud using a fleece bag from the disposable fleece unit.

According to the present disclosure, the nursing assistance system reduces the manual labour required of a nurse/caretaker. Further, the posture management system of the nursing assistance system prevents the patient from developing bed sores. Moreover, the seating assembly facilitates excretion by positioning the patient in a zero-gravity position. Furthermore, the use of AI-ML facilitates remote monitoring of the patient along with analysis of the excreta in case any abnormality arises.

To further clarify the advantages and features of the present disclosure, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail in the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:

Figure 1 illustrates a schematic view of a nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 2 illustrates a schematic view of the positions of cameras/sensors in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 3 illustrates a schematic view of the nursing assistance system in a predefined position with a human-machine interface (HMI) screen, in accordance with an embodiment of the present disclosure;
Figure 4 illustrates a side view of the nursing assistance system in the predefined position, in accordance with an embodiment of the present disclosure;
Figure 5 illustrates a perspective view of a central functioning unit in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 6 illustrates a front view of the central functioning unit in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 7 illustrates a rear view of the central functioning unit in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 8 illustrates a bottom view of the central functioning unit in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 9 illustrates a side view of the central functioning unit in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 10 illustrates a top view of the central functioning unit in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 11 illustrates a perspective view of a disposable fleece unit in the central functioning unit, in accordance with an embodiment of the present disclosure;
Figure 12 illustrates a rear view of the disposable fleece unit in the central functioning unit, in accordance with an embodiment of the present disclosure;
Figures 13A and 13B illustrate different views of the disposable fleece unit in the central functioning unit, in accordance with an embodiment of the present disclosure;
Figure 14 illustrates a front view of a disposable fleece unit with a fleece bag in the central functioning unit, in accordance with an embodiment of the present disclosure;
Figure 15 illustrates a perspective view of a fleece sheet cartridge in the disposable fleece unit, in accordance with an embodiment of the present disclosure;
Figure 16 illustrates a perspective view of a fleece cutting unit in the disposable fleece unit, in accordance with an embodiment of the present disclosure;
Figures 17A and 17B illustrate different views of the fleece cutting unit and a fleece sealing assembly in the disposable fleece unit, in accordance with an embodiment of the present disclosure;
Figures 18A and 18B illustrate different views of side sealing rollers in the disposable fleece unit, in accordance with an embodiment of the present disclosure;
Figure 19 illustrates a perspective view of a fleece sealing assembly in the disposable fleece unit, in accordance with an embodiment of the present disclosure;
Figure 20 illustrates a perspective view of a seating assembly in the central functioning unit, in accordance with an embodiment of the present disclosure;
Figures 21A and 21B illustrate different views of a cleaning arm in a cleaning unit, in accordance with an embodiment of the present disclosure;
Figure 22 illustrates a side view of the cleaning unit in the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 23 illustrates a schematic of the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 24 illustrates a flowchart for a method for detection and cleaning of excreta using the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 25 illustrates a method for the detection of excreta using the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 26 illustrates a method for posture detection using the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 27 illustrates a method for position detection and measurement of distance from reference points using the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 28 illustrates a method for the detection of anal depth using the nursing assistance system, in accordance with an embodiment of the present disclosure;
Figure 29 illustrates a method for initiating the operation of the cleaning arm, according to an embodiment of the present disclosure; and
Figure 30 illustrates a method for actuating the cleaning arm, according to an embodiment of the present disclosure.

Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION OF FIGURES

For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.

The term “some” as used herein is defined as “none, or one, or more than one, or all.” Accordingly, the terms “none,” “one,” “more than one,” “more than one, but not all” or “all” would all fall under the definition of “some.” The term “some embodiments” may refer to no embodiments or to one embodiment or to several embodiments or to all embodiments. Accordingly, the term “some embodiments” is defined as meaning “no embodiment, or one embodiment, or more than one embodiment, or all embodiments.”

The terminology and structure employed herein are for describing, teaching and illuminating some embodiments and their specific features and elements and do not limit, restrict or reduce the spirit and scope of the claims or their equivalents.

More specifically, any terms used herein such as but not limited to “includes,” “comprises,” “has,” “consists,” and grammatical variants thereof do NOT specify an exact limitation or restriction and certainly do NOT exclude the possible addition of one or more features or elements, unless otherwise stated, and furthermore must NOT be taken to exclude the possible removal of one or more of the listed features and elements, unless otherwise stated with the limiting language “MUST comprise” or “NEEDS TO include.”

Whether or not a certain feature or element was limited to being used only once, either way, it may still be referred to as “one or more features” “one or more elements” “at least one feature” or “at least one element.” Furthermore, the use of the terms “one or more” or “at least one” feature or element does NOT preclude there being none of that feature or element, unless otherwise specified by limiting language such as “there NEEDS to be one or more . . . ” or “one or more element is REQUIRED.”

Unless otherwise defined, all terms, and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by one having an ordinary skill in the art.

Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements presented in the attached claims. Some embodiments have been described for the purpose of illuminating one or more of the potential ways in which the specific features and/or elements of the attached claims fulfil the requirements of uniqueness, utility, and non-obviousness.

Use of the phrases and/or terms such as but not limited to “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “a further embodiment”, “furthermore embodiment”, “additional embodiment” or variants thereof do NOT necessarily refer to the same embodiments. Unless otherwise specified, one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although one or more features and/or elements may be described herein in the context of only a single embodiment, or alternatively in the context of more than one embodiment, or further alternatively in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any feature and/or element described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.

Any particular and all details set forth herein are used in the context of some embodiments and therefore should NOT be necessarily taken as limiting factors to the attached claims. The attached claims and their legal equivalents can be realized in the context of embodiments other than the ones used as illustrative examples in the description below.

Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.

Figures 1 to 4 have been used to discuss the details of a nursing assistance system 100. Figure 1 illustrates a schematic view of a nursing assistance system 100, in accordance with an embodiment of the present disclosure. Figure 2 illustrates a schematic view of the positions of a plurality of cameras in the nursing assistance system 100, in accordance with an embodiment of the present disclosure. Figure 3 illustrates a schematic view of the nursing assistance system 100 in a predefined position with a human-machine interface (HMI) screen, in accordance with an embodiment of the present disclosure. Figure 4 illustrates a side view of the nursing assistance system 100 in the predefined position, in accordance with an embodiment of the present disclosure.

The nursing assistance system 100 is a hospital bed designed to provide comfort to bedridden patients. The nursing assistance system 100 is used to detect and clean excreta, urine, and blood of the bedridden patient. The nursing assistance system 100 may include but is not limited to a first seating portion 102, a headboard 104, a second seating portion 106, a leg board 108, a central functioning unit 110, a detection unit 112, and a cleaning unit 114. The central functioning unit 110 may further include a seating assembly 116, and a disposable fleece unit 118.

The first seating portion 102 may tilt in order to move an upper limb of the patient. The first seating portion 102 may be tilted so as to bring the upper limb of the patient into a predefined position. Further, the headboard 104 may be positioned adjacent to the first seating portion 102. The headboard 104 is used to limit the movement of the head for the safety of the patient. Moreover, the second seating portion 106 may be positioned opposite the first seating portion 102. The second seating portion 106 is used to support the thigh/leg region of the patient. In an embodiment, the second seating portion 106 may fold along its width to facilitate seating of the patient while excreting and to keep the patient in a comfortable posture as per their convenience. Furthermore, the leg board 108 is positioned adjacent to the second seating portion 106 and is used to limit the movement of the leg region of the patient.

The leg board 108 may include an odour management system 120 which may be used to facilitate ventilation and compress/entrap odour during the excretion and cleaning. The odour management system 120 may include a plurality of exhaust fans 122, activated carbon filters, and ionisers, which may be used to circulate and remove the air present around the pelvic and perineal region of the patient to remove the odour of excretion. Further, the nursing assistance system 100 may include a top cover 124 which may be disposed over the second seating portion 106. The top cover 124 may be used to cover the lower body of the patient and assist the odour management system 120 by containing the odour-filled air between the top cover 124 and the second seating portion 106, from where it can be removed from the nursing assistance system 100.

In an embodiment, the predefined position may be a zero-gravity position. The patient is supported in the zero-gravity position to facilitate excretion, as the bedridden patient may not be able to move on their own. Further, in the zero-gravity position, the patient is not required to strain to excrete faeces onto the disposable fleece unit 118.

Referring to Figure 1 and Figure 2, the detection unit 112 is in communication with the seating assembly 116. The detection unit 112 is used to detect residual excreta remains attached to one or more body parts of the patient post excretion and on one or more portions of the seating assembly 116 using an artificial intelligence (AI) model. Further, the detection unit 112 is used to determine the corresponding positions of the body parts and portions of the seating assembly 116 having the excreta. Moreover, the detection unit 112 may include but is not limited to a control unit (not shown), a plurality of sensors (not shown), and the plurality of cameras.

The plurality of sensors in the detection unit 112 may include, but is not limited to, proximity sensors, fluid detection sensors, image processing sensors, motion sensors, pressure detection sensors, load cells, and temperature sensors. The plurality of sensors may be configured to detect one or more parameters associated with the patient, the nursing assistance system 100, or both. Further, the detection unit 112 may be configured to capture one or more images of a pelvic region and a perineal region of the patient using the plurality of cameras. Moreover, the one or more parameters associated with the patient include at least one of a posture, a position, a presence of excreta or blood, and an anal depth of the patient while in the seating assembly 116.

The plurality of cameras is installed across the second seating portion 106 of the nursing assistance system 100. The plurality of cameras may be used to provide visual assistance for the detection and cleaning of the excreta from the patient and the nursing assistance system 100. Further, the plurality of cameras may include, but is not limited to, a posture detection camera 126, an excreta detection camera 128, and a location detection camera 130 used for capturing one or more images of the body positions, the pelvic region, and the perineal region of the patient. The posture detection camera 126 may be a Red, Green, Blue-Infrared (RGB-IR) camera which may be used to detect the posture and excreta of the patient. Further, the excreta detection camera 128 may be an RGB, RGB-IR, or thermal camera which is used in conjunction with the posture detection camera 126 in the nursing assistance system 100 for excreta detection. Furthermore, the location detection camera 130 may be an RGB-Depth (RGB-D) camera which may be used to detect the anal depth of the patient and the coordinates of excreta stuck on body parts or on the system 100.

The control unit is communicatively coupled to the cleaning unit 114 and the seating assembly 116 and is used to control the functioning of the nursing assistance system 100. The control unit is configured to receive at least one of the detected one or more parameters associated with the patient and the nursing assistance system 100, the captured images of the posture, body positions, pelvic region, and perineal region of the patient from the plurality of cameras, and the feedback from the sensors, including but not limited to the fluid detection sensors, proximity sensors, and pressure sensors. Further, the control unit is configured to determine a predefined event upon analysing at least one of the received parameters and the received images. Moreover, the control unit is configured to generate an input signal based on the determined predefined event. The aforementioned predefined event may include at least one of excretion, menstruation, and urination by the patient, and the predefined event is determined by analysing at least one of the received parameters and the received images using AI-ML techniques.
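
For illustration only, the event-determination logic described above might be sketched as follows. The data class, function names, and threshold values are assumptions made for this sketch and do not appear in the specification.

```python
# Illustrative sketch only: fuses hypothetical sensor parameters with an AI
# image score to name a predefined event. Thresholds and names are assumed,
# not taken from the specification.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReadings:
    fluid_detected: bool      # from the fluid detection sensors
    pressure_delta: float     # from the pressure detection sensors / load cells
    temperature_c: float      # from the temperature sensors

def determine_event(readings: SensorReadings, image_score: float) -> Optional[str]:
    """Return one of the predefined events, or None to keep monitoring.

    image_score stands in for the AI-ML model's confidence that excreta,
    blood, or urine is visible in the captured frames.
    """
    if readings.fluid_detected and image_score > 0.8:
        return "excretion"
    if readings.fluid_detected and readings.temperature_c > 35.0:
        return "urination"
    if image_score > 0.9:
        return "menstruation"
    return None

# The control unit would generate the input signal for the seating assembly
# whenever determine_event returns a non-None event.
```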

Referring to Figure 3, the HMI 302 is positioned on the open edge of the top cover 124. The HMI 302 facilitates communication of the patient with doctors or friends if the patient is conscious. The HMI 302 may also be used to order food, to keep track of important data such as the health of the patient in the form of medical reports, and to share the appropriate data with doctors and diagnostic centres.

In an embodiment, a remote control (not shown) may be used to activate the HMI 302, where the remote control for the HMI 302 may use switches, voice commands, or gesture control. The remote control may further be used to move the HMI 302 to a predefined location based on the visibility range of the patient.

Figure 5 to Figure 10 have been used in order to discuss the details of the central functioning unit 110. Figure 5 illustrates a perspective view of a central functioning unit 110 in the nursing assistance system 100, in accordance with an embodiment of the present disclosure. Figure 6 illustrates a front view of the central functioning unit 110 in the nursing assistance system 100, in accordance with an embodiment of the present disclosure. Figure 7 illustrates a rear view of the central functioning unit 110 in the nursing assistance system 100, in accordance with an embodiment of the present disclosure. Figure 8 illustrates a bottom view of the central functioning unit 110 in the nursing assistance system 100, in accordance with an embodiment of the present disclosure. Figure 9 illustrates a side view of the central functioning unit 110 in the nursing assistance system 100, in accordance with an embodiment of the present disclosure. Figure 10 illustrates a top view of the central functioning unit 110 in the nursing assistance system 100, in accordance with an embodiment of the present disclosure.

The central functioning unit 110 is the part of the system that is used for cleaning the excreta of the patient. The central functioning unit 110 may include, but is not limited to, the seating assembly 116 and the disposable fleece unit 118. Both the seating assembly 116 and the disposable fleece unit 118 are configured to move across the height of the nursing assistance system 100. In one example, the seating assembly 116 may move in an upward direction to hold the pelvic and perineal region of the patient, while the disposable fleece unit 118 may move in a downward direction for sealing and cutting the fleece sheets in order to form the fleece bags and to provide space for the cleaning unit 114 to clean the excreta stuck on the body parts of the patient and on the seating assembly 116. The details of the disposable fleece unit 118 and the seating assembly 116 are explained with reference to Figures 11 to 19 and Figure 20, respectively.

Figure 11 to Figure 19 illustrate various aspects of the disposable fleece unit 118. Specifically, Figure 11 illustrates a perspective view of the disposable fleece unit 118 in the central functioning unit 110 whereas Figure 12 illustrates a rear view of the disposable fleece unit 118 in the central functioning unit 110, in accordance with an embodiment of the present disclosure. Figures 13A and 13B illustrate different views of the disposable fleece unit 118 in the central functioning unit 110, in accordance with an embodiment of the present disclosure. Figure 14 illustrates a front view of the disposable fleece unit 118 with a fleece bag in the central functioning unit 110, in accordance with an embodiment of the present disclosure. Figure 15 illustrates a perspective view of a fleece sheet cartridge 1104 in the disposable fleece unit 118, in accordance with an embodiment of the present disclosure. Figure 16 illustrates a perspective view of a fleece horizontal sealing and cutting unit in the disposable fleece unit 118, in accordance with an embodiment of the present disclosure. Figures 17A and 17B illustrate different views of the horizontal sealing, a fleece cutting unit and a fleece sealing assembly in the disposable fleece unit 118, in accordance with an embodiment of the present disclosure. Figures 18A and 18B illustrate different views of side sealing rollers in the disposable fleece unit 118, in accordance with an embodiment of the present disclosure. Figure 19 illustrates a perspective view of a fleece sealing assembly in the disposable fleece unit 118, in accordance with an embodiment of the present disclosure.

The disposable fleece unit 118 may be used to contain and pack excreta of the patient to facilitate the cleaning of the patient. Further, the disposable fleece unit 118 may be in communication with the detection unit 112. The disposable fleece unit 118 may have a plurality of fleece sheets 1102 in the fleece sheet cartridge 1104 made of one of a biodegradable material, a plastic, a polyurethane (PU), cotton, paper, fabric, or a combination of these. The disposable fleece unit 118 collects the excreta, urine, and blood samples in the fleece bag 1112 for diagnosis. The disposable fleece unit 118 may include but is not limited to a housing 1106, the fleece sheet cartridge 1104, a fleece sealing assembly 1108, and a fleece cutting unit 1110. The housing 1106 is used to house components of the disposable fleece unit 118. The housing 1106 may be made in the form of a box, with a storage unit that may be used to store a fleece bag 1112 produced post-excretion from the patient.

In one example, the fleece sheet cartridge 1104 may be used to supply the fleece sheets 1102 to hold the excreta on the nursing assistance system 100 and to contain blood or urine from the user or urine pumped in from a urine bag (not shown) which may be used to collect urine using a catheter (not shown). In another example, the urine may be absorbed using the fleece sheets 1102. The fleece sheet cartridge 1104 contains a roll of fleece sheets 1102 which may be mounted on opposite sides of the housing 1106. The fleece sheet 1102 may move from the fleece sheet cartridge 1104 onto the housing 1106 from both ends of the housing 1106 using a first pair of dummy rollers 1114 placed on opposite edges of the housing 1106. Further, post excretion the disposable fleece unit 118 may be activated and the fleece sheets 1102 may move onto the fleece sealing assembly 1108 using a second pair of dummy rollers 1116, placed within an excreta gap 1118. The excreta gap 1118 may allow the excreta stuck on the fleece sheets 1102 and cleaning bud to pass therethrough prior to the formation of the fleece bag 1112.

The fleece sealing assembly 1108 is used to seal the fleece sheet 1102 to form a fleece bag 1112, which is used to hold the excreta and control odour. The fleece bag 1112 is formed by sealing the sides of the fleece sheets 1102. Below the excreta gap 1118, two pairs of side sealing rollers 1120 may be placed adjacent to the housing 1106. Each of the two pairs of side sealing rollers 1120 may be heated in order to seal the fleece sheets 1102 to form the fleece bag 1112. The heating may take place using a heater 1122 designed to operate at a predefined temperature in order to seal the fleece sheets 1102 into the fleece bag 1112.

In an embodiment, the sealing of the fleece bag 1112 performed by the fleece sealing assembly 1108 may be performed using at least one of thermal sealing, glueing, suturing, or stapling.

In another embodiment, the urine collected in the catheter may also be pumped into the fleece bag 1112 for disposal. Further, once the fleece sheet 1102 has been pulled from the fleece sheet cartridge 1104 on both sides of the central functioning unit 110, the fleece sheet 1102 may be replaced, and thereafter the cleaning unit 114 is activated for cleaning the pelvic region, and the perineal region of the patient.

The fleece cutting unit 1110 is used to separate the fleece sheet 1102 from the fleece sheet cartridge 1104 and to seal the top edge of the fleece bag 1112 formed by the fleece sealing assembly 1108. The fleece cutting unit 1110 may include, but is not limited to, a pair of fleece cutting blades 1124 and a cutting blade actuator 1126. Each of the pair of fleece cutting blades 1124 is used to cut the fleece sheet 1102 to form the fleece bag 1112. At least one of the pair of fleece cutting blades 1124 is coupled with the cutting blade actuator 1126. The cutting blade actuator 1126 is used to actuate at least one of the pair of fleece cutting blades 1124 linearly in order to press the fleece sheet 1102 between the pair of fleece cutting blades 1124 and separate the fleece sheet 1102 from the fleece sheet cartridge 1104. Further, each of the pair of fleece cutting blades 1124 is coupled with the heater 1122, which may be used to seal the fleece sheets 1102 simultaneously with the cutting of the fleece sheet 1102 to form the fleece bag 1112.

Figure 20 illustrates a perspective view of a seating assembly 116 in the central functioning unit 110, in accordance with an embodiment of the present disclosure. The seating assembly 116 is used to hold and move the patient in a predefined position during excretion and cleaning of the patient and assist in posture management. The seating assembly 116 may be communicatively connected to the detection unit 112. The seating assembly 116 may include but is not limited to a pair of holding seats 2002, a pair of conveyor motors 2004, and a pair of conveyor mechanisms 2006.

The pair of holding seats 2002 may be placed on opposite edges of the nursing assistance system 100. Upon receiving communication from the detection unit 112, the pair of holding seats 2002 moves towards the patient on the nursing assistance system 100, and each of the pair of holding seats 2002 is positioned to hold the pelvic region and perineal region of the patient. Once the pair of holding seats 2002 is successfully positioned under the pelvic and perineal region, the pair of holding seats 2002 holds the pelvic and perineal region of the patient in the zero-gravity position along with the tilted first seating portion 102 and the second seating portion 106.

Further, the pair of conveyor motors 2004 is used to move and lock the pair of holding seats 2002 in place to hold the pelvic and perineal region of the patient in position. The pair of conveyor mechanisms 2006 is positioned on the opposite ends of the surface of the housing 1106 of the disposable fleece unit 118. The pair of conveyor motors 2004 may allow movement of the pair of holding seats 2002 along the pair of conveyor mechanisms 2006.

In an embodiment, the conveyors on the holding seats 2002 may move in a counterclockwise direction at the left side of the seating assembly 116 and in a clockwise direction on the right side to widen the buttocks after catching hold of them, so as to provide better visibility of excreta attached to the pelvic region and the perineal region of the patient as well as better access for the cleaning unit 114. A set of safety sensors is installed in the seating assembly 116 to detect the pressure applied on the body of the patient for safety.

Figures 21A and 21B illustrate different views of a cleaning arm 2102 in the cleaning unit 114, in accordance with an embodiment of the present disclosure. Figure 22 illustrates a side view of the cleaning unit 114 in the nursing assistance system 100, in accordance with an embodiment of the present disclosure. The cleaning unit 114 is in communication with the detection unit 112 and is used to clean the excreta stuck on the one or more body parts and the one or more portions of the seating assembly 116, based on the detected positions of the body parts, and the portions of the seating assembly 116. The cleaning unit 114 is used to clean the excreta from the body parts of the patient while the patient is held in the seating assembly 116. The cleaning unit 114 may include but is not limited to a base 2104, the cleaning arm 2102, and a bud holder 2106.

The base 2104 may be used to mount the cleaning unit 114 onto the nursing assistance system 100. The cleaning arm 2102 is mounted on the base 2104 and moves in a predefined motion based on the detected positions of the body parts and the portions of the seating assembly 116 having the excreta. The bud holder 2106 is movably mounted on the cleaning arm 2102 and is used to hold a cleaning bud picked from a bud cartridge (not shown), which may be used to clean the body parts having the excreta. Post cleaning of the excreta, the cleaning bud is disposed into the fleece bag 1112 carrying the excreta of the patient. The disposable fleece unit 118 is used to prevent spillage of the excreta and assist in the containment of the excreta. Further, the cleaning unit 114 coupled with the disposable fleece unit 118 may be used to collect the excreta, the urine, and the blood samples in the fleece bag 1112 for diagnosis and to assist in remote patient monitoring.

In an embodiment, the cleaning arm 2102 may be fixed at a different position of the nursing assistance system 100 and may be controlled remotely by a caretaker/nurse using an augmented reality (AR)/virtual reality (VR) device and/or neural sensor bands/telerobotics for remote patient touch support, along with a gesture detection system to feed the patient. The cleaning unit 114 may also collect urine and stool samples and test the samples in an embedded testing unit, which assists in remote patient diagnosis.

Figure 23 illustrates a schematic of the nursing assistance system 100, in accordance with an embodiment of the present disclosure. The nursing assistance system 100 may include different components that may be used for detecting and cleaning the excreta using the nursing assistance system 100. For instance, the nursing assistance system 100 may include a processor 2302, a memory 2304, module(s) 2306, and data 2308. The memory 2304, in one example, may store the instructions to carry out the operations of the modules 2306. The modules 2306 and the memory 2304 may be coupled to the processor 2302.

The processor 2302 can be a single processing unit or several units, all of which could include multiple computing units. The processor 2302 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 2302 is configured to fetch and execute computer-readable instructions and data stored in the memory 2304.

The memory 2304 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.

The modules 2306, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement data types. The modules 2306 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.

Further, the modules 2306 can be implemented in hardware, as instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor such as the processor 2302, a state machine, a logic array, or any other suitable device capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks, or the processing unit can be dedicated to performing the required functions. In another embodiment of the present disclosure, the modules 2306 may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the described functionalities. Further, the data 2308 serves, amongst other things, as a repository for storing data processed, received, and generated by one or more of the modules 2306. The data 2308 may include information and/or instructions to perform activities by the processor 2302.

The module(s) 2306 may perform different functionalities which may include, but are not limited to, detecting and cleaning of the excreta using the nursing assistance system 100. Accordingly, the module(s) 2306 may include, but are not limited to, an excreta detection module 2310, a posture detection module 2312, a position detection module 2314, and an anal depth detection module 2316. Further, each of the aforementioned modules is configured to implement an artificial intelligence/machine learning model to perform its respective operation.

The excreta detection module 2310 is used to detect, in the captured one or more images, the presence of excreta, blood, or abnormalities on the pelvic region and the perineal region of the patient along with the seating assembly 116 in the nursing assistance system 100. Further, the posture detection module 2312 is used to detect the posture of the patient using an imaging grid. Moreover, the position detection module 2314 is used to detect the position of the patient as one of a supine position or a non-supine position and to map the position of body parts and the posture of the patient. Furthermore, the anal depth detection module 2316 is used to track coordinates of the pelvic region and the perineal region of the patient in order to target cleaning of the pelvic region and perineal region of the patient and the seating assembly 116 of the nursing assistance system 100 using the cleaning unit 114.

Figure 24 illustrates a flowchart for a method 2400 for the detection and cleaning of excreta using the nursing assistance system 100, in accordance with an embodiment of the present disclosure. The order in which the steps of the method 2400 are described below is not intended to be construed as a limitation, and any number of the described steps can be combined in any appropriate order to execute the method 2400 or an alternative method. Additionally, individual steps may be deleted from the method 2400 without departing from the scope of the subject matter described herein.

In an embodiment, the method 2400 may be performed partially or completely by the nursing assistance system 100 shown in Figure 23. Prior to the beginning of the method 2400, the excreta detection module 2310, the posture detection module 2312, the position detection module 2314, and the anal depth detection module 2316 may train the artificial intelligence model. In order to train the artificial intelligence model, the excreta detection module 2310, the posture detection module 2312, the position detection module 2314, and the anal depth detection module 2316 may map the contexts with one or more components from a training set. As mentioned before, the components may include one or more images and one or more parameters.

The method 2400 begins at step 2402, at which the control unit receives one or more images of a pelvic region and a perineal region of a patient captured using the detection unit 112 associated with the nursing assistance system 100. At step 2404, the detection unit 112 detects at least one of the one or more parameters associated with the patient, the nursing assistance system 100, or a combination thereof. At step 2406, the control unit determines the predefined event by analysing at least one of the detected parameters, the received images, and the sensory signals, using the control unit communicatively coupled to the nursing assistance system 100. At step 2408, the control unit generates an input signal based on the determined predefined event. At step 2410, the seating assembly 116 receives the input signal sent from the control unit in the nursing assistance system 100.

At step 2412, the posture management system including the seating assembly 116 is actuated using the input signal from the control unit to facilitate the patient to assume a predefined position to facilitate excretion of the excreta onto the disposable fleece unit 118 in the nursing assistance system 100. At step 2414, the excreta may be removed from the nursing assistance system 100 using the disposable fleece unit 118. At step 2416, the pelvic region and the perineal region of the patient and portions of the seating assembly 116 are cleaned using the cleaning bud along with the cleaning unit 114 in the nursing assistance system 100 to remove any excess excreta stuck on the pelvic region and the perineal region of the patient. At step 2418, the excreta and the cleaning bud are disposed into the fleece bag 1112 from the disposable fleece unit 118.

Figure 25 illustrates a method 2500 for the detection of excreta using the nursing assistance system 100, in accordance with an embodiment of the present disclosure. The method 2500 may be used to detect the presence of excreta excreted by the patient in the nursing assistance system 100. Prior to the beginning of the method 2500, a set of video frames 'F' may be received by the firmware. Thereafter, the method begins at step 2502, at which the excreta detection camera 128 is activated. At step 2504, a video feed with video frames 'F' of the patient in the nursing assistance system 100 is captured using the excreta detection camera 128. At step 2506, the presence of excreta is detected using the video feed of the patient in the nursing assistance system 100. The excreta detection is primarily achieved using the excreta detection camera 128. At step 2507, another 200 frames are checked to confirm the detection of excreta. At step 2508, a notification is generated for the detection of the excreta in the video feed of the patient using a predefined number of frames from the video feed. At step 2510, in case the excreta is not detected at step 2506, the posture detection camera 126 is activated for improved analysis. The excreta may not be detected if the excreta is outside the field of view of the excreta detection camera 128 or if the excreta detection camera 128 is blocked by an obstacle. In such a case, the excreta is detected with the help of videos taken by the posture detection camera 126 at step 2511. In another embodiment, the excreta detection camera 128 can move from its position to a different position using linear actuators to avoid obstacles and to have clear visibility.

At step 2512, the video feed from the posture detection camera 126 is captured, and at step 2513, another 200 frames are checked to confirm the detection of excreta. At step 2514, the presence of excreta is detected using the video feed received from the posture detection camera 126. At step 2516, the excreta detection camera 128 is re-activated after lowering the central functioning unit 110 by 75 mm for better visibility, a secondary check is performed, and a second video feed is captured from the excreta detection camera 128. At step 2518, excreta detection is performed for another 200 frames for confirmation. Finally, at step 2520, a notification is generated for the detection of the excreta in the video feed of the patient using a predefined number of frames from the video feed.
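
A minimal sketch of the 200-frame confirmation step described above is given below, assuming OpenCV for video capture. The per-frame predictor detect_excreta, the camera index, and the agreement threshold are hypothetical placeholders, not details from the specification.

```python
# Illustrative sketch of the 200-frame confirmation check, assuming OpenCV.
import cv2

def detect_excreta(frame) -> bool:
    """Hypothetical stand-in for the trained AI model's per-frame prediction."""
    return False  # the real system would run its detection network here

def confirm_detection(camera_index: int = 0, n_frames: int = 200,
                      min_hits: int = 150) -> bool:
    """Re-check a tentative detection over n_frames before notifying."""
    cap = cv2.VideoCapture(camera_index)
    hits = 0
    for _ in range(n_frames):
        ok, frame = cap.read()
        if not ok:
            break  # camera feed interrupted or blocked
        if detect_excreta(frame):
            hits += 1
    cap.release()
    return hits >= min_hits  # confirmed only if most frames agree
```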

Figure 26 illustrates a method 2600 for posture detection using the nursing assistance system 100, in accordance with an embodiment of the present disclosure. The method 2600 may be used to detect the posture of the patient while the patient is in the nursing assistance system 100. At step 2602, the method 2600 begins with setting a time duration t and starting posture detection. At step 2604, the posture detection begins, and at step 2606, the posture detection camera 126 is activated to capture the posture of the patient in the nursing assistance system 100. At step 2608, the activation of the posture detection camera 126 is confirmed. In case the posture detection camera 126 is not activated, the method 2600 concludes. In case the posture detection camera 126 is activated, the posture detection is performed for t minutes at step 2610. Thereafter, at step 2612, whether the position is supine or non-supine is determined. At step 2614, a notification is generated to display a position status of the patient, wherein, for the non-supine position, the posture detection is re-initiated after a predefined delay or the posture is corrected using the posture management system. The predefined delay in re-initiation of posture detection may be a time span of 2 minutes.
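
As a rough sketch of this loop, the helpers capture_frames and classify_posture below are hypothetical stand-ins for the camera feed and the AI posture classifier; only the t-minute detection window and the 2-minute re-initiation delay come from the description above.

```python
# Illustrative posture-monitoring loop; capture_frames and classify_posture
# are injected hypothetical helpers, not names from the specification.
import time

def monitor_posture(capture_frames, classify_posture, t_minutes: float) -> str:
    while True:
        frames = capture_frames(duration_s=t_minutes * 60)  # detect for t minutes
        position = classify_posture(frames)                 # "supine" or "non-supine"
        print(f"Position status: {position}")               # notification to caretaker
        if position == "supine":
            return position
        time.sleep(120)  # re-initiate after the predefined 2-minute delay
```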

Figure 27 illustrates a method 2700 for position detection using the nursing assistance system 100, in accordance with an embodiment of the present disclosure. The method 2700 may be used to detect the position of the patient while the patient is in the nursing assistance system 100. At step 2702, the method 2700 begins with an activation of the posture detection camera 126, and at step 2703, an input video feed of the patient is received. At step 2704, the position of the patient is determined using the video feed by referencing the frames against a predefined imaging grid. In one example, two imaginary lines are drawn as a reference at the edge of the bed using OpenCV. Thereafter, at step 2705, the centre points of the lines, (x, y) and (u, v), are extracted. At step 2706, coordinates for one or more features of the body of the patient are extracted using the predefined imaging grid. For example, the points [left hip (x1, y1), right hip (x2, y2), left knee (x3, y3), right knee (x4, y4)] are extracted. At step 2708, the distance and the position of the body parts of the patient are calculated from the extracted coordinates on each side of the patient using at least one of a Euclidean distance calculation or a Manhattan distance calculation. The extracted coordinates can be intergluteal coordinates of the buttock. In one example, at step 2708, distance calculations of d1 and d2 corresponding to a right side distance, i.e., the distance of the right hip from a centre, may be determined using the formulas d1 = euclideandist(x, x1, y, y1) and d2 = euclideandist(x, x2, y, y2). Thereafter, the mean of (d1, d2) may be determined. Based on the calculation, the right side distance may be determined at step 2710 and stored in mm as RSD at step 2712. Similarly, at step 2716, distance calculations of d3 and d4 corresponding to a left side distance, i.e., the distance of the left hip from the centre, may be determined using the formulas d3 = euclideandist(x, x3, y, y3) and d4 = euclideandist(x, x4, y, y4). Thereafter, the mean of (d3, d4) may be determined. Based on the calculation, the left side distance may be determined at step 2718 and stored in mm as LSD at step 2720.
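
The distance calculations at steps 2708 and 2716 can be sketched as below, following the point labels and the euclideandist argument order given above; the coordinate values in the usage line are placeholders only.

```python
# Sketch of the step-2708/2716 calculations using the labels from the
# description: centre (x, y), left hip (x1, y1), right hip (x2, y2),
# left knee (x3, y3), right knee (x4, y4).
from math import hypot

def euclideandist(x, x1, y, y1) -> float:
    """Euclidean distance between (x, y) and (x1, y1), per the argument order above."""
    return hypot(x - x1, y - y1)

def side_distances(centre, left_hip, right_hip, left_knee, right_knee):
    x, y = centre
    d1 = euclideandist(x, left_hip[0], y, left_hip[1])    # d1 uses (x1, y1)
    d2 = euclideandist(x, right_hip[0], y, right_hip[1])  # d2 uses (x2, y2)
    rsd = (d1 + d2) / 2   # right side distance, RSD, stored in mm
    d3 = euclideandist(x, left_knee[0], y, left_knee[1])  # d3 uses (x3, y3)
    d4 = euclideandist(x, right_knee[0], y, right_knee[1])  # d4 uses (x4, y4)
    lsd = (d3 + d4) / 2   # left side distance, LSD, stored in mm
    return rsd, lsd

# Placeholder coordinates, for illustration only:
rsd, lsd = side_distances((0, 0), (120, 40), (160, 40), (120, 260), (160, 260))
```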

Figure 28 illustrates a method 2800 for the detection of anal depth using the nursing assistance system 100, in accordance with an embodiment of the present disclosure. The method 2800 may be used to detect the anal depth of the patient, especially for detecting the area in the pelvic region and perineal region where the excreta of the patient may be attached. At step 2801, a command for anal position detection and tracking is received from a master controller (part of the control unit). At step 2802, the method 2800 begins with an activation of the location detection camera 130 based on an input received from the control unit. At step 2803, the activation of the location detection camera 130 is checked. In case the location detection camera 130 is not detected, an error signal is sent to the master controller. In case the location detection camera 130 is detected, the same is notified to the master controller. At step 2804, a video feed of the patient in the nursing assistance system 100 is captured for a time span of t minutes using the location detection camera 130. At step 2806, the captured video feed from the location detection camera 130 is converted into frames. At step 2808, multiple coordinates of the pelvic region and the perineal region of the patient are created using one or more image contours, image segmentation techniques, and bounding box techniques. In one example, the AI training model may be implemented to determine and track the coordinates of the pelvic region and the perineal region of the patient. At step 2810, the tracked coordinates of the pelvic region and the perineal region of the patient are saved, using an image segmentation technique, in a format such as a .csv file readable by the cleaning arm 2102. At step 2812, the cleaning unit 114 is actuated to move to the tracked coordinates for cleaning the pelvic region and the perineal region of the patient, and at step 2813, the command "Robotic arm reached to default position" is received. At step 2814, a notification is generated to display the cleaning status of the patient, which may be repeated until the patient and the seating assembly 116 are completely clean. At step 2816, the location detection camera 130 is deactivated after cleaning has been completed.
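
Step 2810's hand-off to the cleaning arm 2102 might look like the following sketch. Only the .csv format itself is stated above; the column names and file path are assumptions for illustration.

```python
# Illustrative sketch of step 2810: persisting tracked coordinates in a .csv
# file the cleaning arm can read. Column names and path are assumptions.
import csv

def save_tracked_coordinates(points, path: str = "cleaning_targets.csv") -> None:
    """points: iterable of (x, y, depth) tuples from the RGB-D location camera."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x_mm", "y_mm", "depth_mm"])  # header for the arm's reader
        writer.writerows(points)
```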

The operational details of the cleaning arm 2102 are now described with respect to Figures 29 and 30. Specifically, Figure 29 illustrates a method 2900 for initiating the operation of the cleaning arm 2102, according to an embodiment of the present disclosure. The method 2900 begins at step 2902, at which the nursing assistance system 100 is activated. Thereafter, at step 2904, the master controller is powered ON. The master controller is powered ON before the patient is onboarded, as mentioned in step 2906. Once the master controller is powered ON, the master controller initiates the power-on self-test (POST) at step 2908. At step 2910, the cleaning arm 2102 performs the POST. At step 2912, the master controller checks if the POST for the cleaning arm 2102 is successful. In case the POST is unsuccessful, the method proceeds to step 2914. On the other hand, in case the POST is successful, at step 2916, the cleaning arm 2102 moves to its home position and stays there, and finally, at step 2918, the patient is onboarded. Further, subsequent to both steps 2916 and 2918, the master controller is notified at step 2920 about the successful POST of the cleaning arm 2102.
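
A minimal sketch of this POST gate follows; the individual check functions are hypothetical stand-ins for the cleaning arm's firmware self-tests, which the specification does not enumerate.

```python
# Illustrative POST gate; the checks are hypothetical stand-ins for the
# cleaning arm's firmware self-tests.
def run_post(checks) -> bool:
    """checks: iterable of (name, callable) pairs; report the first failure."""
    for name, check in checks:
        if not check():
            print(f"POST failed: {name}")  # reported back to the master controller
            return False
    print("POST successful; cleaning arm moving to home position")
    return True

# Example usage with placeholder checks:
run_post([("joint encoders", lambda: True), ("motor drivers", lambda: True)])
```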

Figure 30 illustrates a method 3000 for actuating the cleaning arm 2102, according to an embodiment of the present disclosure. The method begins at step 3002, at which the excreta is detected using the AI training model. In one example, step 3002 may correspond to step 2513 in Figure 25. The same is communicated by the master controller at step 3004. Further, at step 3006, the master controller checks whether the pre-requisites P1, P2, and P3 of the cleaning arm 2102 are met. P1 may correspond to the bed being set at its maximum height, whereas P2 may correspond to the backrest angle of the first seating portion 102 being set at a minimum angle of 30 degrees. Further, P3 corresponds to the condition where the adjustments to the bed height and the backrest angle are disabled from both the HMI and the wired handset. In case the conditions are not met, at step 3008, the cleaning arm does not move. In case the conditions P1, P2, and P3 are met, the interlocking for all robotic operations is achieved. The method proceeds to step 3010, at which the calibration of the cleaning arm 2102 is performed. Thereafter, at step 3012, successful calibration is checked. In case the calibration is not OK, at step 3014, the master controller receives an error message. In case the calibration is OK, the method 3000 proceeds to step 3016, and the cleaning arm 2102 is moved to its home position.
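The pre-requisite check at step 3006 reduces to a conjunction of the three conditions. Below is a minimal sketch of that check, assuming illustrative state fields that are not named in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class BedState:
    bed_at_max_height: bool       # P1: bed set to its maximum height
    backrest_angle_deg: float     # P2: backrest angle of the first seating portion
    adjustments_disabled: bool    # P3: HMI and wired-handset adjustments locked out

def prerequisites_met(state: BedState) -> bool:
    p1 = state.bed_at_max_height
    p2 = state.backrest_angle_deg >= 30.0   # minimum 30-degree backrest angle
    p3 = state.adjustments_disabled
    return p1 and p2 and p3                 # interlock achieved only when all hold

# Example: interlock not achieved because the backrest angle is below 30 degrees.
print(prerequisites_met(BedState(True, 20.0, True)))  # False -> arm does not move
```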

At step 3018, the cleaning arm 2102 picks the cleaning bud. As a part of picking the cleaning bud, in one example, the cleaning bud is picked from the bud cartridge. In addition, the available bud count is read from a yml file, and the bud pick coordinates are derived from the available bud count. Upon reading the coordinates, the bud is picked and, finally, the cleaning arm 2102 moves to the park position. At step 3020, it is checked whether the bud has been picked properly. In case the bud is not picked properly, an error message is sent to the master controller at step 3022. In case the bud is picked properly, the method proceeds to step 3024, at which the bud count is updated, and at step 3026, the cleaning arm 2102 waits until a confirmation of excreta removal completion is received from the master controller, which may receive the information from step 2414 in Figure 24. At step 3028, upon receipt of the confirmation from the master controller, the cleaning arm 2102 begins the cleaning at step 3030. As a part of cleaning, the cleaning arm 2102 performs the camera-to-robot transformation. Further, the control unit of the cleaning arm 2102 performs the approach and tilt angle calculations and actuates the cleaning arm 2102 to perform the cleaning using the bud. Further, to perform the aforementioned actions, the cleaning arm 2102 interlocks upon satisfaction of pre-requisites P4 to P6. P4 corresponds to receiving the intergluteal coordinates from the master controller, P5 corresponds to moving the fleece bag 1112 down by 300 mm, and P6 corresponds to the patient resting in a zero-gravity position. Once the cleaning is done, the cleaning arm 2102 drops the bud into the fleece bag 1112. Following the cleaning, the steps from step 3018 onwards are repeated twice to ensure complete excreta removal. Further, at step 3032, the cleaning arm 2102 moves to the home position. Thereafter, at step 3034, the interlock is released and the same is communicated to the master controller.
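For illustration, the bud-count bookkeeping at steps 3018 and 3024 may be sketched as follows, assuming a YAML file of the shape shown; the file name, keys, and the slot-coordinate mapping are assumptions and not taken from the disclosure.

```python
import yaml  # PyYAML

BUDS_FILE = "bud_cartridge.yml"  # hypothetical file, e.g. {"available_buds": 10}

# Hypothetical mapping from the remaining-bud count to a cartridge slot position,
# standing in for "pick coordinates derived from the available bud count".
PICK_COORDS = {count: (100 + 10 * count, 50) for count in range(1, 11)}

def pick_bud():
    with open(BUDS_FILE) as f:
        state = yaml.safe_load(f)          # read the available bud count (step 3018)
    count = state["available_buds"]
    if count <= 0:
        raise RuntimeError("bud cartridge empty")  # error path to the master controller
    x, y = PICK_COORDS[count]              # pick coordinates for this count
    # ... actuate the arm to (x, y), grip the bud, move to the park position ...
    state["available_buds"] = count - 1
    with open(BUDS_FILE, "w") as f:
        yaml.safe_dump(state, f)           # update the bud count (step 3024)
    return x, y
```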

The present disclosure provides various technical advancements, such as reducing the requirement for manual labour of a nurse/caretaker. Further, the disclosure provides odor management and ventilation with the use of filters and exhaust fans, which prevents any infection from spreading within the nursing assistance system 100. The posture management system of the disclosure prevents bed sores from developing on the patient. Moreover, the seating assembly 116 facilitates excretion by positioning the patient in a predefined position, and if excreta is detected, it moves the patient into a zero-gravity position; in case of spinal injury, or as per user convenience, this step may be avoided. Furthermore, the use of AI-ML facilitates remote monitoring of the patient along with analysis of the excreta in case any abnormality may arise.

While specific language has been used to describe the present subject matter, any limitations arising on account thereof are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.

CLAIMS:

WE CLAIM:

1. A nursing assistance system (100), comprising:
a seating assembly (116) adapted to hold and move a patient in a predefined position during excretion and cleaning of the patient and assist in posture management;
a detection unit (112) in communication with the seating assembly (116) and adapted to detect residual excreta remains attached on one or more body parts of the patient post the excretion, one or more portions of the seating assembly (116), or a combination thereof, using an Artificial Intelligence (AI) model, and to determine corresponding positions of the body parts and portions of the seating assembly (116) having the excreta; and
a cleaning unit (114) in communication with the detection unit (112) and adapted to clean the excreta stuck on the one or more body parts and the one or more portions of the seating assembly (116), based on the detected positions of the body parts, and the portions of the seating assembly (116).

2. The nursing assistance system (100) as claimed in claim 1, comprising a disposable fleece unit (118) having a plurality of fleece sheets (1102) made of one of a biodegradable material, a plastic, a polyurethane (PU), a cotton, or a combination thereof, and adapted to collect and contain the excreta of the patient to facilitate the cleaning of the patient.

3. The nursing assistance system (100) as claimed in claim 1, wherein the seating assembly (116) comprises:

a first seating portion (102) adapted to tilt to move an upper limb of the patient;
a headboard (104) positioned adjacent to the first seating portion (102);
a second seating portion (106) positioned opposite to the first seating portion (102); and
a leg board (108) positioned adjacent to the second seating portion (106) and, the leg board (108) comprising an odor management system (120) adapted to facilitate ventilation during the excretion and cleaning.

4. The nursing assistance system (100) as claimed in claim 1, wherein the nursing assistance system (100) comprises a top cover (124) disposed over the second seating portion (106) and adapted to cover the lower body of the patient and assist the odor management system (120).

5. The nursing assistance system (100) as claimed in claim 1, wherein the cleaning unit (114) is adapted to clean the excreta from the body parts of the patient while the patient is held in the seating assembly (116) and comprises:

a base (2104);
a cleaning arm (2102) mounted on the base (2104) and adapted to move in a predefined motion based on the detected positions of the body parts and the portions of the seating assembly (116) having the excreta; and
a bud holder (2106) movably mounted on the cleaning arm (2102) and adapted to hold a cleaning bud adapted to clean the body parts having the excreta.

6. The nursing assistance system (100) as claimed in claim 5, wherein the disposable fleece unit (118) is adapted to prevent spillage of the excreta and assist in the containment of the excreta.

7. The nursing assistance system (100) as claimed in claim 1, wherein the detection unit (112) comprises:

a control unit communicatively coupled to the cleaning unit (114) and the seating assembly (116) and adapted to control the functioning of the nursing assistance system (100); and
a disposable fleece unit (118) adapted to contain and pack excreta of the patient to facilitate the cleaning of the patient.

8. The nursing assistance system (100) as claimed in claim 1, wherein the detection unit (112) comprises a plurality of sensors configured to perform at least one of:
detecting one or more parameters associated with one or more of the patient, and the nursing assistance system (100); and
capturing one or more images of a pelvic region and a perineal region of the patient.

9. The nursing assistance system (100) as claimed in claim 8, wherein the one or more parameters associated with the patient include at least one of a posture, a position, a presence of excreta, blood, and an anal depth of the patient, while in the seating assembly (116).

10. The nursing assistance system (100) as claimed in claim 8, wherein one of a posture detection camera (126), an excreta detection camera (128), and a location detection camera (130) is adapted for capturing one or more images of the body positions, the pelvic region, and the perineal region of the patient.

11. The nursing assistance system (100) as claimed in claim 10, wherein the posture detection camera (126) is a red, green, blue infra-red (RGB-IR) camera, the excreta detection camera (128) is an RGB camera/ thermal camera, and the location detection camera (130) is an RGB-depth (RGB-D) camera.

12. The nursing assistance system (100) as claimed in claim 8, wherein the plurality of sensors comprise one or more of proximity sensors, fluid detection sensors, image processing sensors, motion sensors, pressure detection sensors, load cells, and temperature sensors.

13. The nursing assistance system (100) as claimed in claim 8, wherein the control unit is configured to:
receive at least one of the detected one or more parameters associated with a patient and the nursing assistance system (100), and the captured images of the posture, body positions, pelvic region and the perineal region of the patient;
determine a predefined event upon analyzing at least one of the received parameters and the received images; and
generate an input signal based on the determined predefined event.

14. The nursing assistance system (100) as claimed in claim 13, wherein the predefined event includes at least one of excretion, menstruation, and urination by the patient, and the predefined event is determined by analyzing at least one of the received parameters and the received images using AI-ML techniques.

15. The nursing assistance system (100) as claimed in claim 2, wherein the disposable fleece unit (118) comprises:

a fleece sheet cartridge (1104) adapted to supply a fleece sheet (1102) to hold the excreta on the nursing assistance system (100) and urine from the patient or from a urine bag adapted to collect urine using a catheter;
a fleece sealing assembly (1108) adapted to seal the fleece sheet cartridge (1104) to form a fleece bag (1112) used to hold the excreta and control odor; and
a fleece-cutting unit (1110) adapted to separate the fleece sheet (1102) from the fleece sheet cartridge (1104) and seal a top edge of the fleece sheet (1102) formed by the fleece sealing assembly (1108).

16. The nursing assistance system (100) as claimed in claim 15, wherein one of the cleaning unit (114) or the disposable fleece unit (118) collects the excreta, the urine, and blood samples in the fleece bag (1112) for diagnosis and assists in remote patient monitoring.

17. A method (2400) for detection and cleaning of excreta using a nursing assistance system (100), the method comprising:
receiving (2402) one or more images of a pelvic region and a perineal region of a patient captured using a detection unit (112) associated with the nursing assistance system (100);
detecting (2404) at least one of one or more parameters associated with the patient, the nursing assistance system (100), or a combination thereof;
determining (2406) a predefined event by analyzing at least one of the detected parameters and the received images using a control unit communicatively coupled to the nursing assistance system (100);
generating (2408) an input signal based on the determined predefined event using the control unit;
receiving (2410) the input signal from the control unit to a seating assembly (116) in the nursing assistance system (100);
actuating (2412) a posture management system including the seating assembly (116) using the input signal from the control unit to facilitate the patient in assuming a predefined position for excretion of the excreta onto a disposable fleece unit (118) in the nursing assistance system (100);
removing (2414) the excreta from the nursing assistance system (100) using the disposable fleece unit (118);
cleaning (2416) the pelvic region and the perineal region of the patient and portions of the seating assembly (116) using a cleaning bud along with a cleaning unit (114) in the nursing assistance system (100) to remove any excess excreta stuck on the pelvic region and the perineal region of the patient; and
disposing (2418), the excreta and the cleaning bud using a fleece bag (1112) from the disposable fleece unit (118).

18. The method (2400) as claimed in claim 17, wherein the one or more parameters associated with the patient include at least one of a posture, a position, an excreta, and an anal depth of the patient.

19. The method (2400) as claimed in claim 18, wherein the excreta detection (2500) comprises:

capturing (2504), using an excreta detection camera (128), a video feed of the patient in the nursing assistance system (100);
detecting (2506) presence of the excreta from the video feed of the patient;
generating (2508) a notification for detection of the excreta in the video feed of the patient using a predefined number of frames from the video feed; and
activating (2510) a posture detection camera (126) when excreta is not detected using the excreta detection camera (128) for an improved analysis.

20. The method (2400) as claimed in claim 18, wherein detecting (2600) at least one of one or more parameters comprises:
activating (2604) the posture detection camera (126) to capture a posture of the patient in the nursing assistance system (100);
detecting (2606) the posture of the patient as one of a supine position or a non-supine position;
generating (2608) a notification to display a position status of the patient, wherein, for the non-supine position, the posture detection is re-initiated after a predefined delay or posture is corrected using a posture management system.

21. The method (2400) as claimed in claim 20, wherein the predefined delay in re-initiation of posture detection is a time span of 2 minutes.

22. The method (2400) as claimed in claim 18, wherein detecting (2700) at least one of one or more parameters comprises:
activating (2702) the posture detection camera (126) to capture a video feed of the patient in the nursing assistance system (100);
determining (2704) a position of the patient using the video feed by referencing the frames using a predefined imaging grid;
extracting (2706) coordinates for one or more features of the pelvic region and the perineal region of the patient; and
calculating (2708) a distance and a position of the body parts of the patient using the extracted coordinates from each side.

23. The method (2400) as claimed in claim 22, wherein the distance is calculated using at least one of a Euclidean distance calculation and a Manhattan distance calculation.

24. The method (2400) as claimed in claim 18, wherein a detection of anal depth (2800) comprises:
capturing (2804), using a location detection camera (130), a video feed of the patient in the nursing assistance system (100);
tracking (2808) coordinates of the pelvic region and the perineal region of the patient using one or more of image contours, an image segmentation technique, and a bounding box technique;
saving (2810) the tracked coordinates of the pelvic region and the perineal region of the patient using an image segmentation technique;
actuating (2812) the cleaning unit (114) for cleaning the pelvic region and the perineal region of the patient;
generating (2814) a notification to display a cleaning status of the patient; and
deactivating (2816) the location detection camera (130).

Documents

Application Documents

# Name Date
1 202341034915-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [18-05-2023(online)].pdf 2023-05-18
2 202341034915-STATEMENT OF UNDERTAKING (FORM 3) [18-05-2023(online)].pdf 2023-05-18
3 202341034915-PROVISIONAL SPECIFICATION [18-05-2023(online)].pdf 2023-05-18
4 202341034915-OTHERS [18-05-2023(online)].pdf 2023-05-18
5 202341034915-FORM FOR STARTUP [18-05-2023(online)].pdf 2023-05-18
6 202341034915-FORM FOR SMALL ENTITY(FORM-28) [18-05-2023(online)].pdf 2023-05-18
7 202341034915-FORM 1 [18-05-2023(online)].pdf 2023-05-18
8 202341034915-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [18-05-2023(online)].pdf 2023-05-18
9 202341034915-EVIDENCE FOR REGISTRATION UNDER SSI [18-05-2023(online)].pdf 2023-05-18
10 202341034915-DRAWINGS [18-05-2023(online)].pdf 2023-05-18
11 202341034915-DECLARATION OF INVENTORSHIP (FORM 5) [18-05-2023(online)].pdf 2023-05-18
12 202341034915-Proof of Right [04-08-2023(online)].pdf 2023-08-04
13 202341034915-FORM-26 [04-08-2023(online)].pdf 2023-08-04
14 202341034915-STARTUP [20-05-2024(online)].pdf 2024-05-20
15 202341034915-FORM28 [20-05-2024(online)].pdf 2024-05-20
16 202341034915-FORM-9 [20-05-2024(online)].pdf 2024-05-20
17 202341034915-FORM 18A [20-05-2024(online)].pdf 2024-05-20
18 202341034915-ENDORSEMENT BY INVENTORS [20-05-2024(online)].pdf 2024-05-20
19 202341034915-DRAWING [20-05-2024(online)].pdf 2024-05-20
20 202341034915-CORRESPONDENCE-OTHERS [20-05-2024(online)].pdf 2024-05-20
21 202341034915-COMPLETE SPECIFICATION [20-05-2024(online)].pdf 2024-05-20
22 202341034915-Request Letter-Correspondence [17-06-2024(online)].pdf 2024-06-17
23 202341034915-FORM28 [17-06-2024(online)].pdf 2024-06-17
24 202341034915-Covering Letter [17-06-2024(online)].pdf 2024-06-17
25 202341034915-FORM 3 [28-06-2024(online)].pdf 2024-06-28
26 202341034915-FORM 3 [13-01-2025(online)].pdf 2025-01-13