
Systems And Methods For Computer Vision Based Blood Cell Counting

Abstract: The present invention relates to a system 100 and a method 400 for computer vision based blood cell counting. The system 100 includes a stained slide 102, and a computing device 104. The computing device 104 is configured to capture images of four different sections of the stained slide 102. Once the images of four different sections of the stained slide 102 are captured, the computing device 104 analyses the images of four different sections of the stained slide 102 using artificial intelligence algorithm to identify a plurality of blood components. The computing device 104 further analyses the plurality of blood components identified in the images of four different sections of the stained slide 102 based on predefined parameters. The computing device 104 further determines the counts of each blood component in the plurality of blood components. The computing device 104 then displays the counts of each blood component of the plurality of blood components instantly on a display unit of the computing device 104.


Patent Information

Application #
201911034107
Filing Date
23 August 2019
Publication Number
35/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
iprdocketing@sagaciousresearch.com
Parent Application

Applicants

SEVAMOB VENTURES LIMITED
511, Rajendra Nagar, Lucknow-226004, Uttar Pradesh, India (IN)

Inventors

1. SAXENA, Shelley
159 Ridley Circle, Decatur, GA, 30030, USA

Specification

[001] Various embodiments of the disclosure relate to automated blood cell counting; more specifically, to a system for computer vision based blood cell counting. Furthermore, the present invention relates to a method of computer vision based blood cell counting.
BACKGROUND OF THE INVENTION
[002] Blood tests are prescribed to diagnose certain diseases and conditions of an individual. Generally, a blood test involves manual analysis of blood samples collected from an individual. Typically, such blood samples are analysed at pathology centres to determine the red blood cell, white blood cell and platelet counts in the blood samples.
[003] However, such conventional blood cell counting involves various problems. For example, the counting is performed manually, i.e. the pathologist manually counts the components, namely the red blood cells, white blood cells and platelets, on the blood-stained glass slide. Such manual counting is cumbersome, inefficient and inherently error-prone due to human intervention. Furthermore, owing to the human intervention, conventional blood cell counting typically requires more than a day, which in some events could prove fatal for the individual being diagnosed.
[004] Additionally, a few blood cell counting techniques or configurations known in the art may count blood cells more promptly than manual counting. However, such techniques or configurations also have various limitations. For example, they are generally deployed as large installations, i.e. they are implemented in large medical or industrial facilities. Therefore, such techniques or configurations may not be readily accessible to an individual at all times. Moreover, the installation and implementation of such techniques or configurations for blood cell counting is generally expensive and impractical for various geographic locations.
[005] Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks associated with the conventional blood cell counting.
SUMMARY OF THE INVENTION
[006] The present disclosure seeks to provide a system for computer vision based blood cell counting. Furthermore, the present invention also seeks to provide a method of computer vision based blood cell counting. The present disclosure seeks to provide a solution to the existing problems associated with conventional blood cell counting. An aim of the present disclosure is to provide a solution that at least partially overcomes the problems encountered with conventional blood cell counting, and provides an improved, easy-to-use, economical and user-friendly system and method of blood cell counting.
[007] In a first aspect, an embodiment of the present disclosure provides a system for computer vision based blood cell counting. The system comprises a stained slide, wherein a drop of the patient's blood is smeared on the slide and stained with Leishman stain. The system further comprises a computing device configured to capture images of four different sections of the stained slide of the blood smear. Once the images are captured, the computing device analyses the images using an artificial intelligence algorithm to identify a plurality of blood components. The computing device further analyses the plurality of blood components identified in the images based on predefined parameters. The computing device further determines the counts of each blood component in the plurality of blood components. The computing device then displays the counts of each blood component of the plurality of blood components instantly on a display unit of the computing device.
[008] Optionally, the plurality of blood components determined in the images includes red blood cells, white blood cells and platelets.
[009] In a second aspect, an embodiment of the present disclosure provides a method of computer vision based blood cell counting. The method includes capturing images of four different sections of the stained slide of the blood smear. The method further includes analysing the images of the stained slide using an artificial intelligence algorithm to identify a plurality of blood components. The method further includes analysing the plurality of blood components identified in the images of the stained slide based on predefined parameters. The method further includes determining the counts of each blood component in the plurality of blood components. The method further includes displaying the counts of each blood component of the plurality of blood components instantly on a display unit of a computing device.
[0010] These and other objects, features and advantages of the present invention will become apparent from a review of the following drawings and detailed description of the preferred embodiments of the invention.
[0011] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to the specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
[0013] Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams, wherein:
FIG. 1(a) is a block diagram of a system for computer vision based blood cell counting, in accordance with an embodiment of the present disclosure;
FIG. 1(b) is a flowchart for determining the blood cell count, in accordance with an embodiment of the present disclosure;
FIG. 2 is a schematic illustration of an exemplary embodiment of the system of FIG.1 (a), in accordance with an embodiment of the present disclosure;
FIGs. 3A - 3D are schematic illustrations of exemplary user interfaces of a computing device, in accordance with an embodiment of the present disclosure; and
FIG. 4 is an illustration of steps of a method of computer vision based blood cell counting, in accordance with an embodiment of the present disclosure.
[0014] In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
DETAILED DESCRIPTION OF THE INVENTION
[0015] While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, and that the drawings are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims. As used throughout this description, the word "may" is used in a permissive sense (i.e. meaning having the potential to), rather than the mandatory sense (i.e. meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely for descriptive purposes and should not be construed as limiting in scope. Language such as "including", "comprising", "having", "containing", or "involving", and variations thereof, is intended to be broad and to encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles and the like is included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters formed part of the prior art base or were common general knowledge in the field relevant to the present invention.
[0016] In this disclosure, whenever a composition or an element or a group of elements is preceded by the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements, and vice versa.
[0017] The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawings correspond to like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, the embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only, and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary, and are not intended to limit the scope of the invention.
[0018] It is envisaged here that automated blood cell counting systems and methods are important for determining the counts of each blood component and for overcoming the problems associated with conventional blood cell counting. The present invention provides an improved, easy-to-use, economical and user-friendly system and method of blood cell counting.
[0019] The present invention will now be described in detail with reference to the accompanying drawings.
[0020] Referring to FIG. 1(a), shown is a block diagram of a system 100 for computer vision based blood cell counting, in accordance with an embodiment of the present disclosure. The system 100 includes a stained slide 102 and a computing device 104 for computer vision based blood cell counting. The stained slide 102 refers to a thin, flat piece of glass holding a blood smear thereon, stained in a way that allows the various blood cells to be examined microscopically. In an example, the stained slide 102 is formed by placing a drop of blood on one end of a slide, and using a spreader slide to disperse the blood along the slide's (namely, the stained slide 102) length. In such an example, the blood smear may further be stained with a chemical agent, namely Leishman stain.
[0021] The computing device 104 relates to an electronic device associated with (or used by) a user that is capable of enabling the user to perform specific tasks associated with the aforementioned system/method, namely computer vision based blood cell counting. It will be appreciated that, throughout the present disclosure, the term ‘user’ as used herein relates to any entity including a person (i.e., human being), or a virtual personal assistant (an autonomous program or a bot) using the computing device 104 as described herein.
[0022] Furthermore, the computing device 104 is intended to be broadly interpreted to include any electronic device that may be used as a standalone device for processing data (for example, image, textual, voice and the like) and communicating with the user. Examples of such devices may include, but are not limited to, cellular phones, handheld or wearable smart devices, laptop computers, personal computers, personal digital assistants (PDAs), and the like. Moreover, the computing device 104 may alternatively be referred to as a mobile station, a mobile terminal, a subscriber station, a remote station, a user terminal, a terminal, a subscriber unit, an access terminal, etc. Additionally, the computing device 104 includes a casing, a memory, a processor, a network interface card, a microphone, a speaker, a keypad, and a display.
[0023] The computing device 104 includes a user interface to enable the user to operate the computing device 104, upload captured images of the stained slide of the blood sample, view the results of the blood cell counting, and the like. Throughout the present disclosure, the term 'user interface (UI)' relates to a structured set of user interface elements rendered on a display screen. Optionally, the user interface (UI) rendered on the display screen is generated by any collection or set of instructions executable by an associated digital system. Additionally, the user interface (UI) is operable to interact with the user to convey graphical and/or textual information and receive input from the user. Specifically, the user interface (UI) used herein is a graphical user interface (GUI). Furthermore, the user interface (UI) elements refer to visual objects that have a size and position in the user interface (UI). A user interface element may be visible, though there may be times when a user interface element is hidden. A user interface control is considered to be a user interface element. Text blocks, labels, text boxes, list boxes, lines, images, windows, dialog boxes, frames, panels, menus, buttons, icons, etc. are examples of user interface elements. In addition to size and position, a user interface element may have other properties, such as a margin, spacing, or the like.
[0024] Optionally, the computing device 104 includes an imaging unit that is operated by the user via the user interface. Throughout the present disclosure, the term "imaging unit" is to be broadly interpreted as an imaging device that includes at least one lens and an image sensor to acquire visible light reflected from a planar surface, namely the stained slide 102. In an example, the imaging unit is operatively coupled with the computing device 104, such as via wired or wireless connecting means. Optionally, the imaging unit is a built-in camera within the computing device 104, such as the camera of a smartphone.
[0025] Optionally, the imaging unit is operable to capture images (described herein later) of four different sections of the stained slide 102 via the photographic objective and store the captured images in a storage component included in the electronic circuitry of the computing device 104. In an instance, the imaging unit may be peripherally connected to the computing device 104, such as a laptop computer. In such an instance, the imaging unit may be configured to transfer the images of the four different sections of the stained slide 102, via wired or wireless means, to be stored in the storage unit, such as a hard drive, of the computing device 104. In another instance, the imaging unit may be a built-in camera in a smartphone. In such an instance, the imaging unit may be configured to capture and store the images of the four different sections of the stained slide 102 in the memory of the smartphone.
[0026] Furthermore, each image refers to an optical pattern of contrast produced inside the imaging unit that has a similar appearance to an object, namely a magnified view of the stained slide 102. Specifically, each image is a digital image. Therefore, the pattern produced inside the imaging unit is a set of digital information representing an optical image of the magnified view of the stained slide 102. Optionally, the magnified view of the stained slide 102, when viewed under a magnifying arrangement, namely a microscope, is within a range of 30X to 50X of magnification at an objective of the microscope, and a range of 5X to 15X of magnification at an eyepiece of the microscope. More optionally, the magnified view of the stained slide 102, when viewed under the microscope, has 40X of magnification at the objective of the microscope, and 10X of magnification at the eyepiece of the microscope. Optionally, each image may be a two-dimensional and/or three-dimensional image.
[0027] Optionally, the user interface instructs the user to capture the images of four different sections of the stained slide 102. The instructions in the user interface instruct the user to capture images of four different sections from a tail end of the stained slide 102 at different angles, such as at 45°, 90°, 125°, and 145°. Optionally, the instructions in the user interface instruct the user to provide access to the captured images of the four different sections of the stained slide 102 to an application installed in the computing device 104.
[0028] The application installed in the computing device 104 includes any collection or set of instructions stored in the memory of the computing device 104 that are executable by the processor of the computing device 104 so as to configure the computing device 104 to perform one or more tasks that are the intent of the aforementioned system, namely computer vision based blood cell counting. Optionally, such an application refers to a software application. Additionally, such a software application is intended to encompass instructions that can be stored in a storage medium such as RAM, memory, a hard disk, an optical disk, or so forth. Furthermore, the software application may be organized in various ways; for example, the software application may include components organized as libraries, source code, interpretive code, object code, directly executable code, and so forth. It may be appreciated that the software application may invoke system-level code or calls to other software residing on a server or other location to perform certain functions. Moreover, the application may be pre-configured and pre-integrated with an operating system of the computing device 104.
[0029] Optionally, the software application includes artificial intelligence algorithms. Throughout the present disclosure, the term 'artificial intelligence (AI) algorithm' as used herein relates to any mechanism or computationally intelligent system that combines knowledge, techniques, and methodologies for controlling a bot or other element within a computing environment. Furthermore, the artificial intelligence (AI) algorithm is configured to apply knowledge, adapt itself and learn to do better in changing environments. Additionally, employing any computationally intelligent technique, the artificial intelligence (AI) algorithm is operable to adapt to unknown or changing environments for better performance. The artificial intelligence (AI) algorithm includes fuzzy logic engines, decision-making engines, pre-set targeting accuracy levels, and/or programmatically intelligent software.
[0030] Artificial intelligence (AI) algorithms in the context of the present disclosure relate to software-based algorithms that are executable upon the computing device 104 and are operable to adapt and adjust their operating parameters in an adaptive manner depending upon information that is presented to the algorithms when executed upon the computing device 104. Optionally, the artificial intelligence (AI) algorithms may include neural networks such as recurrent neural networks, recursive neural networks, feed-forward neural networks, convolutional neural networks, deep belief networks, and convolutional deep belief networks; self-organizing maps; deep Boltzmann machines; and stacked de-noising auto-encoders. Optionally, an 'artificial neural network', or simply a 'neural network', as used herein can include a highly interconnected network of processing elements, each optionally associated with a local memory.
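By way of illustration only, the following is a minimal sketch, in Python with PyTorch, of the kind of small convolutional neural network that could classify image patches of blood components into white blood cells, red blood cells, and platelets. The architecture, the 64x64 patch size, and the label mapping are assumptions made for this sketch and are not prescribed by the present disclosure.

```python
# Illustrative only: a small convolutional classifier for blood-component patches.
# The architecture, the 64x64 patch size, and the label mapping are assumptions
# made for this sketch and are not prescribed by the disclosure.
import torch
import torch.nn as nn

class BloodCellCNN(nn.Module):
    """Classifies an image patch as a white blood cell, red blood cell, or platelet."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Two 2x2 poolings reduce a 64x64 patch to 16x16 with 32 channels.
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Example: classify a batch of eight 64x64 RGB patches (random data stands in
# for patches cropped around detected contours).
patches = torch.randn(8, 3, 64, 64)
logits = BloodCellCNN()(patches)
predictions = logits.argmax(dim=1)  # hypothetical mapping: 0=WBC, 1=RBC, 2=platelet
```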
[0031] Optionally, in the present disclosure the artificial intelligence algorithms are configured to extract and process information from the images, namely the images of four different sections of the stained slide 102. Furthermore, the artificial intelligence algorithms as used herein are configured to perform object recognition. Specifically, the artificial intelligence algorithms as used herein are configured to identify one or more blood components based on the highly interconnected network of processing elements, which can be locally stored in the form of a library in a local memory.
[0032] Optionally, the artificial intelligence algorithms, specifically the neural networks therein, are created by feeding in numerous pre-labelled images of various components of blood, blood-stained slides and the like. In such an instance, the artificial intelligence algorithms are configured to process the individual pixels of an image, namely the images of four different sections of the stained slide 102, and subsequently recognize the components of blood present in the stained slide 102. In other words, the artificial intelligence algorithms perform a recognition technique referred to as 'template matching', wherein a first image referred to as a target image, namely an image of one of the four different sections of the stained slide 102, is searched for regions that are similar to a second image referred to as a reference image, namely one of the numerous pre-labelled images used in forming the neural networks included in the artificial intelligence algorithms, for identifying known blood components. Typically, the search is performed using two-dimensional correlation.
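By way of illustration only, the following sketch shows one way the 'template matching' search described above could be realized with normalized two-dimensional correlation using OpenCV's matchTemplate. The file names and the 0.8 similarity threshold are illustrative assumptions, not values given in the disclosure.

```python
# Illustrative only: template matching by normalized two-dimensional correlation.
# "slide_section_1.jpg", "wbc_reference.jpg", and the 0.8 threshold are assumed
# placeholders, not file names or values from the disclosure.
import cv2
import numpy as np

target = cv2.imread("slide_section_1.jpg", cv2.IMREAD_GRAYSCALE)    # one slide-section image
reference = cv2.imread("wbc_reference.jpg", cv2.IMREAD_GRAYSCALE)   # pre-labelled reference patch
if target is None or reference is None:
    raise FileNotFoundError("replace the placeholder file names with real images")

# Slide the reference patch over the target image and compute normalized correlation.
correlation = cv2.matchTemplate(target, reference, cv2.TM_CCOEFF_NORMED)

# Keep locations whose correlation exceeds the assumed similarity threshold.
rows, cols = np.where(correlation >= 0.8)
print(f"{len(rows)} candidate regions resemble the reference component")
```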
[0033] In an embodiment of the present disclosure, the user utilizes the computing device 104 to capture images of four different sections of the stained slide 102 and subsequently provides the captured images to the application installed in the computing device 104. Furthermore, the application uses the artificial intelligence algorithms included therein to process the provided images of the four different sections of the stained slide 102. The processing of the captured images includes a plurality of stages.
[0034] Referring to FIG. 1(b), a flowchart for determining the blood cell count is illustrated, in accordance with an embodiment of the present disclosure. In one of the plurality of stages, the artificial intelligence algorithms may be configured to filter each of the provided images based on the shapes and colour of the identified blood components. It will be appreciated that the filtering of the image based on colour may include a colour range. In other words, the colour of a blood component after staining with a chemical agent is compared with a predefined set of colours. Thereafter, each of the provided images is cleared of noise. In an example, each of the provided images may be sharpened by performing a series of erosion and dilation operations. Upon clearing the noise, the contours of each remaining shape in each of the provided images are identified and analysed. Thereafter, the identified shapes are analysed based on a set of predefined parameters.
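By way of illustration only, the filtering stage described above could be sketched as follows in Python with OpenCV: mask each section image by a stain-colour range, clear noise with a series of erosion and dilation operations, and identify the contours of the remaining shapes. The HSV colour bounds, the 3x3 kernel, and the file name are assumptions for illustration, not parameters disclosed herein.

```python
# Illustrative only: colour filtering, noise clearing by erosion/dilation, and
# contour extraction for one section image. The HSV bounds and the 3x3 kernel
# are assumed values chosen for this sketch.
import cv2
import numpy as np

image = cv2.imread("slide_section_1.jpg")      # placeholder file name
if image is None:
    raise FileNotFoundError("replace the placeholder file name with a real section image")
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

# Filter by colour: keep pixels whose hue falls within the assumed stain colour range.
mask = cv2.inRange(hsv, np.array([110, 40, 40]), np.array([170, 255, 255]))

# Clear noise with a series of erosion and dilation operations.
kernel = np.ones((3, 3), np.uint8)
mask = cv2.erode(mask, kernel, iterations=2)
mask = cv2.dilate(mask, kernel, iterations=2)

# Identify the contours of each remaining shape for analysis against the predefined parameters.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print(f"{len(contours)} candidate shapes found in this section")
```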
[0035] A size of X micrometres may be considered as the predefined threshold size of any identified blood component. If the size of the shape of an identified blood component is less than X, then the artificial intelligence algorithm is configured not to consider it as a specific type of blood component, namely a white blood cell. If the size of the shape of an identified blood component is more than X, then the artificial intelligence algorithm is configured to consider the identified blood component as a specific type of blood component, namely a white blood cell. In this manner, the images of the four different sections of the stained slide 102 are analysed, the count of white blood cells in each of the images is considered, and a numerical factor is used to determine an average of the white blood cells. Thereafter, this count is provided as the final count of white blood cells in the stained slide 102.
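By way of illustration only, the white blood cell rule above could be sketched as follows: a contour is counted as a white blood cell only if its size exceeds the threshold X, and the counts from the four section images are combined using a numerical factor. The values of X, the micron-per-pixel calibration, and the numerical factor are placeholders, not values fixed by the disclosure.

```python
# Illustrative only: counting white blood cells with a size threshold X and
# combining the four section counts with a numerical factor. X_MICROMETRE,
# MICRONS_PER_PIXEL, and NUMERICAL_FACTOR are placeholder values.
import cv2

X_MICROMETRE = 10.0        # assumed WBC size threshold (the "X" of the description)
MICRONS_PER_PIXEL = 0.25   # assumed calibration of the microscope/camera arrangement
NUMERICAL_FACTOR = 1.0     # assumed factor used when averaging the four sections

def count_wbc(contours) -> int:
    """Count contours whose equivalent diameter exceeds the threshold X."""
    count = 0
    for contour in contours:
        (_, _), radius_px = cv2.minEnclosingCircle(contour)
        if 2 * radius_px * MICRONS_PER_PIXEL > X_MICROMETRE:
            count += 1
    return count

def final_wbc_count(per_section_contours) -> float:
    """Average the per-section WBC counts and apply the numerical factor."""
    counts = [count_wbc(contours) for contours in per_section_contours]
    return NUMERICAL_FACTOR * sum(counts) / len(counts)
```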
[0036] Each of the images of the four different sections of the stained slide 102 is then converted into a different image format; in other words, each of the images of the four different sections of the stained slide 102 is converted to grey scale. Thereafter, the grey-scaled images are processed to remove further noise; in other words, each of the images of the four different sections of the stained slide 102 is blurred. Upon clearing the noise, the contours of each remaining shape of an identified blood component in each of the grey-scaled images of the four different sections of the stained slide 102 are considered.
A radius of Y micrometres may be considered as the predefined radius threshold for a circular shape of any identified blood component. If the radius of an identified blood component is more than Y and the shape of the said blood component is roughly circular with high solidity, then the artificial intelligence algorithm is configured to consider the identified blood component as a specific type of blood component, namely a red blood cell. In this manner, each of the provided images is analysed and the count of the red blood cells in each of the images is considered. Furthermore, the count of the red blood cells is adjusted by subtracting the count of white blood cells in the slide, and a numerical factor is used to determine an average of the red blood cells. Thereafter, this count is provided as the final count of red blood cells in the stained slide 102. If the radius of the identified blood component is more than Y, and the shape of the identified blood component is not roughly circular, having low solidity, then the artificial intelligence algorithm is configured to consider the identified blood component as a specific type of blood component, namely a platelet. Furthermore, the count of the platelets is adjusted by subtracting the count of white blood cells and red blood cells in the slide. In this manner, each of the provided images of the four different sections of the slide is analysed, the count of the platelets in each of the images is considered, and a numerical factor is used to determine an average of the platelets. Thereafter, this count is provided as the final count of platelets in the stained slide 102.
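By way of illustration only, the red blood cell and platelet rules above could be sketched as follows: each grey-scaled, blurred section image is thresholded, contours are extracted, and a contour whose radius exceeds Y is classified as a red blood cell when it is roughly circular with high solidity, or as a platelet otherwise. The values of Y, the solidity cut-off, the calibration, and the Otsu thresholding step are assumptions made for this sketch.

```python
# Illustrative only: grey-scale, blur, threshold, and classify contours as red
# blood cells (radius > Y, roughly circular, high solidity) or platelets
# (radius > Y, low solidity). Y_MICROMETRE, SOLIDITY_CUTOFF, and the calibration
# are placeholder values chosen for this sketch.
import cv2

Y_MICROMETRE = 3.0         # assumed radius threshold (the "Y" of the description)
SOLIDITY_CUTOFF = 0.9      # assumed boundary between high and low solidity
MICRONS_PER_PIXEL = 0.25   # assumed calibration

def classify_rbc_and_platelets(image_bgr):
    """Return (rbc_count, platelet_count) for one section image."""
    grey = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(grey, (5, 5), 0)
    _, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    rbc, platelets = 0, 0
    for contour in contours:
        (_, _), radius_px = cv2.minEnclosingCircle(contour)
        hull_area = cv2.contourArea(cv2.convexHull(contour))
        solidity = cv2.contourArea(contour) / hull_area if hull_area > 0 else 0.0
        if radius_px * MICRONS_PER_PIXEL > Y_MICROMETRE:
            if solidity >= SOLIDITY_CUTOFF:
                rbc += 1           # roughly circular with high solidity: red blood cell
            else:
                platelets += 1     # not roughly circular, low solidity: platelet
    return rbc, platelets
```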
[0037] In such an embodiment, upon determining the counts of the blood components, namely the white blood cells, the red blood cells, and the platelets, the counts are displayed on a display of the computing device 104 instantly.
[0038] Referring to FIG.2, shown is a schematic illustration of an exemplary embodiment 200 of the system 100, in accordance with an embodiment of the present disclosure. As shown, the exemplary embodiment 200 includes a microscope 202, a stained slide 204, a smart device 206, and a holder 208 of the smart device 206. It will be appreciated that the smart device 206 is an embodiment of the computing device of FIG. 1 (a).
[0039] In the exemplary embodiment 200, the stained slide 204 is placed under the microscope 202 to obtain a magnified view 210 of the blood stain on the stained slide 204. It will be appreciated that the stained slide 204 is an embodiment of the stained slide 102 of FIG. 1(a). As shown, the magnified view 210 of the blood stain on the stained slide 204 includes the white blood cells 212, the red blood cells 214, and the platelets 216. Furthermore, the holder 208 of the smart device 206 may be configured to hold the smart device 206 at the eye-piece of the microscope 202 appropriately to capture the magnified view 210 of the blood stain on the stained slide 204. In such an embodiment, the smart device 206 is configured to capture images of four different sections of the stained slide 204, such as a first image at a first position; a second image captured upon sliding the stained slide 204 longitudinally from the first position to a second position; a third image captured upon sliding the stained slide 204 longitudinally from the second position to a third position; and a fourth image captured upon sliding the stained slide 204 longitudinally from the third position to a fourth position. It will be appreciated that the first, second, third and fourth positions are positions of the slide on the microscope 202.
[0040] Referring to FIGs. 3A-3D, shown are schematic illustrations of exemplary user interfaces of the computing device 300, in accordance with an embodiment of the present disclosure. It will be appreciated that the computing device 300 is an embodiment of the smart device 206 of FIG. 2. As shown, the user interfaces 302, 308, 312, and 314 are configured to enable the user of the system (such as the system 100 of FIG. 1) to interact with the system to generate an instant computer vision based blood cell count. The user interfaces 302, 308, 312, and 314 include a common control area 304 for controlling one or more operations of the application associated with the interfaces 302, 308, 312, and 314. It will be appreciated that the application refers to the application installed in the computing device 104 of FIG. 1(a). In an example, the one or more operations controlled from the common control area 304 include generating a profile in the application for a specific user, controlling the operating parameters of the system 100, exiting the application, and the like.
[0041] Specifically, the user interface 302 includes an input area 306 that is displayed on a display of the computing device 300 to initiate the computer vision based blood cell counting; the interface provides a selection menu wherein the user may select a type of computer vision based blood cell counting the user prefers, for example, the user may select a count for white blood cells, red blood cells, and platelets.
[0042] Specifically, the user interface 308 includes an input area 310. The input area 310 is configured to enable the user to upload or provide the images of the four different sections of the stained slide (such as the stained slide 204 of FIG. 2). The input area 310 may include a plurality of graphical elements to enable uploading of the images of the four different sections of the stained slide 204. For example, one graphical element of the plurality of graphical elements may enable the user to upload the images of the four different sections of the stained slide 204 from the memory of the computing device 300. In another example, another graphical element of the plurality of graphical elements may enable the user to access the imaging unit in the computing device 300 to capture magnified images of the stained slide 204.
[0043] Specifically, the user interface 312 is configured to display the information related to the images uploaded or provided for determining the type of computer vision based blood cell counting the user prefers, such as a count for white blood cells, red blood cells, and platelets. The four different sections in the uploaded or provided images have different views of the stained slide 204.
[0044] Specifically, the user interface 314 is configured to display information related to the details of the blood cell counting. The details of the blood cell counting are further displayed in the area 316. Furthermore, the details of the blood cell counting are provided based on the type of blood cell counting selected by the user in the user interface 302. For example, if the user selects a count for white blood cells, red blood cells, and platelets, and uploads appropriate images in the user interfaces 308 and 312, then the counts of the white blood cells, red blood cells, and platelets are provided in the user interface 314.
[0045] Referring to FIG. 4, illustrated are steps of a method 400 of computer vision based blood cell counting, in accordance with an embodiment of the present disclosure. At step 402, images of four different sections of the stained slide are captured. At step 404, the images of the four different sections of the stained slide are analysed using artificial intelligence algorithms to identify a plurality of blood components. At step 406, the plurality of blood components identified in the images of the four different sections of the stained slide are analysed based on predefined parameters. At step 408, the counts of each blood component in the plurality of blood components are determined. At step 410, the counts of each blood component of the plurality of blood components are instantly displayed on a display unit of a computing device.
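By way of illustration only, steps 408 and 410 could be sketched as follows: the per-section counts of each blood component are reduced to a single figure using a numerical factor and then displayed. The per-section numbers below are invented solely to exercise the sketch and are not data from the disclosure.

```python
# Illustrative only: combining the per-section counts (steps 408-410). The
# per-section numbers below are invented to exercise the sketch and are not
# data from the disclosure.
NUMERICAL_FACTOR = 1.0  # assumed factor used when averaging the four sections

per_section_counts = {
    "white blood cells": [6, 5, 7, 6],
    "red blood cells": [210, 198, 205, 201],
    "platelets": [12, 15, 13, 14],
}

for component, counts in per_section_counts.items():
    final_count = NUMERICAL_FACTOR * sum(counts) / len(counts)  # step 408: determine the count
    print(f"{component}: {final_count:.0f}")                    # step 410: display the count
```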
[0046] The steps 402 to 410 are only illustrative and other alternatives can also be provided where one or more steps may be added, one or more steps may be removed, or one or more steps may be provided in a different sequence without departing from the scope of the disclosed method herein.
[0047] This detailed description, and particularly the specific details of the exemplary embodiment disclosed, are given primarily for better understanding and no unnecessary limitations are to be understood therefrom, for modifications will become evident to those skilled in the art upon reading this disclosure and may be made without departing from the spirit or scope of the claimed invention.
[0048] It will be readily understood that the components of various embodiments of the present invention, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the detailed description of the embodiments of the present invention, as represented in the attached figures, is not intended to limit the scope of the invention as claimed, but is merely representative of selected embodiments of the invention.
[0049] The features, structures, or characteristics of the invention described throughout this specification may be combined in any suitable manner in one or more embodiments. For example, reference throughout this specification to "certain embodiments," "some embodiments," or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in certain embodiments," "in some embodiments," "in other embodiments," or similar language throughout this specification do not necessarily all refer to the same group of embodiments, and the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0050] It should be noted that reference throughout this specification to features, advantages, or similar language does not imply that all of the features and advantages that may be realized with the present invention should be or are in any single embodiment of the invention. Rather, language referring to the features and advantages is understood to mean that a specific feature, advantage, or characteristic described in connection with an embodiment is included in at least one embodiment of the present invention. Thus, discussion of the features and advantages, and similar language, throughout this specification may, but do not necessarily, refer to the same embodiment.
[0051] Furthermore, the described features, advantages, and characteristics of the invention may be combined in any suitable manner in one or more embodiments. One skilled in the relevant art will recognize that the invention can be practiced without one or more of the specific features or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments of the invention.
[0052] One having ordinary skill in the art will readily understand that the invention as discussed above may be practiced with steps in a different order, and/or with hardware elements in configurations which are different than those which are disclosed. Therefore, although the invention has been described based upon these preferred embodiments, it would be apparent to those of skill in the art that certain modifications, variations, and alternative constructions would be apparent, while remaining within the spirit and scope of the invention. In order to determine the metes and bounds of the invention, therefore, reference should be made to the appended claims.

CLAIMS: We Claim:
1) A system 100 for computer vision based blood cell counting, the system 100 comprising:
a stained slide 102; and
a computing device 104, wherein the computing device 104 is configured to:
capture images of four different sections of the stained slide 102,
analyse the images of the stained slide 102 using an artificial intelligence algorithm to identify a plurality of blood components,
analyse the plurality of blood components identified in the images of the stained slide 102 based on predefined parameters,
determine the counts of each blood component in the plurality of blood components, and
display the counts of each blood component of the plurality of blood components instantly on a display unit of the computing device 104.

2) The system as claimed in claim 1, wherein the stained slide 102 is formed by placing a drop of blood smear on one end of a slide, and dispersing the drop of blood along the length of the stained slide 102.

3) The system as claimed in claim 1, wherein the stained slide 102 is stained with a Leishman stain.

4) The system as claimed in claim 1, wherein the computing device 104 includes a memory unit having computer readable instructions stored thereon, and a processor that executes the computer readable instructions stored in the memory.

5) The system as claimed in claim 4, wherein the computing device 104 enables a user to upload the images of four different sections of the stained slide 102, and view results of blood cell count via a graphical user interface.

6) A method 400 of computer vision based blood cell counting, the method 400 comprising the steps of:
capturing 402 images of four different sections of a stained slide via a computing device;
analysing 404 the images of the stained slide using an artificial intelligence algorithm to identify a plurality of blood components via the computing device;
analysing 406 the plurality of blood components identified in the images of the stained slide based on predefined parameters via the computing device;
determining 408 the counts of each blood component in the plurality of blood components via the computing device; and
displaying 410 the counts of each blood component of the plurality of blood components instantly on a display unit of the computing device.

7) The method as claimed in claim 6, wherein the images of four different sections of the stained slide are stored in a memory of the computing device.

8) The method as claimed in claim 6, wherein each of the images of four different sections of the stained slide is a two-dimensional and/or three dimensional image.

9) The method as claimed in claim 6, wherein the artificial intelligence algorithms are configured to extract and process information from the images of four different sections of the stained slide.

10) The method as claimed in claim 6, wherein the blood components include white blood cells, red blood cells, and platelets.

Documents

Application Documents

# Name Date
1 201911034107-FER.pdf 2025-03-07
2 201911034107-FORM 18 [23-08-2023(online)].pdf 2023-08-23
3 201911034107-STATEMENT OF UNDERTAKING (FORM 3) [23-08-2019(online)].pdf 2019-08-23
4 201911034107-Proof of Right [19-01-2021(online)].pdf 2021-01-19
5 201911034107-PROVISIONAL SPECIFICATION [23-08-2019(online)].pdf 2019-08-23
6 201911034107-COMPLETE SPECIFICATION [21-08-2020(online)].pdf 2020-08-21
7 201911034107-CORRESPONDENCE-OTHERS [21-08-2020(online)].pdf 2020-08-21
8 201911034107-DRAWING [21-08-2020(online)].pdf 2020-08-21
9 201911034107-FORM FOR SMALL ENTITY(FORM-28) [23-08-2019(online)].pdf 2019-08-23
10 201911034107-FORM FOR SMALL ENTITY [23-08-2019(online)].pdf 2019-08-23
11 201911034107-FORM 1 [23-08-2019(online)].pdf 2019-08-23
12 201911034107-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [23-08-2019(online)].pdf 2019-08-23
13 201911034107-EVIDENCE FOR REGISTRATION UNDER SSI [23-08-2019(online)].pdf 2019-08-23
14 201911034107-DRAWINGS [23-08-2019(online)].pdf 2019-08-23
15 201911034107-Correspondence-021219.pdf 2019-12-05
16 201911034107-Power of Attorney-021219.pdf 2019-12-05
17 201911034107-FORM-26 [25-11-2019(online)].pdf 2019-11-25
18 abstract.jpg 2019-09-12
19 201911034107-FORM 3 [07-05-2025(online)].pdf 2025-05-07
20 201911034107-RELEVANT DOCUMENTS [05-09-2025(online)].pdf 2025-09-05
21 201911034107-PETITION UNDER RULE 137 [05-09-2025(online)].pdf 2025-09-05
22 201911034107-OTHERS [05-09-2025(online)].pdf 2025-09-05
23 201911034107-FER_SER_REPLY [05-09-2025(online)].pdf 2025-09-05
24 201911034107-COMPLETE SPECIFICATION [05-09-2025(online)].pdf 2025-09-05
25 201911034107-CLAIMS [05-09-2025(online)].pdf 2025-09-05

Search Strategy

1 201911034107_SearchStrategyNew_E_SearchHistoryE_05-03-2025.pdf