Abstract: A medical imaging system for performing imaging procedures is disclosed. The medical imaging system includes a display unit for displaying a plurality of user interface objects accessible to a user. The user accesses one or more user interface (UI) objects of the plurality of UI objects for performing an imaging procedure. A processor communicably coupled to the display unit is configured to identify a usage pattern of the plurality of UI objects by the user. Thereafter one or more display features of the one or more UI objects of the plurality of UI objects are modified based on the usage pattern.
COMPUTING SYSTEM AND METHOD FOR ARRANGING USER INTERFACE OBJECTS DISPLAYED IN A MEDICAL IMAGING SYSTEM
TECHNICAL FIELD
[0001] The subject matter disclosed herein relates to a medical imaging system. More specifically the subject matter relates to a computing system and a method for arranging user interface objects displayed in a medical imaging system.
BACKGROUND OF THE INVENTION
[0002] User interface design for an application, and especially for a portable device, is challenging due to the lack of real estate in the display screen of the device. Usually a plethora of user interface (UI) objects for accessing different applications may be found in the display screen. These UI objects are provided so that the user can get easy access to applications upfront. However, the downside is that, due to the presence of so many UI objects, the user may at times have to perform multiple clicks or hits on the UI before the correct UI object is selected for accessing a required application. This is prevalent in mobile devices, smart phones, tablets and other handhelds. Similarly, in a medical imaging environment, medical imaging systems also have a user interface having multiple UI objects that are accessed by a user for activating one or more applications. These applications are activated to image different regions or areas (e.g. different organs) of patients or other objects. For example, an ultrasound imaging system may be utilized to generate an image of organs, vasculature, the heart, or other portions of the body. However, in a medical imaging system such as an ultrasound imaging system, the ultrasound image requires more space in the display screen, and thus it is more challenging to provide space for UI objects. The user may also need to give more attention to the ultrasound image, as this may be a streaming ultrasound image video.
[0003] Thus there is a need for a system that can arrange the UI objects in a convenient manner so that the real estate in a display screen is utilized efficiently.
BRIEF DESCRIPTION OF THE INVENTION
[0004] The above-mentioned shortcomings, disadvantages and problems are addressed herein which will be understood by reading and understanding the following specification.
[0005] As discussed in detail below, embodiments of the invention include a computing system having a user interface for displaying a plurality of user interface (UI) objects accessible to a user. A processor communicably coupled to the user interface is configured to identify a usage pattern of the plurality of UI objects by the user. The processor is also configured to modify one or more display features of one or more UI objects of the plurality of UI objects based on the usage pattern.
[0006] In another embodiment a medical imaging system for performing imaging procedures is disclosed. The medical imaging system includes a display unit for displaying a plurality of user interface objects accessible to a user. The user accesses one or more user interface (UI) objects of the plurality of UI objects for performing an imaging procedure. A processor communicably coupled to the display unit is configured to identify a usage pattern of the plurality of UI objects by the user. Thereafter one or more display features of the one or more UI objects of the plurality of UI objects are modified based on the usage pattern.
[0007] In yet another embodiment a method of arranging a plurality of user interface objects displayed in a medical imaging system is disclosed. The method includes displaying a plurality of user interface (UI) objects accessible to a user in a display unit of the medical imaging system. The user accesses one or more UI objects of the plurality of UI objects for performing an imaging procedure. The method also includes identifying a usage pattern of the plurality of UI objects based on user input; and modifying one or more display features of one or more UI objects of the plurality of UI objects based on the usage pattern.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] FIGURE 1 is a schematic illustration of a computing system for arranging multiple user interface (UI) objects in a user interface;
[0009] FIGURE 2 is a schematic illustration of a mobile device wherein the computing system may operate to arrange UI objects in accordance with an embodiment;
[0010] FIGURE 3 is a schematic illustration of a portable medical imaging device such as an ultrasound imaging system in accordance with an embodiment;
[0011] FIGURE 4 is a schematic illustration of a medical imaging system for performing imaging of an anatomy in accordance with an embodiment;
[0012] FIGURE 5 and FIGURE 6 illustrate the display unit of the medical imaging system for presenting arrangement of UI objects in accordance with an exemplary embodiment;
[0013] FIGURE 7 illustrates a display unit presenting multiple UI objects for setting a two dimensional imaging mode to perform an imaging procedure in accordance with an exemplary embodiment;
[0014] FIGURE 8 illustrates the display unit presenting sub-menu parameters of the two-dimensional imaging mode in the form of UI objects in accordance with an exemplary embodiment;
[0015] FIGURE 9 illustrates the display unit presenting the UI objects in a re-arranged order based on a user's usage pattern in accordance with an exemplary embodiment;
[0016] FIGURE 10 is a block diagram of a method of arranging a plurality of UI objects displayed in a medical imaging system in accordance with an embodiment; and
[0017] FIGURE 11 is a block diagram of a method of arranging a plurality of UI objects displayed in a medical imaging system in accordance with another embodiment.
DETAILED DESCRIPTION OF THE INVENTION
[0018] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken as limiting the scope of the invention.
[0019] To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. One or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or random access memory, hard disk, or the like) or multiple pieces of hardware. Similarly, the programs may be standalone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
[0020] As used herein, an element or step recited in the singular and preceded with the word "a" or "an" should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to "one embodiment" of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising" or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.
[0021] A medical imaging system for performing imaging procedures is disclosed. The medical imaging system includes a display unit for displaying a plurality of user interface (UI) objects accessible to a user. The user accesses one or more UI objects of the plurality of UI objects for performing an imaging procedure. A processor communicably coupled to the display unit is configured to identify a usage pattern of the plurality of UI objects by the user. Thereafter one or more display features of the one or more UI objects of the plurality of UI objects are modified based on the usage pattern.
[0022] Although the various embodiments are described with respect to an ultrasound imaging system, the various embodiments may be utilized with any suitable imaging system, for example, X-ray, computed tomography, single photon emission computed tomography, magnetic resonance imaging, or the like.
[0023] FIG. 1 illustrates a computing system 100 for arranging multiple user interface (UI) objects 104 in a user interface 102 in accordance with an embodiment. The computing system 100 may be configured in a mobile device, a tablet, a smart phone, a handheld or any other computing device, and the UI objects 104 may be associated with different applications in such a device. A UI object may be selected by a user for accessing or launching an application. Further, the user may select different UI objects while using an application. Different users may have different usage patterns of using the UI objects. For example, a user may use a few UI objects more frequently for accessing some applications, while other UI objects may remain unused or less used. Thus a processor 106 communicably coupled to the user interface 102 identifies the usage pattern of the user. The usage pattern may be determined by identifying a frequency of selection of each UI object selected. In an embodiment the frequency is identified by monitoring the number of clicks on a UI object that resulted in selection and launching of an application and the number of clicks on the UI object that did not result in selection and launching of the application. Only the number of clicks that resulted in launching of the application is considered. In an embodiment the processor 106 may also monitor an order of selecting the UI objects 104 by the user for determining the usage pattern. The usage pattern is used by the processor 106 to modify one or more display features of the UI objects 104. The display features include, but are not limited to, at least one of a size of a UI object and a display tone of the UI object. The display tone may include, for example but not limited to, color, brightness and contrast of the UI object.
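The frequency-and-order tracking described above may be sketched as follows. This is an illustrative Python sketch only; the class and method names are assumptions and not part of the disclosed system, which requires only that launch-producing clicks be counted separately and that the order of selection be recorded.

```python
from collections import Counter

class UsagePatternTracker:
    """Illustrative sketch of usage-pattern identification.

    Only clicks that actually launch an application count toward a
    UI object's frequency of selection; other clicks are tracked
    separately and ignored when computing the frequency.
    """

    def __init__(self):
        self.launch_clicks = Counter()  # clicks that launched an application
        self.other_clicks = Counter()   # clicks that did not result in a launch
        self.selection_order = []       # order in which UI objects were selected

    def record_click(self, ui_object, launched):
        if launched:
            self.launch_clicks[ui_object] += 1
            self.selection_order.append(ui_object)
        else:
            self.other_clicks[ui_object] += 1

    def frequency(self, ui_object):
        # Frequency of selection considers only launching clicks.
        return self.launch_clicks[ui_object]
```

A processor-side component could feed every touch or click event into `record_click` and periodically read `frequency` and `selection_order` to decide how display features should change.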
[0024] In an embodiment the display features are modified based on a predefined selection threshold. If the frequency of the selection of one or more UI objects is greater than the predefined selection threshold, then the size and/or the display tone of those UI objects 104 are varied. Similarly, if the frequency of the selection of one or more UI objects is less than the predefined selection threshold, then the size and/or the display tone of those UI objects 104 are varied. This is further explained in detail in conjunction with FIG. 2.
[0025] FIG. 2 is a schematic illustration of a mobile device 200 wherein the computing system 100 may operate to arrange UI objects in accordance with an embodiment. As illustrated, the mobile device 200 has a display screen 202 including a camera UI object 204, a calendar UI object 206, a messaging UI object 208, a contacts UI object 210, an application UI object 212 and a memo UI object 214. However it may be envisioned that the display screen 202 may include other UI objects pertaining to other applications as well. A user of the mobile device 200 may frequently use the messaging UI object 208 by clicking on this UI object for sending and reading messages. The user may similarly use other UI objects to access corresponding applications. If a frequency of selection of the messaging UI object 208 is more than a predefined selection threshold, for example 22 clicks on a UI object, the size of the messaging UI object 208 is increased as illustrated in FIG. 2. Thus the predefined selection threshold is used for identifying the UI object that will be used by the user almost all the time. The frequencies of selection of the other UI objects are also monitored, and if they do not exceed the predefined selection threshold, the sizes of these UI objects may remain the same or may be reduced. In an instance where the memo UI object 214 is less frequently accessed or not accessed, the size of the memo UI object 214 is reduced. Any variation in a display feature (i.e. the size) of the UI object enables the user to select the required application (i.e. the messaging application) more conveniently.
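The threshold comparison described above may be sketched as follows. The threshold of 22 clicks mirrors the example in the paragraph; the grow and shrink factors are illustrative assumptions, since the disclosure only states that sizes are increased, kept the same, or reduced.

```python
def resize_ui_objects(frequencies, threshold=22, grow=1.5, shrink=0.75):
    """Return a scale factor per UI object based on selection frequency.

    `threshold` follows the example of 22 clicks; `grow` and `shrink`
    are hypothetical factors, not values from the disclosure.
    """
    scales = {}
    for name, freq in frequencies.items():
        if freq > threshold:
            scales[name] = grow      # enlarge frequently used objects
        elif freq < threshold:
            scales[name] = shrink    # shrink rarely used objects
        else:
            scales[name] = 1.0       # leave the size unchanged
    return scales
```

For the example of FIG. 2, a messaging object with 30 launching clicks would be enlarged while a rarely used memo object would be shrunk.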
[0026] The user may also use sub-menu UI objects such as an inbox 216 and a compose message 218 of the messaging UI object 208 more frequently. However other sub-menu UI objects, for example a sent item box 220, settings 222, and a deleted item box 224, may be less frequently used or never used. This usage pattern of the user is identified and compared with a predefined selection threshold. If the frequencies of selection of the inbox 216 and the compose message 218 are greater than the predefined selection threshold, the sizes of these sub-menu UI objects may be increased so that they stand out among the other sub-menu UI objects in the display screen 202 of the mobile device 200. When the sizes of these sub-menu UI objects increase, the sizes of the other sub-menu UI objects, i.e. the sent item box 220, the settings 222, and the deleted item box 224, may be decreased so that they can still be displayed within the display screen 202. In another scenario the display tone, such as a contrast and a color, of the inbox 216 and the compose message 218 may be increased. When one of the display features (such as size or display tone) of the inbox 216 and the compose message 218 is increased, the user can conveniently select these sub-menu UI objects. It may be envisioned that the size and the display tone of the UI objects and the sub-menu UI objects may be varied together.
[0027] A medical imaging system may also have UI objects in a display that may be used by a medical technician or a medical expert to capture images of an anatomy of a subject. These UI objects may be associated with different imaging procedures, such as cardiac imaging, obstetric imaging, abdominal imaging and vascular imaging, to be performed on the anatomy. FIG. 3 is a schematic illustration of a portable medical imaging device such as an ultrasound imaging system 300 in accordance with an embodiment. The ultrasound imaging system 300 may be a portable or a handheld ultrasound imaging system. For example, the ultrasound imaging system 300 may be similar in size to a smartphone, a personal digital assistant or a tablet. In other embodiments, the ultrasound imaging system 300 may be configured as a laptop or a cart based system. The ultrasound imaging system 300 may be transportable to a remote location, such as a nursing home, a medical facility, a rural area, or the like.
[0028] A probe 302 is in communication with the ultrasound imaging system 300. The probe 302 may be mechanically coupled to the ultrasound imaging system 300. Alternatively, the probe 302 may wirelessly communicate with the ultrasound imaging system 300. The probe 302 includes transducer elements 304 that emit ultrasound pulses to an object 306 to be scanned, for example an organ of a patient. The ultrasound pulses may be back-scattered from structures within the object 306, such as blood cells or muscular tissue, to produce echoes that return to the transducer elements 304. The transducer elements 304 generate ultrasound image data based on the received echoes. The probe 302 also includes a motion sensor 308 in accordance with an embodiment. The motion sensor 308 may include, but is not limited to, an accelerometer, a magnetic sensor and a gyro sensor. The motion sensor 308 is configured to identify the position and orientation of the probe 302 on the object 306. The position and orientation may be identified in real-time, when a medical expert is manipulating the probe 302. The term "real-time" includes an operation or procedure that is performed without any intentional delay. The probe 302 transmits the ultrasound image data to the ultrasound imaging system 300. The ultrasound imaging system 300 includes a memory 310 that stores the ultrasound image data. The memory 310 may be a database, random access memory, or the like. In one embodiment, the memory 310 is a secure encrypted memory that requires a password or other credentials to access the image data stored therein. The memory 310 may have multiple levels of security. For example, a surgeon or doctor may have access to all of the data stored in the memory 310, whereas a technician may have limited access to the data stored in the memory 310. In one embodiment, a patient may have access to the ultrasound image data related to the patient, but is restricted from all other data.
A processor 312 accesses the ultrasound image data from the memory 310. The processor 312 may be a logic based device, such as one or more computer processors or microprocessors. The processor 312 generates an image based on the ultrasound image data. The image is displayed on a presentation layer 314, which may be, for example, a graphical user interface (GUI) or other displayed user interface, such as a virtual desktop.
The presentation layer 314 may be a software based display that is accessible from multiple locations. The presentation layer 314 displays the image on a display 316 provided within the ultrasound imaging system 300. The display 316 may be a touch sensitive screen. Alternatively, the presentation layer 314 may be accessible through a web-based browser, local area network, or the like. In such an embodiment, the presentation layer 314 may be accessible remotely as a virtual desktop that displays the presentation layer 314 in the same manner as the presentation layer 314 is displayed in the display 316.
[0029] The ultrasound imaging system 300 includes imaging configurations 318 associated with the different imaging procedures that can be performed. The imaging procedures include, for example, obstetric imaging, cardiac imaging and abdominal imaging. Based on the imaging procedure to be performed, a corresponding imaging configuration needs to be set. The imaging configuration may be set by a user in the ultrasound imaging system 300. The imaging configurations may be pre-stored in the ultrasound imaging system 300. An imaging configuration may include various parameters such as frequency, speckle reduction imaging, imaging angle, time gain compensation, scan depth, gain, scan format, image frame rate, field of view, focal point, scan lines per image frame, number of ultrasound beams and pitch of the transducer elements. These parameters vary for different imaging configurations. For example, the ultrasound imaging system 300 may be used for a cardiac application by configuring a cardiac imaging configuration. Thereafter an abdominal imaging configuration stored in the ultrasound imaging system 300 needs to be set for performing an abdominal imaging application. For the cardiac application, the image frame rate is an important factor. Therefore the ultrasound imaging system 300 is set to switch off a few imaging filters, such as a frame averaging filter and a speckle reduction imaging filter, and also to vary some parameters, such as a narrow field of view, a single focal point and a lower number of scan lines per image frame. Whereas for an abdominal application, resolution may be the important parameter. Thus the ultrasound imaging system 300 turns on a medium or high frame averaging filter and a speckle reduction imaging filter. Further, some parameters may also be set, for example multiple focal points, a wide field of view, a higher number of scan lines per image frame (i.e. higher line density), and transmission of multiple ultrasound beams.
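The pre-stored, per-procedure imaging configurations described above may be represented, for illustration, as parameter presets. The dictionary below is a hypothetical sketch; the parameter names follow the description, but the specific values are illustrative placeholders and not values from the disclosure.

```python
# Hypothetical pre-stored imaging configurations (element 318).
# Parameter names follow the description; values are placeholders.
IMAGING_CONFIGURATIONS = {
    "cardiac": {
        "frame_averaging_filter": False,   # switched off for high frame rate
        "speckle_reduction_filter": False, # switched off for high frame rate
        "field_of_view": "narrow",
        "focal_points": 1,                 # single focal point
        "scan_lines_per_frame": "low",
    },
    "abdominal": {
        "frame_averaging_filter": True,    # medium/high, for resolution
        "speckle_reduction_filter": True,
        "field_of_view": "wide",
        "focal_points": "multiple",
        "scan_lines_per_frame": "high",    # higher line density
    },
}

def set_imaging_configuration(procedure):
    """Look up the pre-stored configuration for an imaging procedure."""
    return IMAGING_CONFIGURATIONS[procedure]
```

Switching from a cardiac application to an abdominal application would then amount to loading a different preset rather than adjusting each parameter individually.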
[0030] The ultrasound imaging system 300 also includes a transmitter/receiver 320 that communicates with a transmitter/receiver 322 of a workstation 324. For example, the workstation 324 may be positioned at a location such as a hospital, imaging center, or other medical facility. The workstation 324 may be a computer, tablet-type device, or the like. The workstation 324 may be any type of computer or end user device. The workstation 324 includes a display 326. The workstation 324 communicates with the ultrasound imaging system 300 to display, on the display 326, an image based on image data acquired by the ultrasound imaging system 300. The workstation 324 also includes any suitable components for image viewing, manipulation, and the like.
[0031] The ultrasound imaging system 300 and the workstation 324 communicate through the transmitter/receivers 320 and 322, respectively. The ultrasound imaging system 300 and the workstation 324 may communicate over a local area network. For example, the ultrasound imaging system 300 and the workstation 324 may be positioned in separate remote locations of a medical facility and communicate over a network provided at the facility. In an exemplary embodiment, the ultrasound imaging system 300 and the workstation 324 communicate over an internet connection, such as through a web-based browser.
[0032] An operator may remotely access imaging data stored on the ultrasound imaging system 300 from the workstation 324. For example, the operator may log onto a virtual desktop or the like provided on the display 326 of the workstation 324. The virtual desktop remotely links to the presentation layer 314 of the ultrasound imaging system 300 to access the memory 310 of the ultrasound imaging system 300. The memory 310 may be secured and encrypted to limit access to the image data stored therein. The operator may input a password to gain access to at least some of the image data.
[0033] Once access to the memory 310 is obtained, the operator may select image data to view. It should be noted that the image data is not transferred to the workstation 324. Rather, the image data is processed by the processor 312 to generate an image on the presentation layer 314. For example, the processor 312 may generate a DICOM image on the presentation layer 314. The ultrasound imaging system 300 transmits the presentation layer 314 to the display 326 of the workstation 324 so that the presentation layer 314 is viewable on the display 326. In one embodiment, the workstation 324 may be used to manipulate the image on the presentation layer 314. The workstation 324 may be used to change an appearance of the image, such as rotate the image, enlarge the image, adjust the contrast of the image, or the like. Moreover, an image report may be input at the workstation 324. For example, an operator may input notes, analysis, and/or comments related to the image. In one embodiment, the operator may input landmarks or other notations on the image. The image report is then saved to the memory 310 of the ultrasound imaging system 300. Accordingly, the operator can access images remotely and provide analysis of the images without transferring the image data from the ultrasound imaging system 300. The image data remains stored only on the ultrasound imaging system 300 so that the data remains restricted only to individuals with proper certification.
[0034] In one embodiment, the ultrasound imaging system 300 is capable of simultaneous scanning and image data acquisition. The ultrasound imaging system 300 may be utilized to acquire a first set of imaging data, while a second set of imaging data is accessed to display on the display 326 of the workstation 324 an image based on the second set of imaging data. The ultrasound imaging system 300 may be also capable of transferring the image data to a data storage system 328 present in a remote location. The ultrasound imaging system 300 communicates with the data storage system 328 over a wired or wireless network.
[0035] FIG. 4 is a schematic illustration of a medical imaging system 400 for performing imaging of an anatomy in accordance with an embodiment. The medical imaging system 400 includes a display unit 402 for presenting multiple user interface (UI) objects 404 accessible to the user. FIG. 5 and FIG. 6 illustrate the display unit 402 presenting arrangement of UI objects in accordance with an exemplary embodiment.
Hence FIG. 5 and FIG. 6 are hereinafter explained together. The user may access a few UI objects from the multiple UI objects 404 for performing an imaging procedure. The UI objects 404 may be associated with, but are not limited to, a probe selection, an obstetric imaging, an abdominal imaging, a urology imaging, a gynecology imaging, and a cardiac imaging. As illustrated in FIG. 5, the display unit 402 presents a probe selection 500 that is employed by the user for selecting a type of probe that may be used. The probe may be initially connected to the medical imaging system 400 and then the probe selection 500 may be clicked by the user. When clicked, the available probes are displayed and the user can select the appropriate probe to be used. The probe is selected based on the type of imaging to be performed.
[0036] The user may select an imaging procedure such as an obstetric imaging 502 as presented in the display unit 402. When the obstetric imaging 502 is selected, the settings of this imaging procedure are presented to the user. The settings include various parameters such as a trim rout 504, a cervix 506, a fetal cardio 508, a fetal head 510, an ovary 512 and a penetration 514. These parameters are displayed as UI objects. The trim rout 504 may be used to confirm in which trimester the patient currently is. The cervix 506 may be selected to check a length of the cervix. Further, the fetal cardio 508 and the fetal head 510 are selected to check a heart rate and a head size of the fetus. The ovary 512 may be activated to determine the sizes of a left and a right ovary to determine an ovary volume. The penetration 514 is used to define a frequency range function that allows for adjustment of high resolution/lower penetration, mid resolution/mid penetration, or lower resolution/higher penetration for capturing an image. From the transducer's broadband signal a certain start frequency and start bandwidth are extracted and then continuously changed over depth. Every transducer includes a set of three fixed receive settings which are easily controlled by switching a frequency. The frequency range may include resolution, normal and penetration.
[0037] Different users may have usage patterns that they prefer for performing an imaging procedure. For example, in obstetric imaging the user (such as an obstetric technician) may prefer to keep the penetration 514 constant and not vary it for all the imaging procedures conducted for different subjects. The obstetric technician may also not constantly vary or use another setting such as the cervix 506. The obstetric technician may, however, check the fetal cardio, the fetal head size, and the ovary (left/right ovary) size for all the imaging procedures, and thus these UI objects are accessed frequently. This usage pattern is learned by a processor 406 of the medical imaging system 400. The processor 406 monitors the number of clicks on these UI objects and accordingly the display features of the fetal cardio, the fetal head size and the ovary are varied. The number of clicks is then compared with a predefined selection threshold to determine the changes to be made in the display features of the UI objects. For instance, as illustrated in FIG. 6, the sizes of the UI objects of the trim rout 504, the fetal cardio 508, the fetal head 510, and the ovary 512 are increased as their frequency of usage is more than a predefined selection threshold. Whereas the sizes of the UI objects associated with the cervix 506 and the penetration 514 are decreased because their frequency of usage is less than the predefined selection threshold. In another alternative embodiment the size of a UI object depends upon a degree of variation of the number of clicks on the UI object from the predefined selection threshold. For example, the trim rout 504 may have a number-of-clicks value of 30 and the fetal cardio 508 and the fetal head 510 may each have a number-of-clicks value of 23. Then, based on the comparison of these number-of-clicks values with a predefined selection threshold, i.e. 22, it is determined that the trim rout 504 has a larger degree of variation as compared to the fetal cardio 508 and the fetal head 510. Thus the UI object of the trim rout 504 has a larger size as compared to the UI objects of the fetal cardio 508 and the fetal head 510.
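The degree-of-variation sizing described above may be sketched as follows. The threshold of 22 matches the example; the base size and per-click step are illustrative assumptions, since the disclosure only requires that a larger deviation from the threshold yields a larger UI object.

```python
def size_from_deviation(clicks, threshold=22, base=1.0, step=0.05):
    """Scale a UI object by the deviation of its click count from the
    predefined selection threshold.

    `base` and `step` are hypothetical parameters; the disclosure only
    requires that size grows with the degree of variation.
    """
    return base + step * (clicks - threshold)
```

With the example values of the paragraph, 30 clicks on the trim rout yields a larger scale than 23 clicks on the fetal cardio or fetal head, both of which exceed the base size for the 22-click threshold.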
[0038] In an embodiment an order of arrangement of the UI objects is also changed based on an order of selection of the UI objects. The user may select the UI objects in the order of the trim rout 504, the fetal cardio 508, the fetal head 510, and the ovary 512 for performing the obstetric imaging, and thus the processor 406 arranges these UI objects in the same order as illustrated in FIG. 6. Moreover, the display tone, i.e. color, brightness and contrast, of these UI objects may also be changed. For example, the UI objects of the trim rout 504, the fetal cardio 508, the fetal head 510, and the ovary 512 may be shown with a brighter color or may be highlighted, while the UI objects of the cervix 506 and the penetration 514 are shown in a lighter color or with a lower intensity. As a result the user may be able to conveniently identify the UI objects, i.e. the trim rout 504, the fetal cardio 508, the fetal head 510, and the ovary 512, and activate them. The display features of these UI objects are varied such that no additional real estate in the display screen of the display unit 402 is used, thereby using the screen space efficiently. Also, the more frequently used UI objects remain present and visible in the display screen by removing or de-emphasizing the unused UI objects. In another instance the color of a UI object remains constant and the intensity of the color is increased when the UI object is frequently used. If the UI object is used less, the intensity of the UI object is reduced. In an alternative embodiment, in the event a UI object is not used at all by the user, the processor 406 may remove the UI object from the screen of the display unit 402.
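The re-ordering and removal behavior described above may be sketched as follows. The function names and the use of plain lists are illustrative assumptions; the behavior modeled is only what the paragraph states: selected objects are arranged in their order of first selection, and never-used objects may be removed from the screen.

```python
def arrange_ui_objects(all_objects, selection_order, frequencies):
    """Re-arrange UI objects per the user's order of selection.

    Objects appear in the order they were first selected; objects never
    selected and never clicked are removed from the display entirely.
    Names are hypothetical; the list layout is an assumption.
    """
    arranged = []
    for obj in selection_order:       # first-use order defines placement
        if obj not in arranged:
            arranged.append(obj)
    # Objects not in the selection order stay only if they were ever used.
    remaining = [o for o in all_objects
                 if o not in arranged and frequencies.get(o, 0) > 0]
    return arranged + remaining
```

For the obstetric example, selecting trim rout, fetal cardio, fetal head and ovary in that order would place them first, keep a lightly used cervix object at the end, and drop a never-used penetration object.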
[0039] Now when another user uses the medical imaging system 400, a usage pattern of that user is again monitored and the display features of the UI objects 404 are modified based on the usage pattern at run time. In an embodiment the medical imaging system 400 may be configured to present the UI objects 404 to the user based on the user's pre-stored usage pattern. More specifically, the user may log in to the medical imaging system 400 using the user's login credentials, and the pre-stored usage pattern of the user is loaded. The pre-stored usage pattern is a usage pattern that was identified earlier by the processor 406 while the user worked with the medical imaging system 400 for imaging. The pre-stored usage pattern is stored in a memory 408. In another embodiment the user may load the user's pre-stored usage pattern prior to using the medical imaging system 400.
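Loading a per-user pre-stored usage pattern at login may be sketched as follows. This is a hypothetical API; the disclosure states only that patterns are stored in the memory 408 and retrieved against the user's login credentials.

```python
class UsageProfileStore:
    """Illustrative per-user store of pre-stored usage patterns
    (corresponding to the memory 408). All names are assumptions."""

    def __init__(self):
        self._profiles = {}  # user id -> previously learned usage pattern

    def save(self, user_id, pattern):
        # Persist the pattern learned while this user operated the system.
        self._profiles[user_id] = pattern

    def load_on_login(self, user_id):
        # A user without a stored profile starts with an empty pattern,
        # which is then learned at run time.
        return self._profiles.get(user_id, {})
```

On login, the returned pattern would seed the display-feature modifications immediately, instead of waiting for a fresh pattern to be learned.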
[0040] FIG. 7 illustrates a display unit 700 presenting multiple UI objects for setting a two dimensional imaging mode to perform an imaging procedure in accordance with an exemplary embodiment. FIG. 8 illustrates the display unit 700 presenting sub-menu parameters of the two-dimensional imaging mode in the form of UI objects in accordance with an exemplary embodiment. FIG. 9 illustrates the display unit 700 presenting the UI objects rearranged based on a usage pattern of the user in accordance with an exemplary embodiment. FIG. 7, FIG. 8 and FIG. 9 are hereinafter described together. In the display unit 700 of a medical imaging system, a two dimensional imaging mode (2-D imaging mode) 702 is provided, and when it is selected by the user a two-dimensional display is activated. The 2-D imaging mode includes sub-menu parameters that are defined as UI objects and are selectable by the user. Different imaging procedures are selected using the UI objects, such as an abdomen 704, a kidney 706, a liver 708, an aorta 710, a speckle reduction imaging (SRI) 712 and a compound resolution imaging (CRI) 714, as displayed in the display unit 700. The SRI tab 712 is employed to filter speckle in images such as an image 716. The image 716 may be a 2-D image. The image 716 may be a streaming video image and may be frozen at any point in time. Further, the CRI tab 714 is used to activate a CRI filter that enhances the captured image 716. The abdomen 704 is used to select an imaging procedure to perform imaging for abdomen calculations. The abdomen calculations are associated with analysis of body parts such as the liver, gallbladder, pancreas, spleen, left/right kidney, left/right renal artery, proximal aorta, mid aorta, distal aorta, and vessels. The UI objects associated with the kidney 706, the liver 708 and the aorta 710 are used to select imaging procedures to perform analysis of the kidney, liver, and aorta of a subject.
[0041] The user may be a medical technician who uses the abdomen tab 704 more frequently to perform abdominal imaging on the subject. This usage pattern of the medical technician is identified and the abdomen tab 704 is increased in size as illustrated in FIG. 8. The user activates a sub-menu 718 to display the sub-menu parameters such as a map tab 720, a line density tab 722, an enhance tab 724, a line filter tab 726, an image position tab 728 and a persistence tab 730. These sub-menu parameters are used to make changes to the image 716 that is to be captured. The map tab 720 is used to determine the displayed brightness of an ultrasound echo (i.e. an ultrasound echo received from an anatomy, for example the abdomen being diagnosed) in relationship to its amplitude. The line density tab 722 is used to adjust the image resolution. The enhance tab 724 is used for image sharpening so that information in the image is more visible. The line filter tab 726 is employed to filter noise in images. The image position tab 728 is used to adjust the position of the image in a display field in the display screen. Further, the persistence tab 730 is used to eliminate speckles from the image 716.
[0042] Referring back to the selection of the abdomen using the UI object, i.e. the abdomen 704, the user sets sub-menu parameters using the map tab 720, the line density tab 722, the enhance tab 724, the line filter tab 726, the image position tab 728 and the persistence tab 730. However, the user may use the enhance tab 724, the line filter tab 726 and the image position tab 728 more frequently than the other sub-menu parameters. The usage is monitored to determine whether the frequency of selection of these tabs (i.e. based on clicks on these tabs) is more than a predefined selection threshold. If so, the size of the enhance tab 724, the line filter tab 726 and the image position tab 728 is increased as compared to the other UI objects of the sub-menu parameters. Further, if the frequencies of selection of the map tab 720, the line density tab 722 and the persistence tab 730 are below the predefined selection threshold, then their display features are varied, for example the size is reduced. In an alternative embodiment the frequencies of selection of the map tab 720, the line density tab 722, the enhance tab 724, the line filter tab 726, the image position tab 728 and the persistence tab 730 are compared with multiple predefined selection thresholds, for instance a separate predefined selection threshold for each of these tabs.
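The per-tab comparison described above can be sketched in code. This is a minimal illustration, not the patented implementation; the tab names, click counts, and the default threshold of 22 clicks (taken from the example later in the description) are assumptions for illustration only.

```python
# Hypothetical sketch: split sub-menu tabs into those whose selection
# frequency exceeds their threshold (to be enlarged) and those below it
# (to be reduced). Per-tab thresholds are optional, per the alternative
# embodiment; otherwise a single default threshold applies.
DEFAULT_THRESHOLD = 22  # clicks; illustrative value from the description

def classify_tabs(click_counts, thresholds=None):
    """Return (above, below): tabs over vs. under their selection threshold."""
    thresholds = thresholds or {}
    above, below = [], []
    for tab, clicks in click_counts.items():
        limit = thresholds.get(tab, DEFAULT_THRESHOLD)
        (above if clicks > limit else below).append(tab)
    return above, below

above, below = classify_tabs(
    {"enhance": 40, "line_filter": 35, "image_position": 30,
     "map": 5, "line_density": 3, "persistence": 2})
# above: tabs to enlarge; below: tabs whose size is reduced
```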
[0043] The enhance tab 724, the line filter tab 726 and the image position tab 728 are shifted from the sub-menu 718 to the screen along with the abdomen 704 as illustrated in FIG. 9. The enhance tab 724, the line filter tab 726 and the image position tab 728 may be selected by the user in this order. The order of selection is determined and the tabs are arranged in the same order as shown in FIG. 9. The other tabs such as the map tab 720 and the line density tab 722 remain in the sub-menu as they are not selected by the user or the frequency of selection is below the predefined selection threshold. Further the size of the UI objects i.e. the kidney 706, the liver 708, and the aorta 710 is reduced because the frequency of selection of these UI objects is less than the predefined selection threshold. The CRI 714 is removed from the screen shown in FIG. 7.
[0044] FIG. 10 is a block diagram of a method of arranging a plurality of user interface objects displayed in a medical imaging system in accordance with an embodiment. The medical imaging system is a portable medical imaging device such as an ultrasound imaging system. The medical imaging system includes a display unit displaying multiple UI objects that are used by a user at block 1000. The user activates an imaging procedure for capturing images of an anatomy of a subject by selecting one or more UI objects of the multiple UI objects. The user may have a preferred way of selecting the one or more UI objects (i.e. a user input) for performing the imaging procedure. This preferred way of selecting the one or more UI objects is a usage pattern that is identified at block 1002. The usage pattern includes a frequency of selection of a UI object and an order of selection of the one or more UI objects. Based on the usage pattern, the medical imaging system modifies one or more display features of the one or more UI objects of the plurality of UI objects at block 1004. The one or more display features include, but are not limited to, at least one of a size of a UI object and a display tone of the UI object. The display tone may include, for example, the color, brightness and contrast of the UI object.
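The usage-pattern identification at block 1002 combines the two signals named above: a per-object frequency of selection and an order of selection. A minimal sketch, under the assumption that user input is available as a chronological log of selected object names (the log format and names are hypothetical):

```python
from collections import Counter

def identify_usage_pattern(selection_log):
    """Derive the usage pattern from a chronological selection log.

    Returns a (frequency, order) pair: click counts per UI object,
    and the order in which objects were first selected.
    """
    frequency = Counter(selection_log)
    # dict.fromkeys preserves first-occurrence order while deduplicating
    order = list(dict.fromkeys(selection_log))
    return frequency, order

freq, order = identify_usage_pattern(
    ["abdomen", "enhance", "abdomen", "line_filter", "abdomen"])
```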
[0045] FIG. 11 is a block diagram of a method of arranging a plurality of user interface objects displayed in a medical imaging system in accordance with another embodiment. The medical imaging system includes a display unit displaying multiple UI objects that are used by a user at block 1100. The user activates an imaging procedure for capturing images of an anatomy of a subject by selecting one or more UI objects of the multiple UI objects. The user may have a preferred way of selecting the one or more UI objects (i.e. a user input) for performing the imaging procedure. This preferred way of selecting the one or more UI objects is a usage pattern that is identified at block 1102. The usage pattern includes a frequency of selection of a UI object and an order of selection of the one or more UI objects. At block 1104, the frequency of selection of each UI object is compared with a predefined selection threshold. In an embodiment the frequency of selection of each UI object is compared with a separate predefined selection threshold for that UI object. A check is performed to determine whether the frequency of selection of a UI object of the one or more selected UI objects is above the predefined selection threshold at block 1106. If the frequency of selection is above the predefined selection threshold, then a size of the UI object is increased as indicated by block 1108. For example, when the frequency of selection of a UI object is more than a predefined selection threshold of, for example, 22 clicks on the UI object, the size of the UI object is increased. Further, a display tone associated with the UI object is also increased as shown in block 1110. The display tone may include, for example, the color, brightness and contrast of the UI object. For instance, an intensity of color of the UI object is increased so that the UI object stands out, or its prominence is increased in the display unit, so that the user can select the UI object with ease.
[0046] Whereas if the frequency of selection is below the predefined selection threshold, then the size of the UI object is decreased as indicated by block 1112. For instance, if the frequency of selection of another UI object is less than 22 clicks, then the size of this UI object is reduced. The display tone associated with the UI object is also reduced as shown in block 1114. The display tone may include, for example, the color, brightness and contrast of the UI object. For instance, an intensity of color of the UI object is decreased so that the prominence of the UI object is reduced in the display unit. This reduced intensity of color indicates that the UI object is not used, or is rarely used, by the user.
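The two branches of blocks 1106 through 1114 (increase size and tone above the threshold, decrease both below it) can be sketched as follows. This is an illustrative sketch only: the scaling factors, the 0 to 1 tone range, and the dictionary representation of a UI object are assumptions, not details from the specification.

```python
def modify_display_features(ui_object, clicks, threshold=22,
                            size_step=1.25, tone_step=1.25):
    """Adjust a UI object's size and display tone based on click frequency.

    ui_object is a mutable mapping with "size" (pixels) and "tone"
    (color intensity in [0, 1]); both representations are hypothetical.
    """
    if clicks > threshold:
        # Blocks 1108/1110: enlarge and intensify a frequently used object
        ui_object["size"] *= size_step
        ui_object["tone"] = min(1.0, ui_object["tone"] * tone_step)
    else:
        # Blocks 1112/1114: shrink and subdue a rarely used object
        ui_object["size"] /= size_step
        ui_object["tone"] = max(0.0, ui_object["tone"] / tone_step)
    return ui_object

frequent = modify_display_features({"size": 100.0, "tone": 0.5}, clicks=30)
rare = modify_display_features({"size": 100.0, "tone": 0.5}, clicks=5)
```

Clamping the tone keeps repeated adjustments from overflowing the displayable range, a detail the flow chart leaves open.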
[0047] While determining the usage pattern, an order of selection or usage of the UI objects is also identified. Based on the order of selection, the UI objects are sorted and displayed in the display unit at block 1116. The usage pattern may vary across users, or even for the same user, so the medical imaging system is configured to determine any change in the usage pattern and accordingly reconfigure or rearrange the UI objects in a main menu or in any of the sub-menus.
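The sorting at block 1116 can be illustrated with a short sketch. Sorting by the order of first selection, with never-selected objects left at the end, is one plausible reading of the block; the function and its tie-breaking behavior are assumptions for illustration.

```python
def sort_by_selection_order(ui_objects, selection_log):
    """Sort UI objects by the order in which they were first selected.

    Objects absent from the log sort after all selected ones; Python's
    stable sort preserves their original relative order.
    """
    first_use = {}
    for position, name in enumerate(selection_log):
        first_use.setdefault(name, position)  # keep earliest occurrence
    sentinel = len(selection_log)  # ranks unused objects last
    return sorted(ui_objects, key=lambda name: first_use.get(name, sentinel))

arranged = sort_by_selection_order(
    ["map", "enhance", "line_filter"],
    ["enhance", "line_filter", "enhance"])
```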
[0048] The various embodiments and/or components, for example, the modules, or components and controllers therein, also may be implemented as part of one or more computers or processors. The computer or processor may include a computing device, an input device, a display unit and an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and Read Only Memory (ROM). The computer or processor further may include a storage device, which may be a hard disk drive or a removable storage drive such as a floppy disk drive, optical disk drive, and the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
[0049] As used herein, the term "computer" or "module" may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of the term "computer".
[0050] The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.
[0051] The methods described in conjunction with FIGs. 10 and 11 can be performed using a processor or any other processing device. The method steps can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium. The tangible computer readable medium may be, for example, a flash memory, a read-only memory (ROM), a random access memory (RAM), any other computer readable storage medium or any storage media. Although the method of arranging a plurality of user interface objects displayed in a medical imaging system is explained with reference to the flow charts of FIGs. 10 and 11, other ways of implementing the method can be employed. For example, the order of execution of the method steps may be changed, and/or some of the method steps described may be changed, eliminated, divided or combined. Further, the method steps may be executed sequentially or simultaneously for arranging the plurality of user interface objects displayed in the medical imaging system.
[0052] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any computing system or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
We Claim:
1. A computing system comprising:
a user interface for displaying a plurality of user interface objects accessible to a user; and
a processor communicably coupled to the user interface, wherein the processor is configured to:
identify a usage pattern of the plurality of user interface objects by the user;
modify at least one display feature of at least one user interface object of the plurality of user interface objects based on the usage pattern.
2. The computing system of claim 1, wherein the usage pattern comprises at least one of a frequency of selection of a user interface object and an order of selection of at least one user interface object of the plurality of user interface objects.
3. The computing system of claim 2, wherein the processor is further configured to:
determine the frequency of selection of each user interface object of the plurality of user interface objects by the user; and
vary the at least one display feature of the at least one user interface object based on a predefined selection threshold.
4. The computing system of claim 3, wherein the at least one display feature of a user interface object comprises at least one of a size of the user interface object and a display tone of the user interface object.
5. The computing system of claim 4, wherein the processor is further configured to:
increase the size of a user interface object if a frequency of selection of the user interface object is above the predefined selection threshold; and
decrease the size of a user interface object if a frequency of selection of the user interface object is below the predefined selection threshold.
6. The computing system of claim 4, wherein the processor is further configured to:
increase the display tone of a user interface object if a frequency of selection of the user interface object is above the predefined selection threshold; and
reduce the display tone of a user interface object if a frequency of selection of the user interface object is below the predefined selection threshold.
7. The computing system of claim 2, wherein the processor is further configured to:
determine the order of selection of at least one user interface object of the plurality of user interface objects; and
sort the plurality of user interface objects in the user interface based on the order of selection.
8. A medical imaging system for performing imaging procedures, the medical imaging
system comprising:
a display unit for displaying a plurality of user interface objects accessible to a user, wherein the user accesses at least one user interface object of the plurality of user interface objects for performing an imaging procedure; and
a processor communicably coupled to the display unit, wherein the processor is configured to:
identify a usage pattern of the plurality of user interface objects by the user;
modify at least one display feature of at least one user interface object of the plurality of user interface objects based on the usage pattern.
9. The medical imaging system of claim 8, wherein the medical imaging system is a
portable medical imaging system.
10. The medical imaging system of claim 8, wherein the usage pattern comprises at least one of a frequency of selection of a user interface object and an order of selection of at least one user interface object of the plurality of user interface objects, the usage pattern is associated with an imaging procedure.
11. The medical imaging system of claim 10, wherein the processor is further configured to:
determine the frequency of selection of each user interface object of the plurality of user interface objects by the user; and
vary the at least one display feature of the at least one user interface object based on a predefined selection threshold, wherein the at least one display feature of a user interface object comprises at least one of a size of the user interface object and a display tone of the user interface object.
12. The medical imaging system of claim 11, wherein the processor is further configured to:
increase a size of at least one user interface object if a frequency of selection of the user interface object is above the predefined selection threshold; and
decrease a size of at least one user interface object if a frequency of selection of the user interface object is below the predefined selection threshold.
13. The medical imaging system of claim 11, wherein the processor is further configured to:
increase the display tone of a user interface object if a frequency of selection of the user interface object is above the predefined selection threshold; and
reduce the display tone of a user interface object if a frequency of selection of the user interface object is below the predefined selection threshold.
14. The medical imaging system of claim 10, wherein the processor is further configured to:
determine the order of selection of at least one user interface object of the plurality of user interface objects; and
sort the plurality of user interface objects in the user interface based on the order of selection.
15. A method of arranging a plurality of user interface objects displayed in a medical
imaging system, the method comprising:
displaying a plurality of user interface objects accessible to a user in a display unit of the medical imaging system, wherein the user accesses at least one user interface object of the plurality of user interface objects for performing an imaging procedure;
identifying a usage pattern of the plurality of user interface objects based on user input; and
modifying at least one display feature of at least one user interface object of the plurality of user interface objects based on the usage pattern.
16. The method of claim 15, wherein the usage pattern comprises at least one of a frequency of selection of a user interface object and an order of selection of at least one user interface object of the plurality of user interface objects, the usage pattern is associated with an imaging procedure.
17. The method of claim 16, wherein identifying the usage pattern comprises determining the frequency of selection of each user interface object of the plurality of user interface objects by the user.
18. The method of claim 17, wherein modifying the at least one display feature of the at least one user interface object comprises varying the at least one display feature of the at least one user interface object based on a predefined selection threshold, wherein the at least one display feature of a user interface object comprises at least one of a size of the user interface object and a display tone of the user interface object.
19. The method of claim 18, wherein varying the at least one display feature of the at least one user interface object comprises:
increasing a size of at least one user interface object if a frequency of selection of the user interface object is above the predefined selection threshold; and
decreasing a size of at least one user interface object if a frequency of selection of the user interface object is below the predefined selection threshold.
20. The method of claim 19, wherein varying the at least one display feature of the at
least one user interface object further comprises:
increasing the display tone of a user interface object if a frequency of selection of the user interface object is above the predefined selection threshold; and
reducing the display tone of a user interface object if a frequency of selection of the user interface object is below the predefined selection threshold.
21. The method of claim 16, further comprising:
determining the order of selection of at least one user interface object of the plurality of user interface objects by the user for performing the imaging procedure; and
sorting the plurality of user interface objects in the user interface based on the order of selection.