Abstract: A control system for controlling operations of a video x-ray device capable of capturing medical images of a subject is disclosed. The control system comprises a touch input display and a computing device communicably connected to the touch input display. The computing device is configured to receive user inputs through the touch input display, and perform a plurality of operations in the video x-ray device with respect to a subject based on the user inputs received through the touch input display. FIG. 3
TECHNICAL FIELD
[0001] The subject matter disclosed herein relates to video x-ray devices. More specifically, the subject matter relates to a touch input based system for controlling the operations of a video x-ray device.
BACKGROUND OF THE INVENTION
[0002] X-ray imaging systems such as fluoroscopy systems facilitate the acquisition of X-ray images of a patient's anatomy (or the internal features of any animate or inanimate subject) that may be viewed as a video during an imaging operation. Such fluoroscopy systems are utilized in various applications for viewing of patient anatomy during a variety of medical procedures, such as angiographic procedures, gastrointestinal procedures, cardiology procedures, and so forth. Depending on the clinical application, a radiologist typically selects an operating mode that affects multiple system parameters involved in the setup of the imaging system. For example, the selected mode may affect the X-ray dose to which the patient is exposed, the image quality obtained on the display, and so forth.
[0003] Imaging using fluoroscopy systems requires presets to be completed in the system by a radiologist. The presets include positioning the table holding the patient at the right height and position, setting the collimator position, setting the imaging protocol, and so on. The controls for adjusting the table position, the collimator position, the selection of the imaging protocol, the field of view (FOV) setting, the magnification factor, the image rotation, the window width, the window brightness and contrast level, record, spot, cine, and so on are hardware-based input devices. Operating these hardware-based input devices may become difficult and cumbersome because a display device may be present to display the video x-ray image with all of these hardware input devices arranged as a single unit. The fluoroscopy system operates in different imaging contexts with different settings, which need to be modified using the various hardware-based input devices. As the settings need to be changed manually, there is a likelihood that unnecessary or unwanted controls not related to a particular imaging context may be activated by the radiologist, causing errors in the settings. These errors may result in capturing incorrect x-ray images of the subject's anatomy.
[0004] Thus, there is a need for improved control systems for operating and controlling the functions in video x-ray devices with an enhanced user experience.
BRIEF DESCRIPTION OF THE INVENTION
[0005] The above-mentioned shortcomings, disadvantages and problems are addressed herein which will be understood by reading and understanding the following specification.
[0006] As discussed in detail below, embodiments of the invention include a control system for controlling operations of a video x-ray device capable of capturing medical images of a subject. The control system comprises a touch input display and a computing device communicably connected to the touch input display. The computing device is configured to receive user inputs through the touch input display, and perform a plurality of operations in the video x-ray device with respect to a subject based on the user inputs received through the touch input display.
[0007] In another embodiment, a video x-ray device including an image acquisition unit, a touch input display and a computing device is disclosed. The image acquisition unit acquires medical images of a subject. The computing device is communicably connected to the touch input display and the image acquisition unit. The computing device is configured to receive user inputs through the touch input display, and perform a plurality of operations in the video x-ray device with respect to a subject based on the user inputs received through the touch input display.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] FIGURE 1 illustrates an X-ray imaging system for performing an imaging procedure on a subject;
[0009] FIGURE 2 illustrates an exemplary fluoroscopy system configured to continuously image the internal features of a subject, such as anatomy of a human subject or patient in a medical or screening context, throughout an imaging operation;
[0010] FIGURE 3 is a block diagram illustrating a control system for controlling operations of a video x-ray device capable of capturing medical images of a subject in accordance with an embodiment;
[0011] FIGURE 4 schematically illustrates touch input display of the video x-ray device in accordance with an exemplary embodiment;
[0012] FIGURE 5 schematically illustrates the touch input display presenting an image area and multiple UI elements when the video x-ray device is in a review mode in accordance with an exemplary embodiment;
[0013] FIGURE 6 schematically illustrates the touch input display presenting an image area and multiple UI elements when the video x-ray device is in an exposure mode in accordance with an exemplary embodiment;
[0014] FIGURE 7 schematically illustrates the touch input display presenting a layout of different user interface elements in accordance with an exemplary embodiment;
[0015] FIGURE 8 schematically illustrates the touch input display presenting the medical image in a zoomed form in accordance with an exemplary embodiment;
[0016] FIGURE 9 is a schematic illustration of the touch input display presenting an image zooming element for zooming in and out of the medical image in accordance with an exemplary embodiment;
[0017] FIGURE 10 is a schematic illustration of the touch input display presenting a person selecting a region of interest in the medical image in accordance with an exemplary embodiment; and
[0018] FIGURE 11 schematically illustrates a video x-ray device communicating with a handheld device over a wireless network in accordance with an exemplary embodiment.
DETAILED DESCRIPTION OF THE INVENTION
[0019] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken as limiting the scope of the invention.
[0020] To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. One or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or random access memory, hard disk, or the like) or multiple pieces of hardware. Similarly, the programs may be standalone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
[0021] As used herein, an element or step recited in the singular and preceded by the word "a" or "an" should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to "one
embodiment" of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising" or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.
[0022] A control system for controlling operations of a video x-ray device capable of capturing medical images of a subject is disclosed. The control system comprises a touch input display and a computing device communicably connected to the touch input display. The computing device is configured to receive user inputs through the touch input display, and perform a plurality of operations in the video x-ray device with respect to a subject based on the user inputs received through the touch input display.
[0023] In another embodiment, a video x-ray device including an image acquisition unit, a touch input display and a computing device is disclosed. The image acquisition unit acquires medical images of a subject. The computing device is communicably connected to the touch input display and the image acquisition unit. The computing device is configured to receive user inputs through the touch input display, and perform a plurality of operations in the video x-ray device with respect to a subject based on the user inputs received through the touch input display.
[0024] Various embodiments of the invention provide an X-ray imaging system as shown in FIG. 1. Set forth below is a description of one type of X-ray imaging system 100. System 100 is described herein as an example only, as explained above. More specifically, and referring to FIG. 1, imaging system 100 is shown as including a base 102 and a positioning arm 104. The base 102 extends from a portable platform 106 having a plurality of wheels 108 so that base 102 is movable relative to an object or patient 110 to be imaged. Rather than wheels, other position altering devices can be utilized, such as a pivot that allows arm 104 to tilt and rotate.
[0025] The arm 104 includes a first end portion 112 and a second end portion 114. More specifically, the arm 104 rotates relative to the base 102 about an axis of rotation
and moves relative to the base 102 to alter the respective distances between the arm first end portion 112 and base 102 and the arm second end portion 114 and base 102. An x-ray source assembly 116 is movably coupled to arm first end portion 112. The x-ray source assembly 116 includes an X-ray source 118 configured to emit x-rays.
[0026] A detector assembly 120 is movably coupled to the arm second end portion 114. The detector assembly 120 includes a detector 122 and is configured to receive the X-rays from the x-ray source 118 to generate an image of the object. By moving the arm 104 relative to the base 102, the position of the source assembly 116 may be altered so that the source assembly 116 is moved toward or away from a table 124. Altering the position of the source assembly 116 alters the position of the detector assembly 120 relative to the base 102 in an opposite direction.
[0027] The detector 122, in one embodiment, is formed by a plurality of detector elements 126 which together sense the projected x-rays that pass through the object to collect image data. In the example embodiment, the detector 122 is a flat panel, an image intensifier, or film. In one embodiment, the detector 122 is a solid state detector or radiation imager comprising a large flat panel imaging device having a plurality of pixels 126 arranged in rows and columns. The detector 122, of course, need not be a digital detector such as a flat panel detector and could be one of many different types of known detectors.
[0028] Regarding the detector 122, each pixel 126 includes a photosensor (not shown), such as a photodiode, that is coupled via a switching transistor (not shown) to two separate address lines, a scan line and a data line. In each row of pixels, each respective switching transistor (typically a thin film field effect transistor (FET)) is coupled to a common scan line through that transistor's gate electrode. In each column of pixels, the readout electrode of the transistor (e.g., the source electrode of the FET) is coupled to a data line, which in turn is selectively coupled to a readout amplifier.
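The row-by-row addressing described above can be sketched as follows. This is a simplified illustration only; the function and variable names are assumptions for illustration and are not part of the specification. It models how asserting one scan line switches on every FET in that row, so that each column's data line carries exactly one pixel's charge to its readout amplifier:

```python
def read_out_panel(charge):
    """Sample a 2-D list [row][col] of accumulated photodiode charges.

    One scan line is asserted per row; each column's data line then
    carries that single pixel's charge to the readout amplifier.
    """
    image = []
    for row in range(len(charge)):
        # Assert the scan line for this row: all FETs in the row conduct.
        row_samples = []
        for col in range(len(charge[row])):
            # The data line for this column now sees only this pixel's
            # charge; the readout amplifier digitizes it.
            row_samples.append(charge[row][col])
        image.append(row_samples)
        # De-assert the scan line before addressing the next row.
    return image

panel = [[5, 7], [2, 9]]            # illustrative charge values only
print(read_out_panel(panel))        # one sampled value per pixel, row by row
```

The nested loop mirrors the hardware sequence: rows are scanned one at a time, and within each scan every column is sampled in parallel by its own amplifier.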
[0029] During nominal operation, X-ray beams passing through the patient 110 are incident on the detector 122 (i.e. an imaging array). The radiation is incident on a
scintillator material and the pixel photosensors measure (by way of change in the charge across the diode) the amount of light generated by X-ray interaction with the scintillator. As a result, each detector element, or pixel 126 produces an electrical signal that represents the intensity of an impinging X-ray beam and hence the attenuation of the beam as it passes through the object. During a scan to acquire X-ray projection data in one mode defined as a CT volume rotation mode, the detector assembly 120 and the source assembly 116 are rotated about the object.
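As a hedged aside not drawn from the specification: the relationship between the measured pixel intensity and the attenuation of the beam is commonly modeled by the Beer-Lambert law, I = I0·exp(−μt), so the attenuation line integral can be recovered as μt = ln(I0/I). The numeric values below are made up purely for illustration:

```python
import math

def line_integral(i0, i):
    """Recover the attenuation line integral mu*t from beam intensities."""
    return math.log(i0 / i)

i0 = 1000.0                   # unattenuated beam intensity at the detector
i = i0 * math.exp(-2.3)       # intensity after passing through the object
print(round(line_integral(i0, i), 3))  # recovers the assumed mu*t of 2.3
```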
[0030] The system 100 also includes the table 124 for supporting the patient 110. To generate an image of the patient 110, the arm 104 is rotated so that the source assembly 116 and the detector assembly 120 rotate about the patient 110. More specifically, the arm 104 is rotatably coupled to the base 102 so that the detector 112 and the source 118 are rotated about the object 110.
[0031] Movement of the arm 104 and the operation of the x-ray source assembly 116 and the detector assembly 120 are governed by a control mechanism 128 of the system 100. The controller, or control mechanism 128, includes an X-ray controller 130 that provides power and timing signals to the x-ray source 118 and a motor controller (motor controls) 132 that controls the position of the arm 104, the source assembly 116 and the detector assembly 120.
[0032] A data acquisition system (DAS) 134 in the control mechanism 128 samples data from the detector 122 for subsequent processing. An image processor/reconstructor 136 (the term reconstructor as used herein includes reconstructors as are known in the computed tomography art, as well as processors for processing data collected in a scan, i.e., not limited to computed tomography image reconstructors) receives sampled x-ray data from the DAS 134 and performs image processing/reconstruction. The image is applied as an input to a computer 138 which stores the image in a mass storage device 140. Although not shown, a laptop computer can interface to the computer 138, and images, data, and commands can be communicated between the computer 138 and the laptop computer. As explained above,
the touch input interface described herein is not limited to practice with X-ray systems and can be utilized in connection with many other medical imaging modalities.
[0033] The computer 138 also receives commands and scanning parameters from an operator via a console 142 that has a keyboard. An associated cathode ray tube or LCD display 144 allows the operator to observe the image and other data from the computer 138. The operator-supplied commands and parameters are used by the computer 138 to provide control signals and information to the DAS 134, the x-ray controller 130 and the motor controller 132. The computer 138 operates a table motor controller 68 which controls the position of the motorized table 146 relative to the system 100.
[0034] FIG. 2 illustrates an exemplary fluoroscopy system 200 (which may also be referred to as a video x-ray device in this disclosure) configured to continuously image the internal features of a subject, such as the anatomy of a human subject or patient 202 in a medical or screening context, throughout an imaging operation. The illustrated fluoroscopy system 200 includes an X-ray tube 204 with a collimator 206, a port 208, and filters (not shown), a table 210 on which the patient is positioned, an imaging console 212, an image intensifier 214, a camera 216, and a monitor 218. The imaging console 212 includes a user interface 220 including a first control panel 222 and a second control panel 224. The first control panel 222 includes a display 226 and a plurality of configurable adjustments 228. The second control panel 224 includes a display 230 and configurable adjustments 232 and 234, which are configured to increase or decrease a parameter value, respectively. The monitor 218 also includes a display 236 configured to display a sequence of images to an operator during the imaging operation.
[0035] During operation, the X-ray source 204 generates an X-ray beam, for example, via a conventional cathode and anode X-ray production system. In some embodiments, the X-ray beam may be filtered to provide the desired energy spectrum before reaching the collimator 206. To that end, some embodiments may include one or more desired filters such as energy based filters (e.g., aluminum), equalization filters (e.g., trough filters, bow-tie filters, wedge filters, etc.), and so forth. Further, the size and shape of the X-ray beam is adjusted by the collimator 206 before emerging from the port
208. After emerging from the port 208, the X-ray beam passes through the table 210 and the patient 202 positioned thereon.
[0036] The X-ray beam is attenuated by the patient's anatomy, and at least a portion of the attenuated beam is detected by a highly sensitive detector of the image intensifier 214 mounted to the imaging console 212. The image intensifier 214 is adapted to produce a projection image of an acceptable quality from a low number of X-ray photons. Such a feature may be advantageous in fluoroscopy systems since continuous imaging throughout the imaging operation may expose the patient to substantial quantities of X-ray energy. The output signals from the image intensifier 214 are continuously transferred via the video camera 216 to the monitor 218 for viewing on the display 236 during the imaging operation.
[0037] It should be noted that, while the present disclosure refers to the use of the fluoroscopy system in a medical diagnostic context, the system may be used in different contexts as well. For example, with human subjects, the system may be used for screening and similar applications. In other environments, the system may be used for detection of items in parcels, luggage, transport vehicles, and so forth. Still further, in some embodiments, such fluoroscopy imaging systems may be utilized for inspection of industrial parts, such as pipes or wind blades.
[0038] During use, as the fluoroscopy imaging operation is occurring, the operator may utilize one or more of the configurable adjustments 228, 232, and 234 to dynamically adjust one or more parameters of the imaging operation. Further, the system controller may automatically adjust and/or the operator may adjust one or more of a range, a step size, and a default value of each of the configurable adjustments throughout the fluoroscopy operation. For example, in one embodiment, the configurable adjustments 228 may be adapted to allow an operator to control at least one of a noise reduction level, a contrast level in the displayed video, a brightness of the displayed video, edge enhancement in the displayed images, and so forth. For further example, a single configurable adjustment provided for the operator to adjust the contrast level may be automatically configurable by a controller and/or configurable by the operator to
determine the range, step size, and default value associated with that configurable adjustment. As such, the system may automatically alter and/or the operator may alter the range, step size, and default value of the contrast level adjustment during the imaging procedure as desired. Such a feature may offer distinct advantages over systems in which such values are predetermined during system setup and remain fixed throughout the imaging operation. For instance, by enabling the automatic and/or manual configurability of the adjustments, embodiments of the present invention may facilitate the dynamic adjustment and optimization of the video displayed on the monitor 218 during the imaging operation.
[0039] Further, it should be noted that in some embodiments, the functionality of the configurable adjustments 228, 232, and 234 may be determined prior to and/or during the imaging operation. The functionality of such adjustments may be dynamically adjustable and, thus, adapted to change during the operation automatically and/or upon input from the operator. For example, in the illustrated embodiment, the user may utilize the adjustments 232 and 234 to increase and/or decrease image contrast during a first portion of the imaging operation and may further utilize the same adjustments 232 and 234 to increase and/or decrease image brightness during a second portion of the imaging operation. As such, the display 236 may be adapted to display the parameter that the configurable adjustments 232 and 234 are configured to adjust at any given time during the imaging operation.
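The dynamically remappable adjustment described above can be sketched as follows. This is an illustrative sketch only; the class name, parameter names, and numeric ranges are assumptions for illustration and are not drawn from the specification:

```python
class ConfigurableAdjustment:
    """A physical control whose range, step size, default value, and even
    the parameter it adjusts can be remapped during the imaging operation."""

    def __init__(self, parameter, lo, hi, step, default):
        self.remap(parameter, lo, hi, step, default)

    def remap(self, parameter, lo, hi, step, default):
        # Re-target the same knob or button pair to a new parameter.
        self.parameter = parameter
        self.lo, self.hi, self.step = lo, hi, step
        self.value = default

    def increase(self):
        # Clamp to the configured range.
        self.value = min(self.hi, self.value + self.step)

    def decrease(self):
        self.value = max(self.lo, self.value - self.step)

# First portion of the operation: the buttons adjust contrast ...
knob = ConfigurableAdjustment("contrast", lo=0, hi=100, step=5, default=50)
knob.increase()
print(knob.parameter, knob.value)   # contrast 55

# ... second portion: the same buttons are remapped to brightness.
knob.remap("brightness", lo=0, hi=255, step=16, default=128)
knob.decrease()
print(knob.parameter, knob.value)   # brightness 112
```

The `remap` call models the automatic and/or operator-driven reconfiguration: the same hardware control changes its function, range, step size, and default mid-operation, and the display would show which parameter the control currently adjusts.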
[0040] The illustrated configurable adjustments include knobs 228 and buttons 232 and 234. However, it should be noted that in other embodiments, the configurable adjustments are operated manually by a user and may include, but are not limited to, joysticks, track balls, knobs, buttons, switches, panels, and so forth. Indeed, the configurable adjustments may be any suitable device coupled to any part of the fluoroscopy system in which at least one of the functionality, the step size, the range, and the default value may be set by an operator before or during the imaging operation. Further, in some embodiments, the configurable adjustments may not be associated with a display (e.g., 226 or 230) and may not be located on a control panel of the console 212.
[0041] It should be noted that the fluoroscopy system of FIG. 2 is described for use as a continuous fluoroscopy system configured to continuously generate an X-ray beam, for example between approximately 0.5 and 5 mA. In such systems, the video camera 216 may be configured to display the generated projection images at any suitable rate (e.g., approximately 30 frames/second). However, it should be noted that the fluoroscopy system may be any desired type of fluoroscopy system, such as a high dose rate fluoroscopy system (e.g., specially activated fluoroscopy), pulsed fluoroscopy (e.g., variable frame rate pulsed fluoroscopy), and so forth. Indeed, the configurable adjustments of the fluoroscopy system of FIG. 2 may be utilized in any suitable fluoroscopy imaging system.
[0042] FIG. 3 illustrates a control system 300 for controlling operations of a video x-ray device 302 capable of capturing medical images of a subject in accordance with an embodiment. The control system 300 is communicably connected to the video x-ray device 302 for performing various functions. The control system 300 includes a touch input display 304 and a computing device 306 communicably connected to the touch input display 304. The video x-ray device 302 captures one or more medical images of the subject positioned on a table (such as the table 210). The touch input display 304 presents medical images of the subject. The touch input display 304 includes user interface (UI) elements for performing multiple operations on the video x-ray device 302. The multiple operations include, but are not limited to, managing multiple imaging controls for image acquisition (for example, configurable adjustments), controlling operation of an image acquisition unit of a video x-ray device, and controlling the position of a table holding the subject. The multiple imaging controls may include, for example, field of view (FOV), magnification factor, image display parameters (such as contrast level, brightness, noise reduction level, edge enhancement and so forth), image window parameters, image orientation, collimator position, region of interest (ROI), exposure type, dosage level, imaging protocol, image zooming and recording of the image. Further, controlling operation of the image acquisition unit may include varying the position and orientation of the image acquisition unit of the video x-ray device 302.
[0043] As illustrated in FIG. 4, the touch input display 304 includes an image area 308 (also called a view port) for displaying a live video x-ray image cine acquired from the subject. The live video x-ray image cine includes multiple medical images of an internal organ of the subject, for example a medical image 310. The image acquisition unit of the video x-ray device 302 captures these medical images. The touch input display 304 presents multiple UI elements such as a record element 312, a region of interest (ROI) element 314, a brightness element 316, a contrast element 318, an x-ray on/off element 320, a time element 322, a vignette element 324, a magnification element 326, a field of view (FOV) element 328, and a dose level element 330. However, it may be noted that all of these UI elements may not be shown together in the touch input display 304 at the same point in time, and some UI elements may not be presented. The record element 312 may be operated by the user (who may be a technician or a medical expert) for recording medical images captured by the video x-ray device 302 during an imaging procedure, or when the video x-ray device 302 is in an image exposure mode or image acquisition mode. When the video x-ray device 302 is in the image exposure mode, the user may need to identify a ROI on the subject for capturing a medical image. For instance, a ROI may be a particular portion of the lungs of the subject. The ROI is selected and marked using the ROI element 314. The user can control display parameters of a medical image using the brightness element 316 and the contrast element 318. The x-ray on/off element 320 can be used to switch the video x-ray device 302 on and off. The time element 322 is used for providing information associated with the amount of dose accumulated by the patient.
[0044] The vignette element 324 is used to present multiple medical images captured prior to the medical images currently captured by the video x-ray device 302. In an embodiment, the vignette element 324 may present the medical images in the form of thumbnails. The user can access these thumbnails to view the prior medical images. The magnification element 326 can be used by the user to magnify a particular portion of an internal organ of the subject and capture a medical image of the internal organ. In another embodiment, the magnification element 326 may indicate a magnification factor or a magnification level associated with a medical image captured by the video x-ray device 302. The FOV element 328 is used by the user to set a FOV for acquiring the medical image. Further, during a video x-ray procedure, the subject is provided with a contrast agent. The dosage level of the contrast agent can be determined or checked using the dose level element 330.
[0045] The multiple UI elements presented in the touch input display 304 vary based on an imaging context of the video x-ray device 302. The imaging context may be, for example, a review mode, an exposure/acquisition mode, a pic mode, a spot mode, a scout x-ray mode, and so on. The pic and spot modes are used by the radiographer or user to capture snapshots of a particular region of the anatomy which has diagnostic value. Based on the imaging context set in the video x-ray device 302, the UI elements presented in the touch input display 304 vary. In this process, when an imaging context is set, one or more UI elements associated with the imaging context are selected from the multiple UI elements and presented in the touch input display 304. Considering an example, when the video x-ray device 302 is set in the review mode, only the vignette element 324 may be presented in the touch input display 304; in the other modes the vignette element 324 will not be presented. When the video x-ray device 302 is set in the exposure or acquisition mode, the x-ray on/off element 320 is presented in the touch input display 304; the x-ray on/off element 320 may not be presented in the review mode. Thus the multiple UI elements presented in a layout 332 may vary based on the imaging context set in the video x-ray device 302. This dynamic change in the presented UI elements avoids accidental activation of UI elements that are not related to a current imaging context of the video x-ray device 302. FIG. 5 and FIG. 6 illustrate the touch input display 304 presenting the image area 308 and the multiple UI elements in the review mode and the exposure mode respectively, in accordance with exemplary embodiments. It may be noted that, solely for convenience of representation, not all UI elements relevant to a particular exposure mode are illustrated in FIG. 5 and FIG. 6; it may be appreciated that all UI elements associated with a particular exposure mode will be presented in the touch input display 304.
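The context-dependent layout described above can be sketched as a simple mapping from imaging context to the UI elements selected for presentation, so that controls unrelated to the current context are never shown and cannot be accidentally activated. The context names and element lists below are illustrative assumptions, not an exhaustive rendering of the specification:

```python
# Hypothetical registry: which UI elements belong to which imaging context.
CONTEXT_ELEMENTS = {
    "review":   ["vignette", "brightness", "contrast"],
    "exposure": ["xray_on_off", "record", "roi", "fov", "dose_level"],
    "spot":     ["xray_on_off", "roi", "magnification"],
}

def layout_for(context):
    """Return the UI elements to present for the current imaging context."""
    return CONTEXT_ELEMENTS[context]

print(layout_for("review"))                  # the vignette element appears ...
print("xray_on_off" in layout_for("review")) # ... but x-ray on/off does not
```

Selecting the layout from such a registry, rather than showing every control at once, is the software analogue of removing the unrelated hardware controls from the operator's reach in a given mode.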
[0046] FIG. 7 illustrates the touch input display 304 presenting a layout of different user interface elements in accordance with an exemplary embodiment. The medical image 310 presented in the image area 308 may be captured when the image acquisition unit is positioned at a particular position and orientation with respect to the subject. The user can use his/her hand 700 to move or pan in the image area for re-orienting or repositioning the image acquisition unit with respect to the subject. For instance, the hand 700 can be moved, touching the image area 308, in the left, right, up and down directions for moving the image acquisition unit in these directions. The orientation and position information of the image acquisition unit is presented in a window 702 in the touch input display 304 according to an exemplary embodiment. In other embodiments, it may be appreciated that the orientation and position information of the image acquisition unit may be presented in different forms.
[0047] In another embodiment, the touch input display 304 may present axes X and Y for changing the position and orientation of the image acquisition unit. The hand 700 may be used by the user to move along the axes X and Y for varying the position of the image acquisition unit with respect to the subject. Here, a UI element 704 such as a marker may be placed on the axis X, which can be moved along the axis X for changing the position of the image acquisition unit in the X direction. Similarly, a UI element 706 such as a marker may be placed on the axis Y, which can be moved along the axis Y for changing the position of the image acquisition unit in the Y direction. Even though only the X and Y axes are shown in FIG. 7, it may be appreciated that other axes such as the Z axis may also be presented in the touch input display 304 for varying the orientation and position of the image acquisition unit in the Z direction according to other embodiments. In an embodiment, the window 702 may display the position of the image acquisition unit (for example, a collimator position) in terms of measurements in each axis. For instance, the measurement may be given in "centimeters" in each axis with the subject as the reference point. Thus the distance between the image acquisition unit and the subject in the X, Y and Z axes may be presented. This information can enable the user to evaluate the distance of the image acquisition unit and accordingly modify the position. Further, it may be envisioned that the axes X, Y and Z may be presented in any other form in the image area, or as a separate UI element in the touch input display 304, in accordance with other exemplary embodiments.
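The marker-drag behavior described above can be sketched as a translation from an on-screen drag to an updated position reported in centimeters with the subject as the reference point. The function name, the pixels-to-centimeters scale, and the coordinate values are illustrative assumptions only:

```python
def move_along_axis(position_cm, axis, marker_delta_px, cm_per_px=0.1):
    """Return an updated {x, y, z} position (cm) after a marker drag.

    marker_delta_px is how far the axis marker was dragged on screen;
    cm_per_px is an assumed screen-to-physical scale factor.
    """
    pos = dict(position_cm)          # leave the caller's dict untouched
    pos[axis] += marker_delta_px * cm_per_px
    return pos

# Assumed starting distances from the subject, in centimeters.
position = {"x": 0.0, "y": 12.0, "z": 40.0}
position = move_along_axis(position, "x", marker_delta_px=50)
print(position)   # x advanced by about 5 cm; y and z unchanged
```

Displaying the resulting per-axis distances in the window 702 lets the user evaluate the separation between the image acquisition unit and the subject before committing to a move.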
[0048] Turning now to FIG. 8, the touch input display 304 presents the medical image 310 in a zoomed form in accordance with an embodiment. The medical image 310 can be zoomed in and zoomed out using hand gestures of the user in the image area 308 (i.e. the view port). For example, as illustrated, the user's hand 400 can be used to pinch on the image area 308 to zoom in and zoom out of the medical image 310.
[0049] In another embodiment an image zooming element 900 (i.e. a UI element) may be displayed in the image area 308 as illustrated in FIG. 9. The image zooming element 900 includes gauges 902 for zooming the medical image 310. A cursor 904 can be moved along these gauges 902 for zooming in and out of the medical image 310. The user can use the hand 400 to move the cursor 904 in the upward, i.e. +ve, direction to zoom in or magnify the medical image 310. Further, the cursor 904 can be moved in the downward, i.e. -ve, direction to zoom out of the medical image 310. The image zooming element 900 may be displayed only in response to a tap gesture using the hand 400 on the image area 308. In another embodiment an image zooming element (e.g. the image zooming element 900) may be positioned outside the image area 308 as a UI element in the touch input display 304. It may be envisioned that other types of hand gestures can be used to perform the function of zooming in and out of the medical image 310 in other exemplary embodiments. Further, in other embodiments the medical image 310 can be zoomed in and zoomed out using other gestures such as, but not limited to, face gestures, eye gestures, and hand gestures made without touching the image area 308.
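The two zoom inputs described in paragraphs [0048] and [0049] — a pinch gesture and a gauge cursor — could both be reduced to a zoom factor as sketched below. The function names, zoom limits and linear gauge mapping are assumptions for illustration; the disclosure does not specify them.

```python
# Hypothetical sketch of the two zoom inputs: a pinch gesture (ratio of
# finger separations) and the gauge cursor 904 moved in the +ve/-ve
# direction along gauges 902. Limits are illustrative placeholders.

MIN_ZOOM, MAX_ZOOM = 1.0, 8.0

def zoom_from_pinch(current_zoom, start_distance, end_distance):
    # Spreading the fingers (end > start) zooms in; pinching zooms out.
    factor = end_distance / start_distance
    return max(MIN_ZOOM, min(MAX_ZOOM, current_zoom * factor))

def zoom_from_gauge(cursor_position, gauge_length):
    # cursor_position in [0, gauge_length]; the top of the gauge (+ve
    # direction) gives maximum magnification.
    fraction = cursor_position / gauge_length
    return MIN_ZOOM + fraction * (MAX_ZOOM - MIN_ZOOM)
```

Either input path yields the same kind of clamped zoom level, so the rest of the display pipeline need not know which gesture produced it.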
[0050] Once a desired zoom level of the medical image 310 is achieved by the user, the user can select a ROI 1000 by tapping on the medical image 310 using the hand 400 as illustrated in FIG. 10 in accordance with an exemplary embodiment. The desired zoom level, once achieved, indicates that the field of view (FOV) desired by the user is set. The ROI 1000 is shown as a square element for convenience of representation, and hence it may be envisioned that the ROI can be selected in any other manner without deviating from the scope of this disclosure. The ROI 1000 may be varied by varying the size of the square element in an embodiment. The size of the square element may be varied using one or more UI elements provided in the touch input display 304 for increasing or decreasing the ROI 1000 according to another embodiment.
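A tap-placed, resizable square ROI as described in paragraph [0050] could be modeled as below; the class, the centimeter units and the minimum size are hypothetical and not taken from the disclosure.

```python
# Hypothetical sketch: a square ROI (cf. ROI 1000) placed by a tap at a
# point on the medical image and resized via UI elements on the touch
# input display. Units and the lower size bound are assumptions.

class SquareROI:
    def __init__(self, center_x, center_y, size_cm=10.0):
        self.center = (center_x, center_y)  # tap location on the image
        self.size_cm = size_cm              # side length of the square

    def resize(self, delta_cm):
        # Increase or decrease the ROI via +/- UI elements; never collapse
        # below a minimum usable size.
        self.size_cm = max(1.0, self.size_cm + delta_cm)
        return self.size_cm
```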
[0051] In an exemplary embodiment a preset set of ROIs may be provided in the touch input display 304 as one or more UI elements. The preset set of ROIs may be a commonly used standard set of ROIs. The user can select a ROI using the hand 400 to fix the ROI on the medical image 310. The preset sets of ROIs are different for various internal organs or portions of the subject. So once the image acquisition unit is focused on an internal organ or portion of the subject, the preset set of ROIs may be selected by the user from a menu (not shown in FIG. 10) provided in the touch input display 304.
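The organ-specific preset lookup in paragraph [0051] amounts to a keyed table of standard ROIs. A minimal sketch follows; the organ names and ROI dimensions are illustrative placeholders, not clinical values from the disclosure.

```python
# Hypothetical sketch: the menu of preset ROIs offered once the image
# acquisition unit is focused on a given organ or portion of the subject.

ROI_PRESETS = {
    # organ -> list of (width_cm, height_cm) preset ROIs, largest first
    "chest":   [(30, 30), (20, 20)],
    "abdomen": [(35, 35), (25, 25)],
    "wrist":   [(10, 10), (5, 5)],
}

def select_preset_roi(organ, index=0):
    # Returns the preset the user picked from the menu for that organ.
    presets = ROI_PRESETS.get(organ)
    if not presets:
        raise KeyError(f"no preset ROIs for organ: {organ}")
    return presets[index]
```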
[0052] FIG. 11 illustrates a handheld device 1100 embodied with a touch input display for controlling operations of a video x-ray device 1102 by a user 1104 in accordance with an exemplary embodiment. The user 1104 may be an x-ray technician, a lab technician, a radiologist and so on. The user 1104 may be located in another room near the location 1106 of the video x-ray device 1102. The handheld device 1100 may communicate wirelessly with the video x-ray device 1102. In this environment, a person (for example, an attender or a lab technician) may position a patient or subject 1108 on a table 1110. An image acquisition unit 1112 is positioned or oriented with respect to the subject 1108 so as to initiate the x-ray imaging procedure. A display 1114 is provided to present the imaging or scanning parameters to the person. The user 1104 may use the touch input display in the handheld device 1100 to control the image acquisition unit 1112 for performing the scanning procedure. Thus the user 1104 need not be present inside or at the location 1106 for performing the scanning procedure, and is thus not exposed to the x-rays.
[0053] In another scenario the user 1104 may be in a remote location 1116, such as another country or any part of the globe, and the video x-ray device 1102 may be in a hospital. The user 1104 uses the touch input display in the handheld device 1100 to control functions in the video x-ray device 1102. The handheld device 1100 communicates with the video x-ray device 1102 over a wireless network 1118. The wireless network 1118 may include, but is not limited to, the internet, a 3rd Generation (3G) communication network, a 4th Generation (4G) communication network, and a Long Term Evolution (4G LTE) communication network. Explaining by way of an example, a patient may be positioned in a video x-ray device installed in a hospital in a village area by a lab technician or attender. The image acquisition unit of the video x-ray device may then be oriented with respect to the patient by a radiologist located in a remote location, for example a city, using a wireless handheld device. In another embodiment an application platform can be accessed from any location over a wireless network to control the operation of the image acquisition unit, the table position and other functions of the video x-ray device. The application platform may be a web-based application, a remote server application and so on that provides a user interface displaying the captured x-ray image (in the form of a video) and UI elements for controlling the video x-ray device and the table.
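The remote-control scenario in paragraphs [0052] and [0053] implies some command protocol between the handheld device 1100 and the video x-ray device 1102 over the wireless network 1118. The disclosure does not specify a wire format; the JSON schema and command names below are assumptions made purely for illustration.

```python
# Hypothetical sketch of control messages between handheld device 1100
# and video x-ray device 1102. The "command"/"params" JSON shape and the
# example command names are illustrative assumptions only.

import json

def encode_command(command, **params):
    # Handheld side: serialize a control command (e.g. move the table,
    # reposition the image acquisition unit) for transmission.
    return json.dumps({"command": command, "params": params})

def decode_command(message):
    # Device side: parse a received command into its name and parameters.
    payload = json.loads(message)
    return payload["command"], payload["params"]
```

For example, a table-height adjustment made from the handheld device would round-trip as one such message, with the device applying the decoded parameters to its hardware controls.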
[0054] The various embodiments and/or components, for example, the modules, or components and controllers therein, also may be implemented as part of one or more computers or processors. The computer or processor may include a computing device, an input device, a display unit and an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and Read Only Memory (ROM). The computer or processor further may include a storage device, which may be a hard disk drive or a removable storage drive such as a floppy disk drive, optical disk drive, and the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
[0055] As used herein, the term "computer" or "module" may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of the term "computer".
[0056] The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.
[0057] The methods described herein can be performed using a processor or any other processing device. The method steps can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium. The tangible computer readable medium may be, for example, a flash memory, a read-only memory (ROM), a random access memory (RAM), any other computer readable storage medium or any storage media. Although the method of using a control system for controlling operations of a video x-ray device capable of capturing medical images of a subject is explained hereinabove, other ways of implementing the method can be employed. For example, the order of execution of the method steps may be changed, and/or some of the method steps described may be changed, eliminated, divided or combined. Further, the method steps may be executed sequentially or simultaneously.
[0058] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any computing system or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
We Claim:
1. A control system for controlling operations of a video x-ray device capable of capturing medical images of a subject, wherein the control system comprises:
a touch input display; and
a computing device communicably connected to the touch input display, wherein the computing device is configured to:
receive user inputs through the touch input display; and
perform a plurality of operations in the video x-ray device with respect to a subject based on the user inputs received through the touch input display.
2. The control system of claim 1, wherein the plurality of operations comprises managing a plurality of imaging controls for image acquisition, controlling operations of an image acquisition unit of the video x-ray device, and controlling position of a table holding the subject.
3. The control system of claim 2, wherein the plurality of imaging controls comprises field of view, magnification factor, image display parameters, image window parameters, image orientation, collimator position, region of interest, exposure type, dosage level, imaging protocol, image zooming and recording of image.
4. The control system of claim 1, wherein the touch input display is a handheld device.
5. The control system of claim 4, wherein the touch input display is in a remote location.
6. The control system of claim 1, wherein the touch input display presents a user interface, wherein the user interface comprises:
an image area for displaying a live video x-ray image cine acquired from the patient; and
a plurality of user interface (UI) elements for performing the plurality of operations in the video x-ray device.
7. The control system of claim 6, wherein the user interface comprises an image review area presenting at least one video x-ray image acquired prior to a video.
8. The control system of claim 6, wherein the computing device is further configured to:
select at least one user interface element of the plurality of user interface elements based on an imaging context of the video x-ray device; and
present the at least one user interface element in response to operation of the video x-ray device in the imaging context.
9. The control system of claim 6, wherein the computing device is configured to perform at least one operation of the plurality of operations in the video x-ray device in response to at least one gesture received from the user at the touch input display.
10. A video x-ray device comprising:
an image acquisition unit for acquiring medical images of a subject;
a touch input display; and
a computing device communicably connected to the touch input display and the image acquisition unit, wherein the computing device is configured to: receive user inputs through the touch input display; and perform a plurality of operations in the video x-ray device with respect to a subject based on the user inputs received through the touch input display.
11. The video x-ray device of claim 10, wherein the plurality of operations comprises managing a plurality of imaging controls for image acquisition, controlling operations of an image acquisition unit of the video x-ray device, and controlling position of a table holding the subject.
12. The video x-ray device of claim 11, wherein the plurality of imaging controls comprises field of view, magnification factor, image display parameters, image window parameters, image orientation, collimator position, region of interest, exposure type, dosage level, imaging protocol, image zooming and recording of image.
13. The video x-ray device of claim 11, wherein the touch input display presents a user interface, wherein the user interface comprises:
an image area for displaying a live video x-ray image cine acquired from the patient; and
a plurality of user interface (UI) elements for performing the plurality of operations in the video x-ray device.
14. The video x-ray device of claim 11, wherein the computing device is further configured to:
select at least one user interface element of the plurality of user interface elements based on an imaging context of the video x-ray device; and
present the at least one user interface element in response to operation of the video x-ray device in the imaging context.
15. The video x-ray device of claim 10, wherein the computing device is configured to perform at least one operation of the plurality of operations in the video x-ray device in response to at least one gesture received from the user at the touch input display.