Abstract: The present disclosure discloses a method and apparatus for imaging a wide field of view. The apparatus comprises a first sensor and a second sensor affixed to a first lens and a second lens, respectively, each positioned via linear guide pins on an angular shaped mounting bracket via two springs. The apparatus comprises a cable coupled at one end to a first mount base supporting the first lens and drawn towards a first part of an actuating module, and coupled at another end to a second mount base supporting the second lens via a second part of the actuating module. The actuating module is mounted on a linear shaft and is configured to rotate stepwise in a defined sequence to initiate movement of the linear shaft, where the linear shaft triggers movement of the first lens and the second lens that causes simultaneous focusing of an object placed before the first and second sensors. FIG. 1 is the reference figure.
Description: FIELD
[0001] The present invention relates to the field of imaging and, more particularly, to a method and system for achieving a wide field of view in open surgeries.
BACKGROUND
[0002] Various challenges exist in achieving an optimal field of view (FOV) in open surgeries for covering an entire surgical area. The use of a single lens with a higher back focal length leads to a smaller FOV. Hence, lenses with small back focal lengths, ranging from 6 to 8 millimeters, are usually preferred in surgeries. However, implementing such smaller lenses is challenging: due to their proximity to the camera sensors, the enclosure restricts access to focusing mechanisms.
[0003] The use of such lenses is however hindered due to restricted access in focusing within a camera probe enclosure. Further, the complexity arising by independently controlling multiple lenses with separate actuators results in increased size, power consumption and circuit intricacies.
[0004] There is a need for a method and system that uses a single actuator to simultaneously focus dual lenses placed horizontally and vertically. The proposed technique must also facilitate use of smaller back focal length lenses without compromising on precision and efficiency. Accordingly, an alternate method and system for imaging a field of view in open surgeries is disclosed.
BRIEF DESCRIPTION OF THE FIGURES
[0005] These and other features, aspects, and advantages of the example embodiments will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0006] FIG. 1 is a block diagram of an apparatus for imaging a wide field of view, according to an example embodiment;
[0007] FIG. 2 is a block diagram of an apparatus for imaging a wide field of view, according to an example embodiment;
[0008] FIG. 3 illustrates an exploded view of a camera system used within the apparatus of FIGs. 1 and 2, according to an example embodiment; and
[0009] FIG. 4 is a flow diagram illustrating functioning of the apparatus of FIGs. 1 and 2, according to an example embodiment.
SUMMARY
[0010] Example embodiments, however, may be embodied in many alternate forms and should not be construed as limited to only the example embodiments set forth herein.
[0011] According to an embodiment of the present invention, a single actuator is implemented that simultaneously focuses dual lenses. The present invention aims to reduce an overall size of the camera head, minimize heat generation, and reduce power consumption. Additionally, a more simplified circuit design is proposed.
[0012] According to an embodiment, an image processing apparatus is disclosed. The image processing apparatus includes at least one first sensor affixed to a first lens, via a first plurality of linear guide pins positioned on a mounting axis of the first lens, at least one second sensor affixed to a second lens via a second plurality of linear guide pins on a mounting axis of the second lens; wherein the first lens and the second lens are implemented in an angular shaped mounting bracket via two springs. In an embodiment, the angular shaped mounting bracket can be L-shaped. Further, the apparatus includes a cable coupled at one end to a first mount base supporting the first lens and drawn towards a first part of an actuating module and coupled at another end to a second mount base supporting the second lens via a second part of the actuating module; wherein a looped section of the cable is engaged with a linear shaft; and an actuator mounted on the linear shaft. The actuator rotates stepwise in a defined sequence; and initiates movement of the linear shaft, wherein the linear shaft triggers movement of the first lens and the second lens that causes simultaneous focusing of an object placed before the first and second sensor.
[0013] In an embodiment, the actuating module is an electric pulley system, where the first part is a first bearing pulley and the second part is a second bearing pulley. In another embodiment, the actuating module is a gear system, where the first part comprises a first set of gears and the second part comprises a second set of gears. Further, the rotation of the actuator is controlled to position the first lens and the second lens at one of a plurality of preset focus points, wherein the preset focus points are at distances of 8 centimeters (cm), 16 cm and 25 cm from the object. In an embodiment, the rotation of the actuator is controlled so as to adjust a position of the first lens and the second lens back and forth based on a sharpness of the image as obtained from an image analyzer that is coupled to the actuator. Further, the actuator is configured to control a back-and-forth movement of the first lens and the second lens within the respective mounting axes, by using the linear shaft that is positioned on two mounted railings and that is configured to traverse back and forth.
[0014] In an embodiment, the image processing apparatus further comprises a distance measuring sensor placed on a camera probe and configured to constantly read a distance between the camera probe and the object, and to feed the distance to a microprocessor coupled to the image analyzer. The length of each linear guide pin is selected to accommodate a movement of the respective lens within a focus range. The apparatus facilitates implementation of the first and second lenses having a back focal length of 6-12 mm.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0015] The drawings are to be regarded as being schematic representations and elements illustrated in the drawings are not necessarily shown to scale. Rather, the various elements are represented such that their function and general purpose become apparent to a person skilled in the art. Any connection or coupling between functional blocks, devices, components, or other physical or functional units shown in the drawings or described herein may also be implemented by an indirect connection or coupling. A coupling between components may also be established over a wireless connection. Functional blocks may be implemented in hardware, firmware, software, or a combination thereof.
[0016] Various example embodiments will now be described more fully with reference to the accompanying drawings in which only some example embodiments are shown. Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Example embodiments, however, may be embodied in many alternate forms and should not be construed as limited to only the example embodiments set forth herein.
[0017] Accordingly, while example embodiments are capable of various modifications and alternative forms, example embodiments are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit example embodiments to the particular forms disclosed. On the contrary, example embodiments are to cover all modifications, equivalents, and alternatives thereof. Like numbers refer to like elements throughout the description of the figures.
[0018] Before discussing example embodiments in more detail, it is noted that some example embodiments are described as processes or methods depicted as flowcharts. Although the flowcharts describe the operations as sequential processes, many of the operations may be performed in parallel, concurrently or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, subprograms, etc.
[0019] Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Inventive concepts may, however, be embodied in many alternate forms and should not be construed as limited to only the example embodiments set forth herein.
[0020] It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or," includes any and all combinations of one or more of the associated listed items. The phrase "at least one of" has the same meaning as "and/or".
[0021] Further, although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers and/or sections, it should be understood that these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are used only to distinguish one element, component, region, layer, or section from another region, layer, or section. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the scope of inventive concepts.
[0022] Spatial and functional relationships between elements (for example, between modules) are described using various terms, including “connected,” “engaged,” “interfaced,” and “coupled.” Unless explicitly described as being “direct,” when a relationship between first and second elements is described in the above disclosure, that relationship encompasses a direct relationship where no other intervening elements are present between the first and second elements, and also an indirect relationship where one or more intervening elements are present (either spatially or functionally) between the first and second elements. In contrast, when an element is referred to as being "directly” connected, engaged, interfaced, or coupled to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., "between," versus "directly between," "adjacent," versus "directly adjacent," etc.).
[0023] The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms "a," "an," and "the," are intended to include the plural forms as well, unless the context clearly indicates otherwise. As used herein, the terms “and/or” and “at least one of” include any and all combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0024] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0025] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0026] Spatially relative terms, such as “beneath”, “below”, “lower”, “above”, “upper”, and the like, may be used herein for ease of description to describe one element or feature’s relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, terms such as “below” may encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein are interpreted accordingly.
[0027] Portions of the example embodiments and corresponding detailed description may be presented in terms of software, or algorithms and symbolic representations of operation on data bits within a computer memory. These descriptions and representations are the ones by which those of ordinary skill in the art effectively convey the substance of their work to others of ordinary skill in the art. An algorithm, as the term is used here, and as it is used generally, is conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of optical, electrical, or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0028] At least one example embodiment is generally directed to techniques for imaging a wide field of view. In particular, the embodiments disclose techniques relating to simultaneous focusing of dual lenses using a single actuator.
[0029] Turning to the drawings, FIG. 1 is a block diagram illustrating a system 100 for effecting imaging of a wide field of view. The system 100 includes at least one first sensor (102), a first lens holder (104), a first lens (106), at least one second sensor (112), a second lens holder (108), a second lens (110), a distance sensor (130), a cam follower (122), an eccentric cam (124) driven by a servo, a servo actuator (126), a linear shaft (128), a first pulley (114), a rope (120), a mounting bracket (118), a second pulley (116), a linear guide pin (132), a spring (134) and a dichroic mirror (136). The first pulley (114) and the second pulley (116) comprise two parts of an actuating module. In another embodiment, the actuating module is a gear mechanism, as shown in FIG. 2.
[0030] In an embodiment, the at least one first sensor (102) and the at least one second sensor (112) are camera components used to capture images. While only two sensors are shown in FIG. 1, the apparatus (100) can contain multiple sensors capturing varied data based on the light spectrum, such as a near-infrared (NIR) image, a color image, an ultraviolet (UV) image, and the like.
[0031] The distance sensor (130) is used to measure the distance from the apparatus to an object (138). The distance sensor (130) can be a laser sensor, an ultrasonic sensor, or any other type of sensor that measures distance. Each lens associated with a sensor or camera is held in its home position by a spring (134), which pushes it towards the distance sensor (130).
[0032] An eccentric cam (124) is driven by a servo actuator (126) and is coupled to a linear shaft (128). The linear shaft (128) is a rectangular part that holds the eccentric cam (124) (an offset toothless gear) and a cam follower for hooking the pulley rope (120). The cam follower (122) is a free-rotating bearing attached to the linear shaft (128). When the actuator (126) is activated and the eccentric cam (124) rotates, the eccentric cam (124) pushes the cam follower (122), which in turn pushes the entire linear shaft (128) back and forth, drawing the two lenses for focus.
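By way of a non-limiting illustration, the relationship between the rotation of the eccentric cam and the resulting travel of the linear shaft may be sketched as follows. The model assumes a circular cam whose center is offset from the rotation axis by a fixed eccentricity; the function name and the 1.5 mm eccentricity in the example are illustrative assumptions, not dimensions from the disclosure.

```python
import math

def shaft_displacement(theta_deg: float, eccentricity_mm: float) -> float:
    """Displacement (mm) of the linear shaft from its mid-stroke position
    for a circular eccentric cam rotated by theta degrees.

    Assumes the cam follower remains in contact with the cam and that its
    axis passes through the cam's rotation center, so the displacement is
    the projection of the offset onto the shaft axis.
    """
    return eccentricity_mm * math.cos(math.radians(theta_deg))

# With an assumed 1.5 mm eccentricity, a half revolution sweeps the full
# stroke of twice the eccentricity:
stroke = shaft_displacement(0, 1.5) - shaft_displacement(180, 1.5)  # 3.0 mm
```

Under such a model, stepping the actuator through small angular increments yields correspondingly small, smooth shaft movements, consistent with the stepwise focusing described above.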
[0033] When the eccentric cam (124) of the actuating module is activated, the first lens (106) moves against the spring’s (134) natural position, compressing the spring (134), and the movement reverses when the actuating module turns clockwise.
[0034] The linear guide pin (132) is associated with the first lens holder (104). The linear guide pin (132) ensures a smooth linear movement of the first lens (106), where the first lens (106) follows a direction of the linear guide pin (132). In an embodiment, each lens holder has two linear guide pins, which ensure movement of the associated lens while focusing.
[0035] The dichroic mirror (136), also known as a beam splitter, is an optical device that can split light through reflection and transmission, allowing flexible light paths to flow toward multiple cameras. In this case, the dichroic mirror (136) allows only NIR wavelengths to pass through it to the NIR camera and reflects natural light toward the first sensor (102) and the second sensor (112). The first lens (106) is the part of a camera that directs light to the film or, in a digital camera, to a computer chip that can sense the light.
[0036] The actuating module, as mentioned, includes the pulley rope (120) connected between the first pulley (114) and the second pulley (116). The pulley rope (120) is a flexible rope, for example nylon-based but not limited to nylon, and is used to hold both lens holders by hooking through the linear shaft (128), which helps to pull the first lens (106) and the second lens (110) to and fro for focus.
[0037] The first pulley (114) and the second pulley (116) are standard pulley bearings that ensure that a movement of the pulley rope (120) is smooth and along a predefined path. The servo or actuator (126) can be a servo, stepper, or geared motor that will rotate the eccentric cam (124) on command.
[0038] The mounting bracket (118) is used to mount the camera, the actuator (126), and all the components. As shown, the first lens (106), having a higher back focal length, is used to illuminate both cameras through the dichroic mirror (136) and is placed in between two cameras, a NIR camera (302) and a color camera (304) as shown in FIG. 3, allowing NIR wavelengths to pass through and form NIR image data on the NIR camera (302) while reflecting visible light towards the color camera (304).
[0039] NIR-based minimally invasive imaging, such as laparoscopic imaging, has a similar requirement, but the field of view required for such surgeries is smaller compared to open surgeries. Thus, a single lens with a higher back focal length, ranging from 22 to 26 mm, can be used to illuminate the focal point of both cameras (302, 304); this objective lens is positioned away from the sensors and the dichroic mirror (136), as shown in FIGs. 1 and 2. An actuator-based focus methodology is implemented to reduce a working distance of minimally invasive surgeries.
[0040] Typically, open surgeries require a wider field of view (FOV) that covers the entire surgical area. In such a scenario, a single lens with a higher back focal length cannot be used since it produces a smaller FOV. This calls for the implementation of smaller back focal length lenses, ranging from 6 mm to 8 mm. Such smaller back focal length lenses have their own challenges: they must be placed within 6 mm to 8 mm of the camera sensors (102 and 112). In this scenario, access to the focusing mechanism is not possible, as the lens is mounted close to the camera sensor placed inside the camera probe enclosure.
[0041] Implementation of an actuator-based focus mechanism is ideal, but the complexity further increases since there are two lenses (106 and 110). Implementing multiple actuators to independently control the focus of each lens increases the complexity of the camera probe design while increasing its overall size, power consumption, circuit complexity, and thermal dissipation. Hence, the proposed solution places the dual lenses horizontally and vertically and focuses them using a single actuator, which reduces the overall size of the camera head, reduces heat generation, and simplifies the circuit and power consumption.
[0042] As shown in FIG. 1, the camera system utilizes a 90-degree L-shaped mounting bracket (118), designed to house two camera sensors (102 and 112). Each sensor is securely affixed to a lens through multiple linear guide pins (e.g. 132) positioned on their respective mounting axes. This configuration facilitates the controlled back-and-forth movement of each lens (106 and 110) within its axis. Notably, the length of the linear guide pin (132) is meticulously engineered to accommodate a movement of the first lens (106) within the focus range, aligning precisely with the back focal lengths of the lenses relative to the associated camera sensors (112 and 102).
[0043] In its default state, the two lenses are maintained at their home position, oriented toward the camera sensors (112 and 102). This is achieved through the implementation of two springs affixed in the L-shaped mounting bracket (118) against the linear guide pins, as depicted in FIG. 3.
[0044] In an embodiment, a high-tensile-strength nylon cable, also referred to as a pulley rope (120), is secured to a mount base of the first lens (106) and is drawn towards a powered electric pulley system via a first pulley (114). Subsequently, the pulley rope (120) loops back to a base of the second lens (110) through the second pulley (116), with the looped section intricately engaged with the linear shaft (128). The linear shaft (128), positioned on two mounted railings, is thereby enabled to traverse back and forth.
[0045] Mounted on the linear shaft (128) is the servo actuator (126) featuring an offset bush. During operation, as the actuator (126) undergoes rotation, the offset bush interfaces with a ball-bearing pulley cam follower (122), initiating the movement of the linear shaft (128). This motion, in turn, pulls the two lenses (106 and 110) along with it, propelled by the tension in the pulley rope (120). Furthermore, as the actuator (126) rotates and the offset bearing post pushes against the mounted spring (134), a base of the first lens (106) is urged back to its home position in synchronization with a step movement of the actuator (126). This comprehensive process facilitates remote control of the focus for both camera lenses (106 and 110) by manipulating the clockwise and anticlockwise movements of the actuator (126).
[0046] In an embodiment, a distance measuring sensor such as the distance sensor (130) is placed on a camera probe front face which constantly reads the distance between the camera probe and the object (138) which is fed to a microprocessor (410). The microprocessor (410) is configured with an autofocus algorithm for NIR-based open surgery imaging systems. The algorithm facilitates operation of the microprocessor (410) in two distinct modes to accommodate the specific needs of medical practitioners.
[0047] In an example, a first mode is tailored for standard autofocus, where the algorithm leverages three preset focus points strategically positioned at 8cm, 16cm, and 25cm. These preset points are derived from historical data, facilitating rapid and precise focus adjustments at commonly encountered distances within the surgical field. The first mode is well-suited for healthcare professionals who prioritize swift and straightforward imaging.
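By way of a non-limiting illustration, the first mode may be sketched as a mapping from the measured distance to the nearest preset focus point. The function and variable names are illustrative assumptions; only the three preset distances (8 cm, 16 cm and 25 cm) are taken from the disclosure.

```python
# Preset focus points (cm) derived from historical data, per the disclosure.
PRESET_FOCUS_POINTS_CM = (8, 16, 25)

def select_preset(distance_cm: float) -> int:
    """Return the preset focus point (cm) nearest to the measured distance."""
    return min(PRESET_FOCUS_POINTS_CM, key=lambda p: abs(p - distance_cm))

select_preset(11)  # -> 8
select_preset(20)  # -> 16
```

The selected preset would then be translated by the microprocessor into a step count for the actuator, a mapping that depends on the cam or gear geometry and is therefore not shown here.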
[0048] In another embodiment, a second mode is designed for users who demand continuous fine-resolution focus in the context of NIR-based open surgeries. In this advanced mode, the algorithm initiates a refinement process beyond the preset points. After reaching a predefined focus point, the lens undergoes subtle adjustments, moving one step back and forth. The algorithm meticulously analyzes image sharpness to determine the optimal position, providing a finer level of control over focus. This tailored approach is particularly beneficial in medical scenarios that require a heightened degree of accuracy in imaging. The dual-mode functionality enhances the adaptability of the autofocus system, catering to the diverse requirements of healthcare professionals engaged in NIR-based open surgeries.
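By way of a non-limiting illustration, the refinement process of the second mode may be sketched as a small hill-climbing search around the preset position. The callbacks for moving the actuator and scoring image sharpness are assumed interfaces; the disclosure describes only the behaviour (move one step back and forth, keep the sharpest position).

```python
def refine_focus(move_to_step, sharpness, start_step: int, max_iters: int = 10) -> int:
    """Refine focus around a preset actuator step position.

    move_to_step(step) drives the actuator to an absolute step position;
    sharpness() returns a sharpness score for the current frame.
    Returns the step position with the highest observed sharpness.
    """
    best_step = start_step
    move_to_step(best_step)
    best_score = sharpness()
    for _ in range(max_iters):
        improved = False
        for candidate in (best_step - 1, best_step + 1):  # one step back and forth
            move_to_step(candidate)
            score = sharpness()
            if score > best_score:
                best_step, best_score, improved = candidate, score, True
        move_to_step(best_step)  # settle on the best position found so far
        if not improved:
            break
    return best_step

# Demonstration with a simulated sharpness curve peaking at step 7:
position = [0]
best = refine_focus(lambda s: position.__setitem__(0, s),
                    lambda: -abs(position[0] - 7),
                    start_step=4)  # -> 7
```

In practice, the sharpness score would come from the image analyzer, for example a contrast or gradient measure computed on the captured frame.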
[0049] FIG. 2 illustrates an image processing apparatus for imaging a wide field of view in open surgeries, using a gear mechanism or gear system within the actuating module. As shown, the servo actuator (126) is coupled to a plurality of gears or pinions, such as a first pinion (224), a second pinion (216) and a third pinion (218), which facilitate movement of the linear shaft (128) and thereby facilitate focusing of the first lens (106) and the second lens (110). In the embodiment as shown, the first pinion (224) is in physical engagement with a first rack (226), which also provides mechanical support to the gear system. The second pinion (216) is in physical engagement with a second rack (214). In another embodiment, the gear system includes a first part comprising a first set of gears and a second part comprising a second set of gears. In an example, the first set of gears can include multiple gears, for example three, four or six gears, and would work in tandem with the second set of gears, which could also include three, four or six gears.
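By way of a non-limiting illustration, the rack-and-pinion arithmetic that converts actuator rotation into linear shaft travel may be sketched as follows. The tooth count and module in the example are illustrative assumptions, not dimensions from the disclosure.

```python
import math

def rack_travel_mm(pinion_teeth: int, module_mm: float, rotation_deg: float) -> float:
    """Linear travel of a rack driven by a meshing spur pinion.

    The pitch diameter of a spur pinion is module * teeth, so one full
    pinion revolution advances the rack by pi * module * teeth.
    """
    pitch_circumference_mm = math.pi * module_mm * pinion_teeth
    return pitch_circumference_mm * rotation_deg / 360.0

# e.g. an assumed 12-tooth, module-0.5 pinion turned 30 degrees:
travel = rack_travel_mm(12, 0.5, 30)  # ~1.57 mm
```

Such a relationship would let the microprocessor translate a desired lens displacement into a rotation command for the actuator.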
[0050] FIG. 3 illustrates an exploded view 300 of the camera system used in the image processing apparatus (100). As shown, the color camera (304) is positioned at right angles to the NIR camera (302). Both cameras (304 and 302) focus light onto the dichroic mirror (136). The dichroic mirror (136) reflects the light onto a tube lens (306) and passes the light onto an objective (308), which focuses the light onto a sample (312). The advantages of the system are reduced complexity of the camera probe, reduced overall size of the camera probe, lower power consumption, a simpler circuit design, and easier thermal dissipation.
[0051] FIG. 4 is a block diagram 400 illustrating functioning of the apparatus 100 and 200 shown in FIGs. 1 and 2. As shown, light from the object (402) is received by the dichroic mirror (404), which splits the image into a visible image that is provided to a first lens (408) and a NIR image that is provided to a second lens (410). The image from the first lens (408) is then provided to a color camera (414) and the image from the second lens (410) is provided to the NIR camera (412). The output from the color camera (414) and the NIR camera (412) is provided to a compute module (422), whereas the positions of the first lens (408) and the second lens (410) are controlled by an actuator (416). A distance of the object (402) from the NIR camera (412) and the color camera (414) is measured by a distance measuring sensor (406), and the distance is fed to a microprocessor (418). The output of the microprocessor (418) is provided to a trigger generator (420), and the output of the trigger generator (420) is provided to the actuator (416). Further, the measured distance is also provided as an input from the microprocessor (418) to the compute module (422). The output from the compute module (422) is checked by a focus module (424) to determine if the image is in focus. If the image is in focus, the output is displayed on a display (426). In case the image is not in focus, a signal indicating this is provided to the microprocessor (418). The microprocessor (418) then sends a feedback signal to the trigger generator (420) indicating the out-of-focus state of the image. The trigger generator (420) then provides a signal to the actuator (416) to move the first lens (408) and the second lens (410) back and forth by a predefined distance in millimeters, to bring the image into focus or to ensure better focus on the object (402).
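By way of a non-limiting illustration, the closed-loop behaviour of FIG. 4 may be sketched as follows. The five callables stand in for the hardware blocks (distance sensor, actuator via the trigger generator, cameras, focus module and display); their names, the step size, and the attempt limit are illustrative assumptions.

```python
def autofocus_loop(measure_distance, move_lenses, capture, is_in_focus,
                   display, step_mm: float = 0.1, max_attempts: int = 20) -> bool:
    """Sketch of the FIG. 4 control flow.

    The microprocessor reads the object distance, the trigger generator
    drives the actuator to the corresponding lens position, and the focus
    module checks each captured frame, nudging the lenses back and forth
    by a growing offset until the image is in focus. Returns True if
    focus was achieved within max_attempts.
    """
    base = measure_distance()      # distance sensor -> microprocessor
    move_lenses(base)              # microprocessor -> trigger generator -> actuator
    for attempt in range(max_attempts):
        frame = capture()          # color / NIR cameras -> compute module
        if is_in_focus(frame):     # focus module verdict
            display(frame)         # in-focus image shown on the display
            return True
        # Out of focus: alternate the nudge direction, widening each cycle.
        magnitude = (attempt // 2 + 1) * step_mm
        sign = 1 if attempt % 2 == 0 else -1
        move_lenses(base + sign * magnitude)
    return False
```

This sketch checks positions base, base + 0.1, base - 0.1, base + 0.2, and so on, mirroring the back-and-forth movement by a predefined distance described above.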
[0052] It will be understood by those within the art that, in general, terms used herein, are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present.
[0053] For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to embodiments containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., “a” and/or “an” should be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, means at least two recitations, or two or more recitations).
[0054] While only certain features of several embodiments have been illustrated, and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of inventive concepts.
[0055] The aforementioned description is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses. The broad teachings of the disclosure may be implemented in a variety of forms. Therefore, while this disclosure includes particular examples, the true scope of the disclosure should not be so limited since other modifications will become apparent upon a study of the drawings and the specification. It should be understood that one or more steps within a method may be executed in a different order (or concurrently) without altering the principles of the present disclosure. Further, although each of the example embodiments is described above as having certain features, any one or more of those features described with respect to any example embodiment of the disclosure may be implemented in and/or combined with features of any of the other embodiments, even if that combination is not explicitly described. In other words, the described example embodiments are not mutually exclusive, and permutations of one or more example embodiments with one another remain within the scope of this disclosure.
[0056] The example embodiment or each example embodiment should not be understood as limiting or restricting the inventive concepts. Rather, numerous variations and modifications are possible in the context of the present disclosure, in particular those variants and combinations which may be inferred by the person skilled in the art with regard to achieving the object, for example by combination or modification of individual features or elements or method steps that are described in connection with the general or specific part of the description and/or the drawings, and which, by way of combinable features, lead to a new subject matter or to new method steps or sequences of method steps, including insofar as they concern production, testing and operating methods. Further, elements and/or features of different example embodiments may be combined with each other and/or substituted for each other within the scope of this disclosure.
[0057] Still further, any one of the above-described and other example features of example embodiments may be embodied in the form of an apparatus, method, system, computer program, tangible computer readable medium and tangible computer program product. For example, any of the aforementioned methods may be embodied in the form of a system or device, including, but not limited to, any of the structures for performing the methodology illustrated in the drawings.
[0058] In this application, including the definitions below, the term ‘module’ or the term ‘controller’ may be replaced with the term ‘circuit.’ The term ‘module’ may refer to, be part of, or include processor hardware (shared, dedicated, or group) that executes code and memory hardware (shared, dedicated, or group) that stores code executed by the processor hardware.
[0059] The module may include one or more interface circuits. In some examples, the interface circuits may include wired or wireless interfaces that are connected to a local area network (LAN), the Internet, a wide area network (WAN), or combinations thereof. The functionality of any given module of the present disclosure may be distributed among multiple modules that are connected via interface circuits. For example, multiple modules may allow load balancing. In a further example, a server (also known as remote, or cloud) module may accomplish some functionality on behalf of a client module.
[0060] Further, at least one example embodiment relates to a non-transitory computer-readable storage medium comprising electronically readable control information (e.g., computer-readable instructions) stored thereon, configured such that when the storage medium is used in a controller of an imaging device, at least one example embodiment of the method is carried out.
[0061] Even further, any of the aforementioned methods may be embodied in the form of a program. The program may be stored on a non-transitory computer readable medium such that, when run on a computer device (e.g., a processor), it causes the computer device to perform any one of the aforementioned methods. Thus, the non-transitory, tangible computer readable medium is adapted to store information and is adapted to interact with a data processing facility or computer device to execute the program of any of the above-mentioned embodiments and/or to perform the method of any of the above-mentioned embodiments.
[0062] The computer readable medium or storage medium may be a built-in medium installed inside a computer device main body or a removable medium arranged so that it may be separated from the computer device main body. The term computer-readable medium, as used herein, does not encompass transitory electrical or electromagnetic signals propagating through a medium (such as on a carrier wave); the term computer-readable medium is therefore considered tangible and non-transitory. Non-limiting examples of the non-transitory computer-readable medium include rewriteable non-volatile memory devices (including, for example, flash memory devices, erasable programmable read-only memory devices, or mask read-only memory devices), volatile memory devices (including, for example, static random access memory devices or dynamic random access memory devices), magnetic storage media (including, for example, analog or digital magnetic tape or a hard disk drive), and optical storage media (including, for example, a CD, a DVD, or a Blu-ray Disc). Examples of media with built-in rewriteable non-volatile memory include, but are not limited to, memory cards; examples of media with built-in ROM include, but are not limited to, ROM cassettes. Furthermore, various information regarding stored images, for example, property information, may be stored in any other form, or it may be provided in other ways.
[0063] The term code, as used above, may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. Shared processor hardware encompasses a single microprocessor that executes some or all code from multiple modules. Group processor hardware encompasses a microprocessor that, in combination with additional microprocessors, executes some or all code from one or more modules. References to multiple microprocessors encompass multiple microprocessors on discrete dies, multiple microprocessors on a single die, multiple cores of a single microprocessor, multiple threads of a single microprocessor, or a combination of the above.
[0064] Shared memory hardware encompasses a single memory device that stores some or all code from multiple modules. Group memory hardware encompasses a memory device that, in combination with other memory devices, stores some or all code from one or more modules.
[0065] The term memory hardware is a subset of the term computer-readable medium, as defined and exemplified in paragraph [0062] above.
[0066] The apparatuses and methods described in this application may be partially or fully implemented by a special purpose computer created by configuring a general-purpose computer to execute one or more particular functions embodied in computer programs. The functional blocks and flowchart elements described above serve as software specifications, which may be translated into the computer programs by the routine work of a skilled technician or programmer.
[0067] The computer programs include processor-executable instructions that are stored on at least one non-transitory computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may encompass a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services, background applications, etc.
[0068] The computer programs may include: (i) descriptive text to be parsed, such as HTML (hypertext markup language) or XML (extensible markup language), (ii) assembly code, (iii) object code generated from source code by a compiler, (iv) source code for execution by an interpreter, (v) source code for compilation and execution by a just-in-time compiler, etc. As examples only, source code may be written using syntax from languages including C, C++, C#, Objective C, Haskell, Go, SQL, R, Lisp, Java®, Fortran, Perl, Pascal, Curl, OCaml, Javascript®, HTML5, Ada, ASP (active server pages), PHP, Scala, Eiffel, Smalltalk, Erlang, Ruby, Flash®, Visual Basic®, Lua, and Python®.

Claims:

1. An image processing apparatus for imaging a wide field of view, the image processing apparatus comprising:
at least one first sensor affixed to at least one first lens via a first plurality of linear guide pins positioned on a mounting axis of the at least one first lens;
at least one second sensor affixed to at least one second lens via a second plurality of linear guide pins on a mounting axis of the at least one second lens; wherein the at least one first lens and the at least one second lens are mounted on an angular shaped mounting bracket via two springs;
a cable coupled at one end to a first mount base supporting the at least one first lens and drawn towards a first part of an actuating module and coupled at another end to a second mount base supporting the at least one second lens via a second part of the actuating module; wherein a looped section of the cable is engaged with a linear shaft; and
the actuating module mounted on the linear shaft and configured to:
rotate stepwise in a defined sequence; and
initiate movement of the linear shaft, wherein the linear shaft triggers movement of the at least one first lens and the at least one second lens that causes simultaneous focusing of an object placed before the at least one first sensor and the at least one second sensor.
2. The image processing apparatus of claim 1, wherein the actuating module is an electric pulley system, wherein the first part is a first pulley, and the second part is a second pulley.
3. The image processing apparatus of claim 1, wherein the actuating module is a gear mechanism, wherein the first part comprises a first set of gears and the second part comprises a second set of gears.
4. The image processing apparatus of claim 1, wherein the rotation of the actuating module is controlled to position the first lens and the second lens at one of a plurality of preset focus points, wherein the preset focus points are at distances of 8 centimeters (cm), 16 cm, and 25 cm from the object.
5. The image processing apparatus of claim 1, wherein the rotation of the actuating module is controlled to adjust a position of the at least one first lens and a position of the at least one second lens based on a sharpness of the image as obtained from an image analyzer that is coupled to the actuating module.
6. The image processing apparatus of claim 5, wherein the actuating module is further configured to:
control a back-and-forth movement of the at least one first lens and the at least one second lens within the respective mounting axes, by using the linear shaft that is positioned on two mounted railings and that is configured to traverse back and forth.
7. The image processing apparatus of claim 3, further comprising:
a distance measuring sensor placed on a camera probe and configured to:
constantly read a distance between the camera probe and the object; and
feed the distance to a microcontroller coupled to the image analyzer.
8. The image processing apparatus of claim 1, wherein a length of each linear guide pin is selected to accommodate a movement of the respective lens within a focus range.
9. The image processing apparatus of claim 1, wherein the actuating module is configured to:
facilitate implementation of the first lens and the second lens with a back focal length ranging from 6 to 12 millimeters (mm).
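The focus-control behavior recited in claims 4 and 7 — a distance sensor reads the object distance, the reading selects one of the preset focus points (8 cm, 16 cm, 25 cm), and a single stepwise actuator command moves both lenses at once via the shared cable and linear shaft — can be sketched as follows. This is a minimal illustrative sketch only: the function names, the step-count calibration table, and the mapping of presets to actuator positions are assumptions for illustration and are not part of the disclosure.

```python
# Illustrative sketch of the focus-control flow of claims 4 and 7.
# The preset distances come from claim 4; the step counts are assumed
# calibration values, not values from the specification.

PRESET_FOCUS_POINTS_CM = (8, 16, 25)          # claim 4: preset object distances
STEPS_PER_PRESET = {8: 0, 16: 120, 25: 200}   # assumed actuator step positions


def nearest_preset(distance_cm: float) -> int:
    """Pick the preset focus point closest to the measured object distance."""
    return min(PRESET_FOCUS_POINTS_CM, key=lambda p: abs(p - distance_cm))


def actuator_steps_for(distance_cm: float) -> int:
    """Translate a distance-sensor reading into a stepwise actuator position.

    Because both lens mount bases share one cable and one linear shaft,
    a single step command focuses both sensors simultaneously.
    """
    return STEPS_PER_PRESET[nearest_preset(distance_cm)]
```

For example, a measured distance of 14 cm selects the 16 cm preset, so one actuator command (here, 120 steps) would drive both lenses to that focus point together, which is the single-actuator, dual-lens behavior the claims describe.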
| # | Name | Date |
|---|---|---|
| 1 | 202441017497-STATEMENT OF UNDERTAKING (FORM 3) [11-03-2024(online)].pdf | 2024-03-11 |
| 2 | 202441017497-FORM FOR STARTUP [11-03-2024(online)].pdf | 2024-03-11 |
| 3 | 202441017497-FORM FOR SMALL ENTITY(FORM-28) [11-03-2024(online)].pdf | 2024-03-11 |
| 4 | 202441017497-FORM 1 [11-03-2024(online)].pdf | 2024-03-11 |
| 5 | 202441017497-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [11-03-2024(online)].pdf | 2024-03-11 |
| 6 | 202441017497-EVIDENCE FOR REGISTRATION UNDER SSI [11-03-2024(online)].pdf | 2024-03-11 |
| 7 | 202441017497-DRAWINGS [11-03-2024(online)].pdf | 2024-03-11 |
| 8 | 202441017497-DECLARATION OF INVENTORSHIP (FORM 5) [11-03-2024(online)].pdf | 2024-03-11 |
| 9 | 202441017497-COMPLETE SPECIFICATION [11-03-2024(online)].pdf | 2024-03-11 |
| 10 | 202441017497-Proof of Right [28-03-2024(online)].pdf | 2024-03-28 |
| 11 | 202441017497-FORM-26 [28-03-2024(online)].pdf | 2024-03-28 |
| 12 | 202441017497-Covering Letter [14-10-2024(online)].pdf | 2024-10-14 |