Abstract: The present invention discloses a system, an image capturing device, an image processing device and a method thereof. The system for capturing and aligning the images comprises an automated panoramic head positioned on a tripod, an image capturing device, a plurality of marking cues, an image processing device and a graphics viewer. The method for capturing and aligning the 2D images comprises capturing the images by an image capturing device, automatically rotating the 2D image capturing device to capture images from a different direction, processing the captured images by the image processing device and presenting the processed image on the graphics viewer. FIGURE 1
A SYSTEM, AN IMAGE CAPTURING DEVICE, AN IMAGE PROCESSING DEVICE AND A METHOD THEREOF
TECHNICAL FIELD
[0001] The present disclosure relates generally to image capturing and processing systems, and more specifically to a system and a method for capturing, aligning and visualizing multiple 2D (two-dimensional) panoramic images into a 3D (three-dimensional) high definition view.
BACKGROUND
[0002] Conventionally, 3D image capturing systems use sensors to capture images of the surroundings. In particular, these 3D image capturing systems involve expensive and complex hardware, such as Light Detection and Ranging (LIDAR) sensors, to capture the 3D images. Despite the emergence of comparatively less expensive 3D capture devices for automatically reconstructing multiple 3D scenes from 3D images, these devices have failed to provide users with high resolution images for an enhanced viewing experience. Moreover, the data generated from such devices is too colossal to be deployed on the cloud and to be visualized on low specification devices or in regions with poor data connectivity.
[0003] Another possible method for capturing 3D views of a physical space is by using 2D panoramas. Typically, fish-eye lenses with stitching software, which provide a complete 360x180 degree view of the environment, are used to capture 2D scenes as spherical panoramic images. However, such 2D images have no other information, such as GPS or depth data, besides the visual content. Since the cameras are uncalibrated, there is no way of estimating their position and aligning them up to a global scale, which is essential for reconstruction of a 3D view.
[0004] Further, in the known art, several panoramic heads with cameras have been proposed in combination with a tripod to align and hold the device. However, such devices lack speed and accuracy in positioning of the panoramic head. Such heads are adjusted by means of levers, buttons or similar devices.
[0005] Therefore, there is a need for systems or apparatuses which could provide users with an immersive 3D viewing experience of the physical world by eliminating the aforementioned drawbacks.
SUMMARY
[0006] The following presents a simplified summary of the subject matter in order to provide a basic understanding of some aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter.
[0007] Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.
[0008] It is therefore a primary objective of this invention to provide a cost-effective and high definition experience of 3D physical spaces.
[0009] According to the preferred embodiment, a system for capturing and processing of a plurality of panoramic images, said system comprising an automated panoramic head (101) operatively coupled to an image capturing device (102), wherein said automated panoramic head is configured to rotate the image capturing device in 360 degrees and capture a plurality of panoramic images from a plurality of locations within a region of interest; and an image processing device (103) configured to process the panoramic images and create a discretized and quantized three-dimensional (3D) space.
[0010] In another aspect of the present invention, the plurality of panoramic images captured are two-dimensional (2D) images.
[0011] In another aspect of the present invention, the automatic panoramic head is positioned on an adapter tripod to maintain a focal point of the image capturing device.
[0012] In another aspect of the present invention, the automated panoramic head comprises a control unit (104) to control the rotation of the automated panoramic head.
[0013] In another aspect of the present invention, the control unit triggers a shutter of the image capturing device to capture a first image from a location of the region of interest.
[0014] In another embodiment of the present invention, an automatic panoramic head (101) to capture a plurality of panoramic images, said panoramic head comprising an image capturing device (102) positioned on said automatic panoramic head; an adapter tripod (201) housing a plurality of ball bearings (204), to connect the lower stationary part of the automatic panoramic head; an adapter gear boom (202) comprising a first part and a second protruding part, the first part being removably attached to the image capturing device and the second protruding part being rotatably mounted within the adapter tripod through the plurality of ball bearings; a motor mounting plate (203) disposed between said adapter tripod (201) and a motor & gear box (205), comprising a plurality of holes of various diameters to allow the adapter gear boom access to the bearing at the top and to allow the motor axle to pass through; a control unit disposed over the motor mounting plate to control the movement of the motor; and a twin gear arrangement (206) to transfer the rotational motion from the motor & gear box to the adapter gear boom and to ensure that the motor provides a predetermined torque; wherein the motor is configured to swivel the panoramic head and the image capturing device through 360 degrees about its axis to capture the plurality of panoramic images.
[0015] In another aspect of the present invention, the adapter tripod is configured to maintain the focal point of the image capturing device stationary, so as to eliminate parallax during panoramic stitching.
[0016] In another aspect of the present invention, the motor rotates the image capturing device through a given number of steps to position the image capturing device at a certain angle from a first image capturing direction.
[0017] In another aspect of the present invention, the automated panoramic head is adjusted in a particular direction so that no part of the panoramic head comes within the Field of View (FOV) of the image capturing device.
[0018] In another aspect of the present invention, the motor is a stepper motor.
[0019] In another aspect of the present invention, an image processing device for processing of a plurality of panoramic images, said device comprising an image capturing device to capture a plurality of panoramic images; a processor to process the plurality of panoramic images and create a discretized and quantized three-dimensional (3D) space; and an image viewer to view said processed panoramic images; wherein the processor is configured to position the plurality of panoramic images in a single coordinate system by means of a plurality of marking cues placed in a region of interest; project the first image on a sphere at the point (0,0,0); locate a pixel representing a second point in the first image; determine the position of the second image by ray-projecting the pixel on a floor plane; register the second image; and loop for every succeeding point to correctly position each image.
[0020] In another aspect of the present invention, the device rotates the plurality of panoramic images about the floor plane such that the point of intersection of the ray coincides with the location of the first panoramic sphere.
[0021] In another aspect of the present invention, the plurality of panoramic images captured are two-dimensional (2D) images.
[0022] In another aspect of the present invention, the image viewer is a 3D viewer or a 2D viewer.
[0023] In another aspect of the present invention, a method for capturing and processing of a plurality of panoramic images, said method comprising acquiring, by an image capturing device, a plurality of panoramic images; processing, in a processor, the plurality of panoramic images and creating a discretized and quantized three-dimensional (3D) space; and viewing, in an image viewer, the processed panoramic images; wherein the processing comprises positioning the plurality of panoramic images in a single coordinate system by means of a plurality of marking cues placed in a region of interest; projecting the first image on a sphere at the point (0,0,0); locating a pixel representing a second point in the first image; determining the position of the second image by ray-projecting the pixel on a floor plane; registering the second image; and looping for every succeeding point to correctly position each image.
[0024] These and other objects, embodiments and advantages of the present invention will become readily apparent to those skilled in the art from the following detailed description of the embodiments having reference to the attached figures, the invention not being limited to any particular embodiments disclosed.
BRIEF DESCRIPTION OF THE FIGURES
[0025] The foregoing and further objects, features and advantages of the present subject matter will become apparent from the following description of exemplary embodiments with reference to the accompanying drawings, wherein like numerals are used to represent like elements.
[0026] It is to be noted, however, that the appended drawings along with the reference numerals illustrate only typical embodiments of the present subject matter and are, therefore, not to be considered as limiting its scope, for the subject matter may admit to other equally effective embodiments.
FIGURE 1 is a schematic illustration of image capturing and processing devices in accordance with the present invention.
FIGURE 2 is a schematic illustration of an automated panoramic head in accordance with the present invention.
FIGURE 3 is a schematic illustration of a control unit to control the automated panoramic head in accordance with the present invention.
FIGURE 4 is an illustration of mechanical parts along with the twin gear arrangement of the automatic panoramic head in accordance with the present invention.
FIGURE 5 illustrates an image processing device for generating 3D/2.5D high definition view in accordance with the present invention.
FIGURE 6 is a schematic illustration of the plurality of marking cues which help in positioning the plurality of 2D images in a single coordinate system.
FIGURE 7 is a flowchart depicting the processing of a plurality of panoramic images in accordance with the embodiments of the present invention.
[0027] In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
DETAILED DESCRIPTION OF EMBODIMENTS
[0028] Exemplary embodiments now will be described with reference to the accompanying drawings. The disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey its scope to those skilled in the art. The terminology used in the detailed description of the particular exemplary embodiments illustrated in the accompanying drawings is not intended to be limiting. In the drawings, like numbers refer to like elements.
[0029] It is to be noted, however, that the reference numerals in the claims illustrate only typical embodiments of the present subject matter and are, therefore, not to be considered as limiting its scope, for the subject matter may admit to other equally effective embodiments.
[0030] The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
[0031] As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include operatively connected or coupled. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
[0032] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0033] The figures depict a simplified structure only showing some elements and functional entities, all being logical units whose implementation may differ from what is shown. The connections shown are logical connections; the actual physical connections may be different. It is apparent to a person skilled in the art that the structure may also comprise other functions and structures.
[0034] Also, all logical units described and depicted in the figures include the software and/or hardware components required for the unit to function. Further, each unit may comprise within itself one or more components which are implicitly understood. These components
may be operatively coupled to each other and be configured to communicate with each other to perform the function of the said unit.
[0035] The present disclosure provides a system and a method for capturing and aligning multiple 2D images in a coordinate system to present 2.5D/3D views of the physical world. The system for capturing and aligning the 2D images comprises an automated panoramic head positioned on a tripod, an image capturing device, a plurality of marking cues, an image processing device and a graphics viewer. The method for capturing and aligning the 2D images comprises capturing the images by a 2D image capturing device, automatically rotating the 2D image capturing device to capture images from a different direction, processing the captured images by the image processing device and presenting the processed image on the graphics viewer.
[0036] FIG 1 is a schematic illustration of the image capturing and processing devices in accordance with the present invention. The system comprises an automated panoramic head (101) operatively coupled to an image capturing device (102), wherein said automated panoramic head is configured to rotate the image capturing device in 360 degrees and capture a plurality of panoramic 2D images from a plurality of locations within a region of interest; and an image processing device (103) configured to process the panoramic images and create a discretized and quantized three-dimensional (3D) space.
[0037] In an embodiment, the image capturing device includes, inter alia, a DSLR camera, a point-and-shoot camera, an autofocus camera, a digital camera, a single-lens reflex camera, a twin-lens reflex camera, etc. For the purpose of the present disclosure, a 2D image captured by the image capturing device may be interpreted as a collection of the visual appearance of the physical world. The 2D image captured by the image capturing device does not have any geometric information, GPS or depth information of the physical world.
[0038] The image capturing device may simultaneously capture the images or capture them over a time period. The information captured in a 2D image can take many forms. For example, the information may include a series of points with XY spatial coordinates, brightness or intensity values, or color values such as RGB or HSV. The XY spatial coordinates may be presented in absolute coordinates relative to a chosen origin point and axes, camera-relative coordinates, or another coordinate frame.
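As an illustrative aid only (not part of the specification), the following sketch shows one assumed way a single pixel record of the kind described above, XY spatial coordinates together with an RGB colour value, might be represented.

```python
# A minimal sketch (an assumed representation, not the specification's data model)
# of a pixel record: XY spatial coordinates plus an RGB colour value.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PixelSample:
    x: int                      # column, relative to the chosen origin and axes
    y: int                      # row, relative to the chosen origin and axes
    rgb: Tuple[int, int, int]   # 8-bit colour value

# Example: the pixel at column 120, row 45 with a mid-grey colour.
sample = PixelSample(x=120, y=45, rgb=(128, 128, 128))
```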
[0039] The automated panoramic head positioned on a tripod is configured to automatically rotate the image capturing device in 360 degrees. The automated panoramic head can be positioned at multiple locations within the region of interest to capture the plurality of panoramas from the multiple locations. FIG 2 is a schematic illustration of the automated panoramic head. The automated panoramic head comprises an adapter tripod 201, an adapter gear boom 202, a plurality of motor mounting plates 203, a plurality of ball bearings 204, a motor and gearbox 205, and a twin gear arrangement 206.
[0040] The adapter tripod houses the plurality of ball bearings, which connect the lower stationary part to the automated panoramic head. The adapter gear boom connects the tripod to the ball bearing housed in the adapter tripod. The adapter gear boom has a protruding top part, which extends through the mounting plate to the bearing in the adapter tripod and also holds one of the plastic gears on its axis. The plurality of motor mounting plates connect the motor and gearbox to the adapter tripod bearing. The plurality of mounting plates have holes of various diameters to allow the adapter gear boom easy access to the bearing at the top and to allow the motor axle to pass through. Moreover, the plurality of mounting plates also provides the structure to attach the control unit and battery. The ball bearing allows differential motion between the stationary lower half and the moving upper half. The motor and gearbox, a stepper motor with a speed reduction gearbox, provide the much needed rotational power to the setup. The twin gear arrangement is exactly 1:1 and hence doesn't multiply speed or torque. The gear arrangement is in place to avoid any axial load on the gearbox and motor. Motors are very sensitive to axial loads, which lead to breakdown. The gear arrangement makes sure that the motor only needs to provide torque to the system.
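Because the twin gear arrangement is exactly 1:1, the angle swept by the head equals the gearbox output angle. The sketch below illustrates the resulting step arithmetic for illustration only; the step angle and gearbox reduction are assumed example values, not figures from the specification.

```python
# A minimal sketch of the step arithmetic implied above (values are assumptions).
MOTOR_STEPS_PER_REV = 200      # e.g. a 1.8-degree stepper (assumed)
GEARBOX_REDUCTION   = 16       # speed-reduction ratio (assumed)
TWIN_GEAR_RATIO     = 1        # exactly 1:1, as described above

steps_per_head_rev = MOTOR_STEPS_PER_REV * GEARBOX_REDUCTION * TWIN_GEAR_RATIO

def steps_for_angle(degrees: float) -> int:
    """Number of motor steps needed to swivel the head by `degrees`."""
    return round(steps_per_head_rev * degrees / 360.0)

print(steps_for_angle(90))     # 800 steps for a quarter turn with these assumed values
```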
[0041] The automated panoramic head, as described above, swivels through 360 degrees about its axis with the help of the stepper motor to allow the image capturing device to capture the plurality of panoramas. The adapter tripod is configured to maintain the focal point of the 2D image capturing device stationary, thereby eliminating parallax during panoramic image stitching. The automated panoramic head is adjusted in such a manner that, while facing in a particular direction, little or no part of the panoramic head comes within the Field of View (FOV) of the image capturing device. The image capturing device includes a wide angle lens with a FOV of 180 degrees; therefore, nothing can be placed directly above or below the wide angle lens.
[0042] FIG 3 is a schematic illustration of the control circuit 300 to control the automated panoramic head in accordance with the present invention. The control unit comprises a processor 302 powered by a battery 304. Further, the processor is operatively coupled to a motor shield 306 and a relay circuit 310. The relay circuit is further connected to the stepper motor 308.
[0043] In another embodiment, the relay circuit comprises a relay switch and a trigger circuit.
[0044] In order to capture the 360 degree panoramic view from a location, the image capturing device is positioned at zero degrees. Thereafter, a signal is sent to the relay circuit 310 to trigger the shutter of the image capturing device to capture a first image from the location in the region of interest.
[0045] Thereafter, the processor 302 instructs, after a certain delay, the stepper motor 308 to rotate through a given number of steps to position the image capturing device at a certain angle from the first image capturing direction. The image capturing device then captures the next image at 90 degrees from the first image capturing direction. A counter keeps track of the number of times the relay circuit has been triggered. The image capturing device stops capturing images after 4 triggers from the relay circuit, thus completing a full 360 degree shot from its present location. The reset button is then triggered to reset the count of the relay circuit to zero, and the process is repeated for the next location.
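As an illustrative aid only, the following sketch outlines one possible realisation of the capture sequence just described: trigger the shutter through the relay, rotate the stepper by a quarter turn, and stop after four triggers. The relay and stepper interfaces and the settle delay are hypothetical placeholders, not taken from the specification.

```python
# A minimal sketch of the capture loop described above. The `relay` and
# `stepper` objects and the settle delay are hypothetical placeholders.
import time

def capture_sweep(relay, stepper, steps_per_quarter_turn: int, settle_s: float = 2.0) -> None:
    trigger_count = 0                               # counter of relay triggers
    while trigger_count < 4:                        # four shots complete the 360-degree sweep
        relay.trigger_shutter()                     # fire the shutter of the image capturing device
        trigger_count += 1
        if trigger_count < 4:
            time.sleep(settle_s)                    # delay before repositioning
            stepper.rotate(steps_per_quarter_turn)  # advance the head by 90 degrees
    trigger_count = 0                               # reset the count before the next location
```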
[0046] The plurality of 2D images captured by the image capturing device from multiple locations are merged into a single space that corresponds to the positions of their data relative to one another in the physical world. Since the 2D image capturing devices are not calibrated, the position of the panoramas can be estimated only up to a relative pose, which would be inadequate to describe the 3D model up to a global scale.
[0047] Further, a plurality of marking cues are placed in the region of interest to denote the various locations from which panoramic images are to be captured. Whenever a particular panorama is captured by the image capturing device, the image contains the images of the plurality of marking cues which were previously placed in the region of interest.
[0048] FIG 4 is an illustration of mechanical parts along with the twin gear arrangement of the automatic panoramic head in accordance with the present invention. The motor mounting plate 203 is disposed between the adapter gear boom 202 and the adapter tripod 201 bearing. Further, the motor mounting plate is operatively coupled with the stepper motor.
[0049] FIG 5 illustrates an image processing device for processing and generating 2.5D content once the 2D images have been captured. The image processing device is operatively coupled to the image capturing device to receive the plurality of captured 2D panoramic images. The image processing device comprises a processor 501 to process the panoramic images and an image viewer 502 to view the processed panoramic images in a 2D or a high definition 3D space.
[0050] In another embodiment, the image processing device comprises a home tab, an image tab, a galaxy tab and a main window (not shown in the figures).
[0051] FIG 6 describes a schematic illustration of the plurality of marking cues which help in positioning the plurality of 2D images in a single coordinate system. The height of the focal point of the capturing device from the floor is measured. The first panorama is warped spherically and placed at the origin of the physical world, at a height equal to the aforementioned height. The 2D images of the marking cues depicting the adjacent locations from which images have been captured are also visible, which implies that the floor markers in the physical world would lie on a ray from the center of the panoramic sphere passing through the image of the floor markers. Since these marking cues lie in the region of interest, and the center of the sphere is at the height of the focal point of the image capturing device, the location of the adjacent 2D images would lie where the ray meets the floor plane of the coordinate system. Having obtained the location of a second 2D image in the physical world coordinate system, where the location of the first 2D image is assumed to be at the origin, it becomes imperative that both of these scenes are properly aligned so that the scenes overlap and matching points or areas with similar characteristics would be in a similar position. In one particular example, the implementation of this process would include rotating the second 2D panoramic sphere about its center. From the point of view of the second 2D image the floor marker of the first scene is clearly visible, which implies, as already stated above, that a ray from the center of the panoramic sphere through the floor marker associated with the first sphere would depict the location of the first sphere where the ray cuts the floor plane of the coordinate system. Since the location of the first sphere is already known (the origin), rotating the second panoramic sphere about the normal to the floor plane such that the point of intersection of the ray coincides with the location of the first panoramic sphere would align both the 2D images. This entails that, when viewed from the center of the second panoramic image, the floor marker associated with the first sphere and its actual location in the coordinate system would coincide perfectly.
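For illustration only, the following sketch shows one assumed implementation of the ray-projection step described above: a floor-marker pixel in an equi-rectangular panorama is converted to a viewing direction from the sphere centre, and the ray through that direction is intersected with the floor plane z = 0 to locate the adjacent capture position. The coordinate conventions, helper names and example numbers are assumptions, not the specification's exact method.

```python
# A minimal sketch of locating an adjacent panorama by ray-projecting a
# floor-marker pixel onto the floor plane (z = 0); conventions are assumed.
import math

def pixel_to_direction(u, v, width, height):
    """Map an equi-rectangular pixel to a unit viewing direction (z points up)."""
    lon = (u / width) * 2.0 * math.pi - math.pi      # -pi .. +pi across the image
    lat = math.pi / 2.0 - (v / height) * math.pi     # +pi/2 (top) .. -pi/2 (bottom)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def ray_floor_intersection(centre, direction):
    """Intersect a ray from the sphere centre with the floor plane z = 0."""
    cx, cy, cz = centre
    dx, dy, dz = direction
    if dz >= 0:
        raise ValueError("ray does not reach the floor")  # marker must lie below the horizon
    t = -cz / dz
    return (cx + t * dx, cy + t * dy, 0.0)

# Example (assumed values): the first sphere sits above the origin at a focal
# height of 1.5 m, and the adjacent location's marker appears at pixel
# (2600, 1400) of a 4096x2048 panorama; the intersection estimates the second
# sphere's position on the floor plane.
direction = pixel_to_direction(2600, 1400, 4096, 2048)
second_location = ray_floor_intersection((0.0, 0.0, 1.5), direction)
```

The subsequent alignment step would then amount to rotating the second sphere about the vertical axis until the direction in which it sees the first sphere's marker points towards the first sphere's known location.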
[0052] Having obtained the position and the alignment of the second 2D image perfectly to the global scale, the pose of the other 2D images can be easily determined by the aforementioned steps. Thus, there would be a single 3D coordinate system representing the 3D physical world with multiple 2D scenes perfectly placed and aligned, forming a 2.5D composite scene, i.e., each individual scene is 2D; however, the scenes are located in a 3D physical environment.
[0053] FIG 7 is a flowchart depicting a method for processing of a plurality of panoramic images in accordance with the embodiments of the present invention. In step 701, a plurality of panoramic images is acquired and processed to create a discretized and quantized three-dimensional (3D) space. The processing comprises positioning (in step 702) the plurality of panoramic images in a single coordinate system by means of a plurality of marking cues placed in a region of interest and projecting the first image on a sphere at the point (0,0,0); locating (in step 703) a pixel representing a second point in the first image; determining (in step 704) the position of the second image by ray-projecting the pixel on a floor plane; registering (in step 705) the second image; and looping (in step 706) for every succeeding point to correctly position each image.
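A high-level sketch of the loop of FIG 7 is given below for illustration; the helper functions passed in are hypothetical placeholders for the marker-detection and ray-projection steps described above, not names from the specification.

```python
# A minimal sketch of the registration loop of FIG 7 (steps 702-706).
# `locate_marker_pixel` and `position_from_pixel` are hypothetical helpers.
def register_panoramas(panoramas, locate_marker_pixel, position_from_pixel):
    registered = {0: (0.0, 0.0, 0.0)}          # step 702: first sphere projected at the origin
    for i in range(1, len(panoramas)):         # step 706: loop over every succeeding point
        u, v = locate_marker_pixel(panoramas[i - 1], target=i)            # step 703
        registered[i] = position_from_pixel(registered[i - 1], u, v)      # steps 704 and 705
    return registered
```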
[0054] In another embodiment, the home tab is configured to load the images captured by the 2D image capturing device. The home tab of the image processing device performs the following functions, namely, adding a plurality of equi-rectangular panorama images, saving the panorama images as lower resolution panorama images, converting the plurality of equi-rectangular panorama images to cubic panorama images, and performing a quality check to eliminate errors.
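By way of illustration only, the sketch below shows one straightforward assumed way of converting an equi-rectangular panorama to a single face of a cubic panorama; the specification does not prescribe this particular mapping.

```python
# A minimal sketch: sample the front (+X) face of a cubic panorama from an
# equi-rectangular image by mapping each face pixel to a viewing direction
# and then to equi-rectangular coordinates (conventions are assumed).
import math
import numpy as np

def front_face_from_equirect(equi: np.ndarray, face_size: int) -> np.ndarray:
    h, w = equi.shape[:2]
    face = np.zeros((face_size, face_size, equi.shape[2]), dtype=equi.dtype)
    for j in range(face_size):
        for i in range(face_size):
            # Direction through this face pixel (x forward, y right, z up).
            x = 1.0
            y = 2.0 * (i + 0.5) / face_size - 1.0
            z = 1.0 - 2.0 * (j + 0.5) / face_size
            lon = math.atan2(y, x)                       # longitude of the direction
            lat = math.atan2(z, math.hypot(x, y))        # latitude of the direction
            u = int((lon + math.pi) / (2 * math.pi) * (w - 1))
            v = int((math.pi / 2 - lat) / math.pi * (h - 1))
            face[j, i] = equi[v, u]                      # nearest-neighbour sample
    return face
```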
[0055] The image tab is configured to perform image processing operations. The image tab comprises an HDR button, a batch operation button and a panorama button. The HDR button triggers the process of receiving multi-exposure images, generating high dynamic range images and conducting a tone mapping operation. In an embodiment of the present disclosure, the tone mapping operation changes the RGB values of the pixels, thus affecting the contrast, saturation and brightness of the images, and converts the images to TIFF or JPEG. The HDR process also determines and saves the EXIF data for the final HDR tone-mapped images. In an embodiment, a user can create their own custom presets and apply those again during batch operation. The batch operation button receives multi-exposure images of multiple scenes and performs the HDR and tone mapping operations upon those images. The batch operation button saves the time otherwise lost in uploading and performing operations on one image at a time. The panorama button receives input images captured from one particular view point and returns their stitched equi-rectangular panorama as output.
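As one possible realisation (not the specification's implementation), the sketch below merges a multi-exposure bracket into a high dynamic range image and tone-maps it back to an 8-bit image using OpenCV's photo module; the file names and exposure times are assumed.

```python
# A minimal sketch of HDR merging and tone mapping using OpenCV (one possible
# realisation; parameters and file names are assumed).
import cv2
import numpy as np

def hdr_tonemap(image_paths, exposure_times_s, gamma=2.2):
    images = [cv2.imread(p) for p in image_paths]              # multi-exposure shots of one scene
    times = np.array(exposure_times_s, dtype=np.float32)
    hdr = cv2.createMergeDebevec().process(images, times)      # merge into an HDR radiance map
    ldr = cv2.createTonemap(gamma).process(hdr)                 # tone map to roughly [0, 1]
    return np.clip(ldr * 255, 0, 255).astype(np.uint8)          # 8-bit output for TIFF/JPEG saving

# Batch operation: the same call can be repeated for every scene's exposure bracket, e.g.
# result = hdr_tonemap(["low.jpg", "mid.jpg", "high.jpg"], [1/200, 1/50, 1/10])
```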
[0056] The galaxy tab contains all galaxy-specific functions. A galaxy is a set of panorama images which share a common spatial characteristic. In an embodiment, all panorama images taken from various points within a bedroom are categorized under a single galaxy, for example a Bedroom galaxy. A galaxy thus contains within itself a number of spheres, or it may also contain child galaxies. For example, a Bedroom galaxy, a Drawing Room galaxy and a Kitchen galaxy may be categorized under an Apartment 1 galaxy. The galaxy which contains each and every other galaxy is called the Root Galaxy. When a project is initialized and all panorama images are loaded, there are no galaxies created yet, and all spheres/panorama images come under the Root Galaxy.
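For illustration, one assumed in-memory representation of this galaxy hierarchy could look as follows; the field names and example titles are hypothetical.

```python
# A minimal sketch of the galaxy hierarchy: a galaxy holds panorama spheres and
# optional child galaxies, and every project starts with a single Root Galaxy.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Galaxy:
    title: str
    spheres: List[str] = field(default_factory=list)        # panorama image identifiers
    children: List["Galaxy"] = field(default_factory=list)  # child galaxies

# On project initialization all spheres belong to the Root Galaxy ...
root = Galaxy("Root Galaxy", spheres=["pano_01", "pano_02", "pano_03"])

# ... and can later be organised into child galaxies, e.g. the rooms of an apartment.
apartment = Galaxy("Apartment 1")
apartment.children.append(Galaxy("Bedroom", spheres=["pano_01"]))
apartment.children.append(Galaxy("Kitchen", spheres=["pano_02"]))
root.children.append(apartment)
```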
[0057] The add button adds galaxies to the Root Galaxy or to any of its child galaxies. The remove button removes any galaxy; the Root Galaxy, however, cannot be removed. The color button assigns a different color as an identifier for different galaxies. The edit button edits the properties of a galaxy, including but not limited to its title, the panorama images it contains, a description, a start galaxy (which determines which galaxy should be navigated to if a particular galaxy is desired to be viewed), a start sphere (which determines which particular sphere/panorama image should be displayed first when this particular galaxy is desired to be viewed), and a start orientation (which particular direction of the panorama image should be viewed once the desired sphere is reached). The move-up and move-down buttons (not shown in the figure) are used to move a galaxy up or down the galaxy hierarchy. The mark button marks each sphere and assigns it to a particular galaxy: a galaxy is selected first, and all those spheres which are thereafter "marked" are categorized under the selected galaxy. The map button is used to select a particular image as a floorplan or site layout for a particular project. All the galaxies which have been created are thereby marked at their respective positions on this "map". The viewing direction in the spheres/panorama images is also aligned with the viewing direction in the map. The edge button generates edges between all the connecting spheres. A connection from one sphere to the next implies that a transition/jump is possible from the first sphere to the latter; however, the reverse is not true if the reverse connection is not made. These edges serve as the basis for the auto-tour generation. The edges provide guidelines to the user for possible connections which could be used to generate an auto-tour (an automatic transition from one sphere to the next) for viewing the entire apartment or the physical space. This functionality is achieved by the auto-tour button.
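For illustration only, the sketch below shows one assumed representation of these directed sphere connections and a simple walk over them that yields an auto-tour order; the sphere names are hypothetical.

```python
# A minimal sketch of directed sphere connections and an auto-tour walk over them.
from collections import defaultdict

edges = defaultdict(list)              # directed adjacency list: sphere -> reachable spheres
edges["bedroom_1"].append("bedroom_2")
edges["bedroom_2"].append("hallway")
edges["hallway"].append("kitchen")

def auto_tour(start):
    """Follow outgoing edges from `start`, visiting each sphere once."""
    order, seen, stack = [], set(), [start]
    while stack:
        sphere = stack.pop()
        if sphere in seen:
            continue
        seen.add(sphere)
        order.append(sphere)
        stack.extend(reversed(edges[sphere]))   # preserve the insertion order of connections
    return order

print(auto_tour("bedroom_1"))          # ['bedroom_1', 'bedroom_2', 'hallway', 'kitchen']
```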
[0058] The image processing device provides a 3D graphics viewer as well as a 2D image viewer. The 3D viewer consists of a 3D coordinate system as a reference frame which helps in the global alignment of all the panorama images added from the home tab menu. Interactive functionalities in the viewer help with determining the location of the adjacent panorama images, their orientation, their connections and their categorization into different galaxies as required. The 2D image viewer helps with viewing the resultant output images from HDR image generation and panoramic stitching.
[0059] The 3D image viewer is used for visualizing the 2.5D graphic content robustly across all platforms, including but not limited to the web, smartphones or head-mounted displays (HMDs), thus providing an immersive experience to all users. Viewing normal 3D content on the web, smartphones or HMDs requires a lot of bandwidth and data consumption. It is thus preferable to have these contents pre-loaded on those devices. The graphics visualization algorithms and content loader algorithms drastically reduce data consumption without inhibiting the user experience of any kind, thus enabling users to view any particular project on the web or any other device with just a link to that project.
[0060] The present disclosure is applicable to all types of on-chip and off-chip memories used in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations thereof. Apparatus of the invention can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method actions can be performed by a programmable processor executing a program of instructions to perform functions of the invention by operating on input data and generating output. The invention can be implemented advantageously on a programmable system including at least one input device and at least one output device. Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language.
[0061] Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory. Generally, a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks and cards, such as internal hard disks and removable disks and cards; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of volatile and non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; CD-ROM and DVD-ROM disks; and buffer circuits such as latches and/or flip-flops. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits), FPGAs (field-programmable gate arrays) and/or DSPs (digital signal processors).
[0062] It will be apparent to those having ordinary skill in this art that various modifications and variations may be made to the embodiments disclosed herein, consistent with the present
disclosure, without departing from the spirit and scope of the present disclosure. Other embodiments consistent with the present disclosure will become apparent from consideration of the specification and the practice of the description disclosed herein.
We Claim:
1. A system for capturing and processing of a plurality of panoramic images, said system comprising:
an automated panoramic head (101) operatively coupled to an image capturing device (102), wherein said automated panoramic head is configured to rotate the image capturing device in 360 degrees and capture a plurality of panoramic images from a plurality of locations within a region of interest; and
an image processing device (103) configured to process the panoramic images and create a discretized and quantized three-dimensional (3D) space.
2. The system as claimed in claim 1, wherein the plurality of panoramic images captured are two-dimensional (2D) images.
3. The system as claimed in claim 1, wherein the automatic panoramic head is positioned on an adapter tripod to maintain a focal point of the image capturing device.
4. The system as claimed in claim 1, wherein the automated panoramic head comprises a control unit (104) to control the rotation of the automated panoramic head.
5. The system as claimed in claim 1, wherein the control unit triggers a shutter of the image capturing device to capture a first image from the location of the region of interest.
6. An automatic panoramic head (101) to capture a plurality of panoramic images, said panoramic head comprising:
an image capturing device (102) positioned on said automatic panoramic head;
an adapter tripod (201) housing a plurality of ball bearings (204), to connect the lower stationary part of the automatic panoramic head;
an adapter gear boom (202) comprising a first part and a second protruding part, the first part being removably attached to the image capturing device and the second protruding part being rotatably mounted within the adapter tripod through the plurality of ball bearings;
a motor mounting plate (203) disposed between said adapter tripod (201) and a motor & gear box (205), comprising a plurality of holes of various diameters to allow the adapter gear boom access to the bearing at the top and to allow the motor axle to pass through;
a control unit disposed over the motor mounting plate to control the movement of the motor; and
a twin gear arrangement (206) to transfer the rotational motion from the motor & gear box to the adapter gear boom and to ensure that the motor provides a predetermined torque;
wherein the motor is configured to swivel the panoramic head and the image capturing device through 360 degrees about its axis to capture the plurality of panoramic images.
7. The panoramic head as claimed in claim 6, wherein the adapter tripod is configured to maintain the focal point of the image capturing device stationary, so as to eliminate parallax during panoramic stitching.
8. The panoramic head as claimed in claim 6, wherein the motor rotates the image capturing device through a given number of steps to position the image capturing device at a certain angle from a first image capturing direction.
9. The panoramic head as claimed in claim 6, wherein the automated panoramic head is adjusted in a particular direction so that no part of the panoramic head comes within the Field of View (FOV) of the image capturing device.
10. The panoramic head as claimed in claim 6, wherein the motor is a stepper motor.
11. An image processing device (103) for processing of a plurality of panoramic images, said device comprising:
an image capturing device (102) to capture a plurality of panoramic images;
a processor (501) to process the plurality of panoramic images and create a discretized and quantized three-dimensional (3D) space; and
an image viewer (502) to view said processed panoramic images;
wherein the processor is configured to:
position the plurality of panoramic images in a single coordinate system by means of a plurality of marking cues placed in a region of interest;
project the first image on a sphere at the point (0,0,0);
locate a pixel representing a second point in the first image;
determine the position of the second image by ray-projecting the pixel on a floor plane;
register the second image; and
loop for every succeeding point to correctly position each image.
12. The device as claimed in claim 11, wherein the device rotates the plurality of panoramic images about the floor plane such that the point of intersection of the ray coincides with the location of the first panoramic sphere.
13. The device as claimed in claim 11, wherein the plurality of panoramic images captured are two-dimensional (2D) images.
14. The device as claimed in claim 11, wherein the image viewer is a 3D viewer or a 2D viewer.
15. A method for capturing and processing of a plurality of panoramic images, said method comprising:
acquiring, by an image capturing device (102), a plurality of panoramic images;
processing, in a processor (501), the plurality of panoramic images and creating a discretized and quantized three-dimensional (3D) space; and
viewing, in an image viewer (502), the processed panoramic images;
wherein the processing comprises:
positioning the plurality of panoramic images in a single coordinate system by means of a plurality of marking cues placed in a region of interest;
projecting the first image on a sphere at the point (0,0,0);
locating a pixel representing a second point in the first image;
determining the position of the second image by ray-projecting the pixel on a floor plane;
registering the second image; and
looping for every succeeding point to correctly position each image.