ABSTRACT
The present disclosure discloses a method and system for detecting the condition of an optical lens. The detection system includes a light source, a lens holder, an image capturing unit and an analyzing unit. The image capturing unit captures an image of the optical lens upon uniform illumination of the optical lens. The analyzing unit identifies zones of the optical lens based on diameter information and type of lens using the captured image. The analyzing unit identifies structures and detects the structures in each zone as one or more scratches when the value of at least one parameter associated with each structure is greater than a threshold value of a corresponding parameter. The analyzing unit detects the condition of the optical lens, such as acceptable, not acceptable, and partially acceptable, based on the number of scratches. In this manner, the present disclosure facilitates customers to decide whether the optical lens may be used or replaced. FIG. 1a
TECHNICAL FIELD
The present subject matter is generally related to lens inspection and more particularly, but not exclusively, to a method and system for detecting the condition of an optical lens.
BACKGROUND
Lenses are typically made with a high degree of precision and accuracy. Recently, several automated systems have been used for manufacturing optical lenses. These automated systems are more accurate and precise and avoid irregularities in the lens. Nevertheless, on rare occasions, a lens may develop irregularities during manufacturing. For this reason, optical lenses must be inspected before sale to ascertain that the lenses are acceptable for use by the customer.
There are several automated systems for inspecting scratches/irregularities present in an optical lens. These systems are installed at the end of production lines to identify scratches in the lens soon after manufacturing. Conventionally, the inspection system employs an imaging system to identify the scratches in the lens. The inspection system directs a light beam onto the surface of the optical lens, and an image reflecting the surface of the optical lens is captured. The images are analyzed either manually or automatically to identify scratches present in the optical lens. In certain scenarios, the optical lenses are manufactured with certain coatings. For example, the optical lens may be coated with an anti-reflective coating, an anti-scratch coating or an anti-fog coating. Due to the arrangement of the light, the lighting conditions of the environment, or the different types of coatings present on the optical lens, the images captured by these inspection systems are not clear and fail to accurately identify the irregularities present in the optical lens.
Further, automated lens inspection systems are employed only at large-scale manufacturing units, and there are no systems for identifying scratches at retailer units. Further, the optical lens used by a customer may develop minor scratches, and the customer may not be aware of whether to continue using the optical lens or replace it. Also, conventionally, there are no systems to identify the scratches present on an optical lens and to predict the lifetime of optical lenses that are used by consumers.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
Disclosed herein is a detection system for detecting the condition of an optical lens. The detection system comprises a lens holder, a light source, an image capturing unit and an analyzing unit. The lens holder is configured for placing the optical lens for inspection. The light source is placed parallel to the optical lens such that light rays from the light source pass tangentially to the surface of the optical lens, thereby illuminating the optical lens uniformly. Once illuminated, the image capturing unit captures an image of the optical lens. The analyzing unit identifies one or more zones of the optical lens based on the type of the optical lens using the captured image, wherein the type of the optical lens is determined using diameter information of the optical lens obtained from the captured image, and thereafter identifies one or more structures in each of the one or more zones. Further, the analyzing unit detects the one or more structures in each of the one or more zones as one or more scratches in each of the one or more zones when the value of at least one of one or more parameters associated with the one or more structures is greater than a threshold value of a corresponding parameter. Once the scratches are detected, the analyzing unit detects the condition of the optical lens based on the number of scratches detected in each of the one or more zones.
Disclosed herein is a method for detecting the condition of an optical lens. The method comprises capturing, by an image capturing unit of a detection system, an image of an optical lens placed on a lens holder of the detection system. Thereafter, the method comprises identifying, by an analyzing unit of the detection system, one or more zones of the optical lens based on the type of the optical lens using the captured image, wherein the type of the optical lens is determined using diameter information of the optical lens. Once the zones are identified, the method comprises identifying, by the analyzing unit, one or more structures in each of the one or more zones. Upon identifying the structures, the method comprises detecting the one or more structures in each of the one or more zones as one or more scratches in each of the one or more zones when the value of at least one of one or more parameters associated with the one or more structures is greater than a threshold value of a corresponding parameter. Further, the method comprises detecting, by the analyzing unit, the condition of the optical lens based on the number of scratches detected in each of the one or more zones.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
Fig.1a shows an exemplary view of a detection system for detecting condition of an optical lens in accordance with some embodiments of the present disclosure.
Fig.1b shows a block diagram of an analyzing unit in accordance with some embodiments of the present disclosure.
Fig.2 shows a flowchart illustrating a method for pre-processing captured image in accordance with some embodiments of the present disclosure.
Figs.3a-3f show exemplary representations of categorization of zones for various type of optical lens in accordance with some embodiments of the present disclosure.
Fig.4 shows a flowchart illustrating a method for detecting condition of an optical lens in accordance with some embodiments of the present disclosure.
Figs.5a-5c show exemplary representations of zones and number of scratches in each zone in accordance with some embodiments of the present disclosure.
Fig.5d shows an exemplary report comprising information of each of one or more zones in an optical lens and total number of values of parameters associated with scratches identified in each of the one or more zones in accordance with some embodiments of the present disclosure.
Fig.6 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiment thereof has been shown by way of example in the drawings and will be described in detail below. It should be understood, however that it is not intended to limit the disclosure to the specific forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, “includes”, “including” or any other variations thereof are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
The present disclosure discloses a detection system and a method for detecting the condition of an optical lens. The detection system includes a light source, a lens holder, an image capturing unit and an analyzing unit. The light source may include a side ring of Light Emitting Diodes (LEDs) arranged at the bottom of the detection system. The ring LED may be arranged in a circular portion with a dark background. The lens holder may be configured to hold the lens under observation. The light source may be placed such that light rays from the light source pass tangentially to the surface of the optical lens for uniform illumination of the optical lens. The image capturing unit may capture an image of the optical lens upon illumination of the optical lens. The analyzing unit may identify one or more zones of the optical lens based on the type of the optical lens using the captured image, wherein the type of the optical lens is determined using diameter information of the optical lens. As an example, the one or more zones may be an inner zone, an intermediate zone, and an outer zone. Once the zones are identified, the analyzing unit may identify one or more structures in each of the one or more zones. Thereafter, the analyzing unit may detect the one or more structures in each of the one or more zones as one or more scratches when the value of at least one of one or more parameters associated with the one or more structures is greater than a threshold value of a corresponding parameter. The one or more parameters may be the width, height, area, perimeter and XY coordinates of the structures. Once the scratches are detected, the analyzing unit may detect the condition of the optical lens based on the number of scratches detected in each of the one or more zones. The condition may be one of “acceptable”, “not acceptable”, or “partially acceptable”. The analyzing unit may also generate a report which comprises information on the number of scratches in each zone, the condition of the optical lens and the lifetime or life expectancy of the optical lens based on the number of scratches. In this manner, the present disclosure discloses a method and system for detecting the condition of the lens based on the number of scratches, which facilitates customers to decide whether the optical lens may be used or replaced based on the life expectancy of the optical lens.
Fig.1a shows an exemplary view of a detection system for detecting condition of an optical lens in accordance with some embodiments of the present disclosure.
The detection system 100 comprises a lens holder 101, a light source 102, an image capturing unit and an analyzing unit 105. The image capturing unit may be a camera 103. The detection system 100 also comprises other components such as a guide rod 113 comprising a camera holder 111 to hold the camera 103 and a base 115 to hold the lens holder 101. The lens holder 101 may be configured to hold a lens such as a raw cut lens, for example, the optical lens 107. The lens holder 101 may also be configured to hold the spectacle 109. To illustrate and as an example, the method is described by considering the optical lens 107. However, the method may also be applicable to the spectacle 109 and any other lens. The light source 102 may include a side ring of Light Emitting Diodes (LEDs) [also referred to as a ring light] arranged at the bottom of the detection system 100 towards the base 115. The ring LED may be arranged in a circular portion with a dark background [not shown in Fig.1a]. The lens holder 101 may be configurable to accommodate a variety of lenses based on the size and type of lens. The lens holder 101 may have a non-reflective surface to avoid reflection from external sources. The light source 102 may be placed parallel to the optical lens 107 such that light rays from the light source 102 pass tangentially to the surface of the optical lens 107 for uniform illumination of the optical lens 107 and also to avoid internal reflection caused due to coatings on the optical lens 107. As a non-limiting example, the optical lens 107 may include, but is not limited to, a bifocal lens, a progressive lens, a single vision lens, a D-bifocal lens and a Kryptok bifocal lens.
In an embodiment, once the optical lens 107 is placed on the lens holder 101, dust particles on the lens holder 101 may be removed using a blower [not shown in Fig.1a] configurable in the detection system 100. The camera 103 is placed above the lens holder 101. Once the optical lens 107 is illuminated by the light source 102, the camera 103 captures an image of the optical lens 107. Due to the placement of the lens holder 101 at the bottom, the light source 102 parallel to the optical lens 107 and the camera 103 on top of the lens holder 101, uniform illumination of the optical lens 107 is achieved and total internal reflection caused due to coatings on the optical lens 107 is avoided. The captured image is provided to the analyzing unit 105.
In an embodiment, the analyzing unit 105 identifies one or more zones of the optical lens 107 based on the type of the optical lens 107 and diameter information of the optical lens 107 using the captured image. As an example, the one or more zones may be an inner zone, an intermediate zone, and an outer zone. The one or more zones may be identified using a machine learning model. The machine learning model may be trained using images of different types of optical lenses 107 and diameter information of the optical lens 107. Once the one or more zones are identified, the analyzing unit 105 may identify one or more structures in each of the one or more zones. The one or more structures may be abnormal structures such as scratches or dents in the optical lens 107. The one or more structures may be identified based on pixel value differences at subsequent locations of the captured image. The analyzing unit 105 may detect the one or more structures as the one or more scratches when the value of at least one of the one or more parameters associated with the one or more structures is greater than a threshold value of a corresponding parameter. The one or more parameters may be the width, height, area, perimeter and XY coordinates of the structures. Based on the number of scratches in each of the one or more zones, the analyzing unit 105 may detect the condition of the optical lens 107. The condition may be “acceptable”, “not acceptable”, or “partially acceptable”. The analyzing unit 105 may also provide information on the lifetime of the optical lens 107 based on the number of scratches in each of the one or more zones. As an example, the height of a structure may be 1 mm and the threshold value may be 2 mm. Since the height of the structure is less than the threshold value, the structure may not be considered a scratch. However, if the height of the structure is 2.5 mm, which is greater than the threshold value, the structure may be considered a scratch.
Fig. 1b shows a block diagram of the analyzing unit 105. The analyzing unit 105 may include an I/O interface 121, a processor 123 and a memory 125. The I/O interface 121 may be configured to receive the captured image from the image capturing unit 103. The captured image may be stored in the memory 125 and processed by the processor 123 for detecting the condition of the lens. The analyzing unit 105 may include data 127 and modules 129. As an example, the data 127 is stored in the memory 125 configured in the analyzing unit 105 as shown in Fig.1b. In one embodiment, the data 127 may include image data 131, scratches data 133, condition data 135 and other data 137. The modules 129 illustrated in Fig.1b are described herein in detail.
In some embodiments, the data 127 may be stored in the memory 125 in form of various data structures. Additionally, the data 127 can be organized using data models, such as relational or hierarchical data models. The other data 137 may store data, including temporary data and temporary files, generated by the modules 129 for performing various functions of the analyzing unit 105.
In some embodiments, the data 127 stored in the memory 125 may be processed by the modules 129 of the analyzing unit 105. The modules 129 may be stored within the memory 125. In an example, the modules 129, communicatively coupled to the processor 123 configured in the analyzing unit 105, may also be present outside the memory 125 as shown in Fig.1b and be implemented as hardware. As used herein, the term modules 129 may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor 123 (shared, dedicated, or group) and memory 125 that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In some embodiments, the modules 129 may include, for example, a pre-processing module 139, a zone identification module 141, a structure identification module 143, a scratch detection module 145, a condition detection module 147, and other modules 149. It will be appreciated that such aforementioned modules may be represented as a single module or a combination of different modules. In an embodiment, the other modules 149 may be used to perform various miscellaneous functionalities of the analyzing unit 105, such as a report generation module and a masking module. The report generation module may be configured to generate a summary report comprising information on the number of scratches in each zone and the condition of the optical lens. The masking module may be configured to mask the background and foreground regions in the image. The said modules 129, when configured with the functionality defined in the present disclosure, will result in novel hardware.
In an embodiment, the pre-processing module 139 may be configured to pre-process the captured image. The captured image may be stored as image data 131. The pre-processing may be performed prior to identifying the one or more zones in the captured image. The process of pre-processing is illustrated in Fig.2. At block 201, the captured image is cropped based on the region of interest. As an example, if the spectacle 109 is under observation, then only the lens region of the spectacle 109 may be cropped using, as an example, a wand tracing tool. At block 203, the image may be converted into an 8-bit gray scale image for easy processing of the image. At block 205, the analyzing unit 105 removes outliers from the captured image to remove unnecessary or unwanted pixels from the image. In an embodiment, once the pre-processing is performed, the image is segregated into foreground and background regions using a method which may include, but is not limited to, Otsu’s thresholding method, and thereafter the background region may be masked.
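As one illustration only, the pre-processing chain described above (cropping the region of interest, converting to an 8-bit gray scale image, removing outliers and masking the background using Otsu's thresholding) could be sketched with OpenCV and NumPy as follows; the function name `preprocess`, the percentile-based outlier clipping and the rectangular region of interest are assumptions made for illustration and are not part of the disclosure.

```python
# A minimal sketch of the described pre-processing, assuming OpenCV and NumPy;
# names and the outlier-clipping strategy are illustrative assumptions.
import cv2
import numpy as np

def preprocess(image_bgr, roi):
    """Crop the region of interest, convert to 8-bit grayscale,
    suppress outlier pixels and mask the background using Otsu's method."""
    x, y, w, h = roi                                   # region of interest (pixels)
    cropped = image_bgr[y:y + h, x:x + w]

    gray = cv2.cvtColor(cropped, cv2.COLOR_BGR2GRAY)   # 8-bit gray scale image

    # Remove outliers: clip intensities outside the 1st-99th percentile range.
    lo, hi = np.percentile(gray, (1, 99))
    gray = np.clip(gray, lo, hi).astype(np.uint8)

    # Otsu's thresholding separates foreground (lens) from background,
    # and the background region is masked out.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    foreground = cv2.bitwise_and(gray, gray, mask=mask)
    return foreground, mask
```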
In an embodiment, the zone identification module 141 may be configured to identify one or more zones of the optical lens 107 based on the type of the optical lens 107 using the pre-processed image, wherein the type is determined using diameter information of the optical lens 107. As an example, the one or more zones may include an inner zone, an intermediate zone, and an outer zone. The inner zone may be referred to as zone A, the intermediate zone may be referred to as zone B and the outer zone may be referred to as zone C. As an example, the types of optical lens 107 may be a high-powered single vision lens, a bifocal lens, a D-bifocal lens, a Kryptok bifocal lens and a progressive lens. The one or more zones may be identified using a machine learning model which is trained based on the diameter information of the optical lens 107 and on images of different types of the optical lens 107. As an example, the diameter of the lens may be 60 mm, 70 mm or 80 mm. Based on the diameter information, the type of the optical lens may be identified and, based on the type of optical lens, the one or more zones may be detected, as illustrated in the sketch following the lens-type descriptions below.
Single vision lens
The single vision lens has exactly one vision. It may be either for short sight or for long sight. Fig.3a illustrates the categorization of zones, such as zone A 301 (inner periphery) and zone A’ 303 (outer periphery), for short sight, and Fig.3b shows the corresponding zones, zone A 301 (inner periphery) and zone A’ 303 (outer periphery), for long sight.
Bifocal Lens
In a bifocal lens, as shown in Fig.3c, both visions (near and far sight) are present. Accordingly, the zones are categorized as an upper zone 305 and a lower zone 307.
D- Bifocal lens
A D-bifocal lens, as shown in Fig.3d, generally has two zones since it has two lens powers, for distance and near vision. Here, the lens segment for near vision correction has a half-moon shape. Accordingly, the zones are categorized as an outer zone 309 and an inner zone 311.
Kryptok bifocal lens
The Kryptok bifocal lens, as shown in Fig.3e, is similar to the D-bifocal lens in that it has two zones; the only difference is that the inner zone is formed as a circular region. Accordingly, the zones are categorized as an outer zone 313 and an inner zone 315.
Progressive lens
The progressive lens is divided into three zones, such as upper, lower, and intermediate regions, as the power of the lens gradually increases, as shown in Fig.3f. Accordingly, the zones are categorized as an outer zone 317, an intermediate zone 319 and an inner zone 321.
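As a minimal illustration of how such zone categorization might be realized for a single vision or progressive lens, the sketch below assigns pixels of the lens foreground mask to concentric inner (A), intermediate (B) and outer (C) zones based on radial distance from the lens centre. The radius fractions and the helper name `zone_masks` are illustrative assumptions; the trained machine learning model described above could replace this simple geometric rule.

```python
# A simplified geometric sketch: concentric zones around the lens centre.
# Radius fractions are assumptions, not values taken from the disclosure.
import numpy as np

def zone_masks(mask, inner_frac=0.4, mid_frac=0.75):
    """Split the lens foreground mask into inner (A), intermediate (B)
    and outer (C) zone masks based on radial distance from the centre."""
    ys, xs = np.nonzero(mask)                      # lens pixels from the Otsu mask
    cy, cx = ys.mean(), xs.mean()                  # approximate lens centre
    radius = (xs.max() - xs.min()) / 2.0           # approximate lens radius (diameter info)

    yy, xx = np.indices(mask.shape)
    dist = np.hypot(yy - cy, xx - cx)

    zone_a = (mask > 0) & (dist <= inner_frac * radius)
    zone_b = (mask > 0) & (dist > inner_frac * radius) & (dist <= mid_frac * radius)
    zone_c = (mask > 0) & (dist > mid_frac * radius)
    return {"A": zone_a, "B": zone_b, "C": zone_c}
```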
In an embodiment, once the one or more zones are identified, each of the one or more zones is cropped and the structure identification module 143 may identify one or more structures in each of the one or more zones. The structures may be abnormal structures such as scratches or dents. These structures may be identified by comparing pixel values at subsequent locations in the image.
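A hedged sketch of this structure identification step is given below: differences between pixel values at subsequent locations are thresholded and grouped into connected components, whose statistics are then passed on for the scratch check. The difference threshold and the helper name `find_structures` are assumptions for illustration only.

```python
# Candidate-structure identification by comparing pixel values at subsequent
# locations; the difference threshold is an illustrative assumption.
import cv2
import numpy as np

def find_structures(zone_gray, diff_threshold=12):
    # Differences between subsequent pixel locations (horizontal and vertical).
    dx = np.abs(np.diff(zone_gray.astype(np.int16), axis=1))
    dy = np.abs(np.diff(zone_gray.astype(np.int16), axis=0))

    candidate = np.zeros_like(zone_gray, dtype=np.uint8)
    candidate[:, 1:] |= (dx > diff_threshold).astype(np.uint8)
    candidate[1:, :] |= (dy > diff_threshold).astype(np.uint8)

    # Group candidate pixels into structures (connected components) and
    # return their bounding-box statistics for the subsequent scratch check.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(candidate * 255)
    return stats[1:], centroids[1:]                # drop the background label 0
```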
In an embodiment, the scratch detection module 145 may be configured to detect one or more scratches in each of the one or more zones based on the one or more structures using a thresholding technique. The one or more scratches are detected by comparing the value of one or more parameters associated with each of the one or more structures with a threshold value of a corresponding parameter. The one or more parameters may include the width, height, area, perimeter and XY coordinates of the structures. The value of each of these parameters is compared with a threshold value of the corresponding parameter. As an example, the width value of the structure is compared with a threshold width value. Similarly, the height value of the structure is compared with a threshold height value. When the value of at least one parameter is greater than the threshold value of the corresponding parameter, the structure is detected as a scratch. Once the scratches are detected, the scratches may be highlighted in each zone for easy identification. The identified scratches in each of the one or more zones may be stored as scratches data 133. As and when the scratches are identified in each zone, a counter may be maintained to monitor the number of scratches in each zone.
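The parameter-thresholding check could be sketched as below; the numeric threshold values and the helper names `is_scratch` and `count_scratches` are illustrative assumptions only.

```python
# A structure is counted as a scratch when at least one of its parameters
# exceeds the threshold of the corresponding parameter; thresholds are assumed.
THRESHOLDS = {"width": 2.0, "height": 2.0, "area": 4.0, "perimeter": 8.0}  # mm / mm^2

def is_scratch(params, thresholds=THRESHOLDS):
    """params: dict with width, height, area and perimeter of one structure."""
    return any(params[name] > limit for name, limit in thresholds.items())

def count_scratches(structures):
    """Maintain a per-zone count of detected scratches."""
    return sum(1 for s in structures if is_scratch(s))
```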
In an embodiment, the condition detection module 147 may be configured to detect the condition of the lens based on the number of scratches identified in each of the one or more zones. The condition detection module 147 may detect the condition of the optical lens 107 as one of “acceptable”, “partially acceptable”, or “not acceptable”. The condition detection module 147 may detect the condition of the optical lens 107 as “not acceptable” when the number of scratches in each of the one or more zones is greater than a first predefined number of scratches for the corresponding zone. The condition detection module 147 may detect the condition of the optical lens 107 as “partially acceptable” when the number of scratches in each of the one or more zones is less than the first predefined number of scratches for the corresponding zone. The condition detection module 147 may detect the condition of the optical lens 107 as “acceptable” when the number of scratches in each of the one or more zones is less than a second predefined number of scratches for the corresponding zone. As an example, the first predefined number of scratches may be 100 and the second predefined number of scratches may be 50 for one of the one or more zones, such as zone A. When the number of scratches is less than 50, the optical lens 107 may be detected as “acceptable” for zone A. When the number of scratches is less than 100, the optical lens 107 may be detected as “partially acceptable” for zone A. When the number of scratches is greater than the first predefined number of scratches, the optical lens 107 may be “not acceptable” for zone A. Similarly, the number of scratches is compared with the predefined number of scratches for each zone to detect the condition of the lens. The identified condition of the optical lens may be stored as condition data 135.
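A minimal sketch of this zone-wise condition logic, using the example first and second predefined numbers of scratches mentioned above (100 and 50 for zone A), might look as follows; the per-zone limits table and the function name `zone_condition` are assumptions for illustration.

```python
# Zone-wise condition decision; the same example limits are reused for every
# zone here purely for illustration, whereas in practice they may differ.
LIMITS = {"A": (50, 100), "B": (50, 100), "C": (50, 100)}  # (second, first) predefined numbers

def zone_condition(zone, num_scratches, limits=LIMITS):
    second, first = limits[zone]
    if num_scratches < second:
        return "acceptable"
    if num_scratches < first:
        return "partially acceptable"
    return "not acceptable"
```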
In an embodiment, when the condition of the optical lens 107 is detected, the analyzing unit 105 may generate a report based on the number of scratches. The report may comprise information on the number of scratches in each zone, the condition of the optical lens 107 and the lifetime or life expectancy of the optical lens 107 based on the number of scratches. As an example, Fig.5a shows one or more scratches in zone A 501, Fig.5b shows one or more scratches in zone B 503 and Fig.5c shows one or more scratches in zone C 505. As an example, the number of scratches in zone A 501 is 1, the number of scratches in zone B 503 is 31 and the number of scratches in zone C 505 is 449. The analyzing unit 105 may compare the number of scratches with the threshold number of scratches predefined for each zone and may detect the condition of the optical lens 107 as one of “acceptable”, “partially acceptable” or “not acceptable”.
In some embodiments, the identified data, such as the length, size, depth, width, perimeter and area of the scratches, may be exported to an Excel sheet or a database and stored therein, as shown in Fig.5d. The Excel sheet may also include information on the total number of scratches and the average size of the scratches in each zone. As an example, as shown in Fig.5d, there may be 449 scratches in zone C, the total area of the scratches may be 86911, the average size of the scratches may be 193.566, the percentage value of the area may be 0.819 and the perimeter value may be 56.080 in zone C. Similarly, the values for zone A and zone B are also indicated.
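As one illustration of such an export, assuming pandas with an Excel writer such as openpyxl is available, the per-zone summary could be written out as sketched below; the column names and output file name are assumptions, and the zone C row reuses the example values above.

```python
# Export the per-zone summary (scratch count, total area, average size, area
# percentage, perimeter) to an Excel sheet; assumes pandas plus openpyxl.
import pandas as pd

report = pd.DataFrame(
    [
        {"zone": "C", "scratches": 449, "total_area": 86911,
         "average_size": 193.566, "area_percent": 0.819, "perimeter": 56.080},
        # rows for zone A and zone B would be appended similarly
    ]
)
report.to_excel("lens_inspection_report.xlsx", index=False)
```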
Fig.4 shows a flowchart illustrating a method for detecting condition of an optical lens in accordance with some embodiments of the present disclosure.
As illustrated in Fig.4, the method 400 includes one or more blocks illustrating a method of detecting the condition of an optical lens 107. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 401, the method may include capturing an image of an optical lens 107 placed on a lens holder 101. The lens under observation, such as an optical lens 107 or a spectacle 109, may be placed on the lens holder 101 in order to detect the condition of the lens. Once the optical lens 107 is placed on the lens holder 101, the lens is illuminated by the light source 102 and the image of the optical lens 107 is captured.
At block 403, the method may include performing pre-processing operations on the captured image. The pre-processing may include operations such as cropping the region of interest from the captured image using a wand tracing tool, converting the image into an 8-bit gray scale image, and removing outliers from the image.
At block 405, the method may include identifying one or more zones of the optical lens 107 based on type of the optical lens 107 and diameter information of the optical lens 107. The one or more zones may be identified using a machine learning model which is trained based on images of different types of optical lens 107 and the diameter information of the optical lens 107. As an example, the one or more zones may be inner zone, intermediate zone, and outer zone.
At block 407, the method may include identifying one or more structures in each of the one or more zones by comparing pixel values at subsequent locations in the image.
At block 409, the method may include detecting one or more scratches in each of the one or more zones based on the one or more structures. The one or more scratches may be detected using a thresholding technique, by comparing the value of one or more parameters associated with each of the one or more structures with a threshold value of a corresponding parameter. The one or more parameters may include the width, height, area, perimeter and XY coordinates of the structures. When the value of at least one parameter is greater than the threshold value of the corresponding parameter, the structure is detected as a scratch. The identified scratches may be highlighted for easy identification.
At block 411, the method may include detecting condition of the lens based on number of scratches identified in each of the one or more zones. The condition may be one of “acceptable”, “partially acceptable” and “not acceptable”.
At block 413, the method may include generating a report comprising information of number of scratches in each zone, condition of the optical lens 107 and life expectancy of the optical lens 107.
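Purely as an illustrative summary, blocks 401 to 413 could be tied together as sketched below, reusing the hypothetical helpers sketched earlier (`preprocess`, `zone_masks`, `find_structures`, `is_scratch`, `zone_condition`); the pixel-to-millimetre calibration is omitted and the report structure is an assumption, not the disclosed implementation.

```python
# End-to-end sketch of blocks 401-413 using the earlier illustrative helpers.
# Structure parameters are taken in pixel units here; calibration is omitted.
import cv2

def inspect_lens(image_path, roi):
    image = cv2.imread(image_path)                        # block 401: captured image
    gray, mask = preprocess(image, roi)                   # block 403: pre-processing
    zones = zone_masks(mask)                              # block 405: zone identification
    report = {}
    for name, zone_mask in zones.items():
        zone_gray = cv2.bitwise_and(gray, gray, mask=zone_mask.astype("uint8") * 255)
        stats, _ = find_structures(zone_gray)             # block 407: candidate structures
        scratches = [                                     # block 409: parameter thresholding
            s for s in stats
            if is_scratch({"width": float(s[2]), "height": float(s[3]),
                           "area": float(s[4]), "perimeter": 0.0})
        ]
        report[name] = {
            "scratches": len(scratches),
            "condition": zone_condition(name, len(scratches)),  # block 411: condition
        }
    return report                                         # block 413: report contents
```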
Computer System
Fig.6 illustrates a block diagram of an exemplary computer system 600 for implementing embodiments consistent with the present disclosure. In an embodiment, the computer system 600 may be an analyzing unit 105, which is used for detecting condition of the optical lens 107. The computer system 600 may include a central processing unit (“CPU” or “processor”) 602. The processor 602 may comprise at least one data processor for executing program components for executing user or system-generated business processes. The processor 602 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
The processor 602 may be disposed in communication with one or more input/output (I/O) devices (611 and 612) via I/O interface 601. The I/O interface 601 may employ communication protocols/methods such as, without limitation, audio, analog, digital, stereo, IEEE-1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System For Mobile Communications (GSM), Long-Term Evolution (LTE) or the like), etc. Using the I/O interface 601, the computer system 600 may communicate with the image capturing unit 103 to receive the captured image.
In some embodiments, the processor 602 may be disposed in communication with a communication network 609 via a network interface 603. The network interface 603 may communicate with the communication network 609. The network interface 603 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc.
The communication network 609 can be implemented as one of the several types of networks, such as intranet or Local Area Network (LAN) and such within the organization. The communication network 609 may either be a dedicated network or a shared network, which represents an association of several types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other. Further, the communication network 609 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.
In some embodiments, the processor 602 may be disposed in communication with a memory 605 (e.g., RAM 613, ROM 614, etc. as shown in Fig. 6) via a storage interface 604. The storage interface 604 may connect to memory 605 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
The memory 605 may store a collection of program or database components, including, without limitation, user/application data 606, an operating system 607, a web browser 608, a mail client 615, a mail server 616, a web server 617, and the like. In some embodiments, the computer system 600 may store user/application data 606, such as the data, variables, records, etc. as described in this invention. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle® or Sybase®.
The operating system 607 may facilitate resource management and operation of the computer system 600. Examples of operating systems 607 include, without limitation, APPLE MACINTOSH® OS X, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION™ (BSD), FREEBSD™, NETBSD™, OPENBSD™, etc.), LINUX DISTRIBUTIONS™ (e.g., RED HAT™, UBUNTU™, KUBUNTU™, etc.), IBM™ OS/2, MICROSOFT™ WINDOWS™ (XP™, VISTA™/7/8, 10 etc.), APPLE® IOS™, GOOGLE® ANDROID™, BLACKBERRY® OS, or the like. A user interface may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, user interfaces may provide computer interaction interface elements on a display system operatively connected to the computer system 600, such as cursors, icons, check boxes, menus, windows, widgets, etc. Graphical User Interfaces (GUIs) may be employed, including, without limitation, APPLE MACINTOSH® operating systems, IBM™ OS/2, MICROSOFT™ WINDOWS™ (XP™, VISTA™/7/8, 10 etc.), Unix® X-Windows, web interface libraries (e.g., AJAX™, DHTML™, ADOBE® FLASH™, JAVASCRIPT™, JAVA™, etc.), or the like.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present invention. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Disc (DVDs), flash drives, disks, and any other known physical storage media.
Advantages of the embodiment of the present disclosure are illustrated herein.
In an embodiment, the present disclosure provides method and detection system to detect condition of a lens such as optical lens or a spectacle.
In an embodiment, the present disclosure categorizes the optical lens into zones for quickly identifying scratches in each zone and is hence time efficient.
In an embodiment, the present disclosure detects one or more scratches in each zone of the optical lens and hence aids in accurate detection of condition of the lens.
In an embodiment, in the present disclosure, the light source is placed parallel to the optical lens for illuminating the optical lens and hence provides uniform illumination and avoids reflection from coatings on the optical lens.
In an embodiment, the present disclosure provides a summary report with information on the number of scratches in the optical lens along with information on the life expectancy of the optical lens and hence helps customers make a decision on usage of the optical lens.
In an embodiment, the present disclosure detects accurately the condition of the optical lens and the results are provided in real-time.
In an embodiment, the system is compact and hence portable and user friendly.
The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise. The enumerated listing of items does not imply that any or all the items are mutually exclusive, unless expressly specified otherwise.
The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be clear that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be clear that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Referral Numerals:
Reference Number Description
100 Detection system
101 Lens holder
103 Camera
105 Analyzing unit
102 Light source
107 Optical lens
109 Spectacle
111 Camera holder
113 Guide rod
115 Base
121 I/O interface
123 Processor
125 Memory
127 Data
129 Modules
131 Image data
133 Scratches data
135 Condition data
137 Other data
139 Pre-processing module
141 Zone identification module
143 Structure identification module
145 Scratch detection module
147 Condition detection module
149 Other modules
501 Zone A
503 Zone B
505 Zone C
600 Exemplary computer system
601 I/O Interface of the exemplary computer system
602 Processor of the exemplary computer system
603 Network interface
604 Storage interface
605 Memory of the exemplary computer system
606 User /Application
607 Operating system
608 Web browser
609 Communication network
611 Input devices
612 Output devices
613 RAM
614 ROM
615 Mail Client
616 Mail Server
617 Web Server
CLAIMS
We claim:
1. A detection system 100 for detecting condition of an optical lens 107, the detection system 100 comprising:
a lens holder 101 for placing the optical lens 107;
a light source 102 placed parallel to the optical lens 107, wherein light rays from the light source 102 pass tangentially to the surface of the optical lens 107 for uniform illumination of the optical lens 107;
an image capturing unit 103 for capturing an image of the optical lens 107 upon illumination of the optical lens 107;
an analyzing unit 105 configured to:
identify one or more zones of the optical lens 107 based on type of the optical lens 107, wherein the type of the optical lens is determined based on diameter information of the optical lens 107 using the captured image;
identify one or more structures in each of the one or more zones;
detect the one or more structures in each of the one or more zones as one or more scratches, when value of at least one of the one or more parameters associated with one or more structures is greater than a threshold value of a corresponding parameter; and
detect condition of the optical lens 107 based on number of scratches detected in each of the one or more zones.
2. The detection system 100 as claimed in claim 1, wherein the analyzing unit 105 detects the condition of the optical lens 107 as not acceptable when the number of scratches in each of the one or more zones is greater than first predefined number of scratches for corresponding each of the one or more zones.
3. The detection system 100 as claimed in claim 1, wherein the analyzing unit 105 detects the condition of the optical lens 107 as partially acceptable when the number of scratches in each of the one or more zones is less than first predefined number of scratches for corresponding each of the one or more zones.
4. The detection system 100 as claimed in claim 1, wherein the analyzing unit 105 detects the condition of the optical lens 107 as acceptable when the number of scratches in each of the one or more zones is less than second predefined number of scratches for corresponding each of the one or more zones.
5. The detection system 100 as claimed in claim 1, wherein the analyzing unit 105 generates a report comprising information on number of scratches in each zone, the condition of the optical lens 107 and lifetime of the optical lens 107 based on the number of scratches.
6. The detection system 100 as claimed in claim 1, wherein the analyzing unit 105 identifies the one or more zones using a machine learning model which is trained based on images of different type of the optical lens 107 and the diameter information of the optical lens 107.
7. The detection system 100 as claimed in claim 1, wherein the one or more parameters comprises width, height, area, perimeter and XY coordinates of the structures.
8. A method for detecting condition of an optical lens 107, the method comprising:
capturing, by an image capturing unit 103 of a detection system 100, an image of an optical lens 107 placed on a lens holder 101 of the detection system 100;
identifying, by an analyzing unit 105 of the detection system 100, one or more zones of the optical lens 107 based on type of the optical lens 107, wherein the type of the optical lens is determined based on diameter information of the optical lens 107 using the captured image;
identifying, by the analyzing unit 105, one or more structures in each of the one or more zones;
detecting, by the analyzing unit 105, the one or more structures in each of the one or more zones as one or more scratches when value of at least one of the one or more parameters associated with the one or more structures is greater than a threshold value of a corresponding parameter; and
detecting, by the analyzing unit 105, condition of the optical lens 107 based on number of scratches detected in each of the one or more zones.
9. The method as claimed in claim 8, wherein the condition of the optical lens 107 is not acceptable when the number of scratches in each of the one or more zones is greater than first predefined number of scratches for corresponding each of the one or more zones.
10. The method as claimed in claim 8, wherein the condition of the optical lens 107 is partially acceptable when the number of scratches in each of the one or more zones is less than first predefined number of scratches for corresponding each of the one or more zones.
11. The method as claimed in claim 8, wherein the condition of the optical lens 107 is acceptable when the number of scratches in each of the one or more zones is less than second predefined number of scratches for corresponding each of the one or more zones.
12. The method as claimed in claim 8, wherein the one or more parameters comprises width, height, area, perimeter and XY coordinates of the structures.
13. The method as claimed in claim 8 further comprises generating a report comprising information on number of scratches in each zone, the condition of the optical lens 107 and lifetime of the optical lens 107 based on the number of scratches.
14. The method as claimed in claim 8, wherein the image of the optical lens 107 is captured upon illuminating the optical lens 107 using a light source 102 placed parallel to the optical lens 107.
15. The method as claimed in claim 8, wherein the one or more zones are identified using a machine learning model which is trained based on images of different type of the optical lens 107 and diameter information of the optical lens 107.