Specification
The invention relates to a system for capturing measurement images of a measurement object, to a method for capturing measurement images of a measurement object by means of the system, and to a corresponding computer program product.
In many technical and non-technical applications, properties of an object of interest depend on the (bio)chemical composition of the object. Structures on the surface or in near-surface layers of the object, which, depending on the transparency of the object in certain spectral regions, are not visible to the naked eye, also influence certain characteristics of the object. The optical detection of the size, shape and color of the object, or of its macroscopic (still recognizable to the naked eye) surface texture, is therefore generally not sufficient for a satisfactory assessment of such properties. Typical examples are the objective assessment of the state of food in terms of freshness or absence of treatment, or the concealed repair of accident damage to car paintwork.
In many cases, particularly in industrial applications and in research, hyperspectral imaging of objects is used. Here, hyperspectral measurement images are captured of the object which represent the spectral reflection characteristics of the object in a spatially resolved manner. Based on these measurement images, the respective properties of interest of the object can be assessed. Two approaches are known for capturing hyperspectral measurement images. In the first approach, the object is illuminated with a broadband light source; the reflected light is separated into its spectral components by means of narrow-band frequency filters, prisms or gratings and imaged individually by means of a spectral camera. The broadband, uniform illumination can be realized artificially over a large area, or by using daylight as natural illumination. In the second approach, this principle is reversed: a broadband camera is used for image capture, and the object is sequentially illuminated with narrow-band light sources. This variant is mainly used for small-scale objects in laboratories or in microscopy. For illumination, for example, LEDs graded over the spectrum or filter wheels are then used.
A disadvantage of the known methods for capturing hyperspectral measurement images of an object is, in particular, the high cost of the devices required for this purpose, which are generally complex laboratory instruments and which in many cases are set up and optimized for specific applications. Many other methods by which measurement images of objects to be examined are captured also have this disadvantage. Consequently, many technically suitable methods often cannot be implemented economically in practice, particularly in the consumer sector.
Further disadvantages of many known methods for capturing measurement images of objects to be examined are the high expenditure of time and the need for specialized technical knowledge in operating the equipment or in performing the method.
The invention therefore has the object of proposing a system for capturing images of an object to be examined, which is also referred to as a measurement object, that is as cost-effective, as easy to use and as flexible as possible, while the measurement images nevertheless still allow the best possible assessment of the properties of interest of the object. In addition, a method for capturing corresponding measurement images is to be proposed which can be carried out as simply and economically as possible and which is flexible. Finally, a corresponding computer program product is to be proposed which can be loaded directly into an internal memory of the proposed system and which comprises software code sections with which the steps of the proposed method are executed.
This object is achieved by a system according to the main claim and by a method and a computer program product according to the independent claims. Further developments and specific embodiments are evident from the dependent claims, from the following description and from the figures.
The proposed system for capturing measurement images of a measurement object thus comprises at least one mobile electronic device, such as a smartphone or tablet computer, or another (digital) computer. The (at least one) mobile electronic device, which in the following is often simply referred to as the "device", comprises (in each case):
- a housing,
- a camera built into the housing for capturing images of a measurement object within an observation region of the camera, i.e. a spatial region detectable by means of the camera,
- a light-emitting screen built into the housing for displaying images on the screen, for example for displaying a predefined illumination image sequence in the form of illumination images displayed sequentially on the screen, the screen facing the observation region of the camera,
- a control unit integrated into the housing, which is adapted to control the screen of the mobile electronic device to display a plurality of different illumination images of a predefined illumination image sequence one after the other, wherein the control unit is adapted to control the camera of the mobile electronic device to capture, in synchronism with the display of each illumination image of the predefined illumination image sequence, a respective measurement image of the measurement object.
The proposed method for detecting measurement images of the measurement object can be performed with the system proposed here, and includes the steps of:
- driving the screen of the mobile electronic device by means of the control unit to display a plurality of different illumination images of the predefined illumination image sequence one after the other,
- driving the camera of the mobile electronic device, in synchronism with the display of each illumination image of the predefined illumination image sequence, to capture a respective measurement image of the measurement object.
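The two method steps above can be sketched as a simple control loop (an illustrative Python sketch only; the `show` and `capture` callables stand in for the platform's screen and camera APIs, which are not specified in this document):

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class IlluminationImage:
    """One illumination image: a screen fill color and an optional
    screen region (x, y, w, h); None means the full screen."""
    color: Tuple[int, int, int]
    region: Optional[Tuple[int, int, int, int]] = None

def capture_sequence(show: Callable, capture: Callable,
                     sequence: List[IlluminationImage]) -> list:
    """For each illumination image of the predefined sequence: drive the
    screen to display it, then, in synchronism, drive the camera to
    capture one measurement image of the measurement object."""
    measurement_images = []
    for illumination_image in sequence:
        show(illumination_image)               # step 1: drive the screen
        measurement_images.append(capture())   # step 2: synchronized capture
    return measurement_images
```

With three illumination images, three measurement images are obtained, one per displayed illumination image.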
The mobile electronic device typically comprises at least one internal data memory which is integrated into the housing of the device. The internal data memory is typically a volatile or non-volatile data memory, for example a RAM, a ROM, a hard disk drive or a solid-state drive, or a combination thereof.
The proposed computer program product can be loaded directly into the internal memory of the device. The computer program product comprises software code sections with which at least the above-mentioned steps of the proposed method (and optionally further steps of the method) are carried out when the computer program product is loaded on the mobile electronic device and executed.
The computer program product is, for example, a computer program stored on a data memory ("carrier"). The data memory is, for example, computer hardware such as a volatile or non-volatile data memory, for example the said internal data memory, or a further data memory of the system outside the mobile electronic device, such as a data memory of a computer, for example a computer server, or a data memory that is part of a computer network such as the Internet or a (computer) cloud, or that is provided by the computer network (e.g. Internet or cloud). The computer or computer server, or the computer network (e.g. Internet or cloud), is then, for example, a further component of the system. As a possible (further) data memory, for example a RAM, a ROM, a hard disk drive or a solid-state drive, or combinations thereof, comes into question, or even a CD, a DVD or a USB flash drive.
The device typically comprises at least one (digital) processor, for example at least one main processor (CPU), which may itself have one or more integrated units (coprocessors), such as a graphics processor. The processor may be implemented, for example, in the form of an electronic circuit, for example as a semiconductor chip. The above-mentioned control unit of the device may be a (logical or integrated) unit of the processor. The processor is, for example, connected to the internal data memory of the device in order to access the data memory, in particular in order to retrieve the computer program product loaded into the internal memory, or its loaded software code sections, and then (as the control unit of the device) to execute the above steps of the method (synchronized driving of the screen and camera). The respective steps of the proposed method may be encoded, for example, in the form of instructions in the software code sections that are executable by the processor of the device. In carrying out these instructions, the processor then acts, for example, as the said control unit of the device.
The proposed method can comprise further steps, which are described in more detail below. The control unit of the device may be configured to perform these further steps. Accordingly, the computer program product may contain further software code sections in which respective instructions may be encoded that can be executed by the processor of the device. In carrying out these instructions, the processor then in turn acts, for example, as the said control unit of the device or as a further unit of the device, for example as an evaluation unit of the device.
Alternatively, the further method steps can also be performed by other components of the system. For example, the evaluation unit can be arranged externally of the mobile electronic device. The evaluation unit may therefore, for example, also be an appropriately equipped computer, for example a computer server of a computer network, or a (logical or integrated) unit of a processor of such a computer. Mixed forms are also possible, in which the evaluation unit is distributed over several components of the system and is formed, for example, by (logical or integrated) units of several processors, such as the processor of the device and a processor of the said computer or computer server.
In some embodiments, the method is thus carried out entirely by means of the mobile electronic device alone. In other embodiments, the method is carried out partially by means of other components of the system, for example by means of one or more computers (for example in the Internet or a cloud), wherein the communication and data transmission between the device and the other components can take place, for example, via the Internet or via a cloud.
Storing data or other application-related information in an external storage system (e.g. a cloud storage) is required neither for functional nor for security reasons, but neither does it run contrary to the concepts described here. For example, the use of an external data memory can be provided when the storage of certain data on the internal data memory of the mobile device is not possible for some reason, for example due to large amounts of data, for licensing reasons and/or for security reasons.
A mainly or exclusively local processing and/or storage of data by means of the mobile device may be advantageous in general or in certain cases, for example
(1) to reduce the data volume transmitted from the mobile device to an external server/storage,
(2) in the absence of coverage or insufficient bandwidth of a mobile data connection at the respective location of the measurement (e.g. in an agricultural field or in reinforced concrete buildings), and
(3) in the case of sensitive data, which may relate, for example, to the measurement object itself, the location of the measurement, or the user, particularly when the mobile device is equipped with a GPS module. As examples, the measurement of secret objects or secret surface-chemical compositions may be mentioned, as well as measurement at a location to be kept secret, for example when a place is used for storage of the respective measurement object, or when the location of the measurement or the measurement data could enable unwanted conclusions about the user, for example regarding the health condition of the user, his address or his consumer behavior.
For example, the control unit of the mobile device and/or, if present, the evaluation unit of the mobile device may be set up, generally or at least for predefined applications, to carry out or complete the evaluation of the measurement data itself and to store all incoming data only in the internal data memory. The control unit may further be configured to avoid or to block the transfer of the measurement data and/or of data derived therefrom (in particular GPS data) to external devices. In addition, the functionality of the system may be controlled, limited or completely prevented based on the GPS data.
Where, in the following or in the claims, the control unit or the evaluation unit is described as being "set up" to carry out further operations, these operations are to be understood as possible (optional) steps of the proposed method. Accordingly, unless described otherwise in the following, the computer program product may comprise software code sections in which instructions are encoded to perform these further operations, for example for execution by the processor of the device or of another component of the system. Conversely, where it is described that method steps can be performed by means of a component of the system, for example by means of the control unit, the evaluation unit or another component, this also implies a corresponding "being set up" of the respective component. This "being set up" may, for example, again be brought about by loading the appropriately designed computer program product, for example onto the device or onto the said further computer of the system.
The predefined illumination image sequence is typically partially or, preferably, fully defined by illumination parameters. Concrete examples of illumination parameters are described below. The illumination parameters are typically stored on at least one data memory of the system, for example on the internal data memory of the mobile electronic device and/or on a data memory of another component of the system, for example the said computer. For example, it can be provided that the device automatically saves the illumination parameters in the internal memory of the device when the computer program product is loaded. For example, the software code of the computer program product may include definitions and/or values of the illumination parameters. The control unit of the mobile electronic device may be configured to retrieve the illumination parameters stored in the at least one data memory and to determine the predefined illumination image sequence on the basis of the retrieved illumination parameters. Typically, the control unit only thereafter drives the screen to display the illumination images of the illumination image sequence thus determined, and in synchronism therewith the camera to capture the measurement images.
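The determination of the illumination image sequence from stored illumination parameters can be illustrated as follows (a minimal sketch; the concrete parameter names and values are assumptions for illustration, since this document introduces concrete illumination parameters only later):

```python
# Hypothetical stored illumination parameters: a list of screen colors
# and a display duration per illumination image.
ILLUMINATION_PARAMETERS = {
    "colors": [(255, 0, 0), (0, 255, 0), (0, 0, 255)],  # e.g. primary colors
    "frame_duration_ms": 200,
}

def build_sequence(params: dict) -> list:
    """Determine the predefined illumination image sequence from the
    retrieved illumination parameters: here, one full-screen
    illumination image per stored color."""
    return [{"color": color, "duration_ms": params["frame_duration_ms"]}
            for color in params["colors"]]
```

Only after this sequence has been determined does the control unit drive the screen and, in synchronism, the camera.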
The mobile electronic device may have a user interface by means of which the device can be operated, for example in order to perform the proposed method. For example, the predefined illumination image sequence may be adjustable, or at least influenceable, via the user interface, for example by setting or changing at least one of the illumination parameters. Additionally or alternatively, a selection between different (stored) predefined illumination image sequences may be enabled by means of the user interface, wherein the illumination image sequences differ from each other, for example, by one or more illumination parameters. Additionally or alternatively, it is also possible that the type of the measurement object to be examined can be entered by means of the user interface. In addition to such an input, for example, a selection of properties of interest of the selected measurement object can be made possible by means of the user interface. In addition to the definition of the illumination image sequence, the subsequent evaluation of the measurement images may also depend on such inputs via the user interface. For example, the inputs may also be taken into account by an evaluation unit of the system, as will be described further below.
For example, it can be provided that a plurality of different illumination image sequences, or a plurality of different predefined sets of illumination parameters each defining one of the plurality of illumination image sequences, are predefined and stored, as described above, in one or more of the mentioned data memories. The different predefined illumination image sequences or illumination parameter sets may, for example, each be associated with one of several different predefined (measurement) applications (defined, for example, by the respective measurement object, the property of interest and/or a recommended action). (Examples of various applications are given below.) For example, it may be provided that the user selects a specific application (e.g. via the user interface of the mobile device, for example from an application list displayed by the user interface), and the control unit then, depending on the selected application, reads out from the data memory the predefined illumination image sequence (or the illumination parameters) associated with the selected application; the measurement is then carried out, as described, with the read-out illumination image sequence (or the read-out illumination parameters). Additionally or alternatively, it is also possible that the evaluation of the measurement images depends on the selected application.
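The application-dependent read-out described above amounts to a simple lookup (the application names and parameter values below are purely hypothetical examples, not taken from this document):

```python
# Hypothetical mapping from predefined measurement applications to the
# illumination parameter set stored for each of them.
APPLICATION_PARAMETER_SETS = {
    "food_freshness": {"colors": [(255, 0, 0), (0, 255, 0)], "frames": 2},
    "paint_inspection": {"colors": [(255, 255, 255)], "frames": 1},
}

def parameters_for(application: str) -> dict:
    """Read out the illumination parameters associated with the
    application selected via the user interface."""
    try:
        return APPLICATION_PARAMETER_SETS[application]
    except KeyError:
        raise ValueError(f"No predefined parameter set for '{application}'")
```

The control unit would then build the illumination image sequence from the returned parameter set and carry out the measurement as described.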
The screen can be configured as a touchscreen and can thus serve as the said user interface of the device, for example via the representation of a graphical user interface with input fields displayed on the touchscreen.
The user interface can also be configured to output a warning, for example when ambient light effects are judged to be too large, or when an image registration of the measurement images or an object recognition that is being carried out could not be successfully completed, for example due to the object properties or the user behavior.
The user interface may include an audio output of the device, by which, for example, the above-mentioned warnings can be generated. The user interface may include a vibration module of the device, by which the mentioned warning messages can likewise be generated. Other user interfaces with a display can be implemented, for example by means of further communicating devices such as smartwatches and head-mounted displays. The different modules can thereby be used in combination, provided each is present.
The at least one internal data memory, or an external data memory, for example of the said further computer, can serve for the (intermediate) storage of the recorded measurement images. Accordingly, the control unit may be adapted to perform, or to initiate, a transmission of the recorded measurement images to this at least one data memory.
Furthermore, the control unit may be adapted to drive the screen to display the captured measurement images, for example automatically after capturing the measurement images. For example, measurement results can be displayed on the screen of the device during or immediately after the measurement and, by means of the screen, overlaid on a captured image of the measurement object or on a current live image of the camera, in order in this way to implement, for example, augmented reality techniques.
On the at least one internal data memory of the device, an operating system of the device may be installed, for example iOS, Android, Windows, Linux, BlackBerry OS or another operating system, and typically further application programs, such as an Internet browser and/or an app store application. Via the app store application, for example, an (Internet) connection of the device to an app store, i.e. an Internet-based digital distribution platform for application software, such as Apple's App Store or Google's Play Store, can be established. In one embodiment, the computer program product is loaded as an app via this app store application into the internal data memory of the device, where it is stored, for example, permanently (for example until a deletion initiated and/or confirmed by the user). Another option is to copy the computer program product as an app directly (e.g. via a USB cable) to the device, in particular the smartphone, if this is not blocked by the operating system. In another embodiment, the computer program product can be loaded as a web app via the Internet browser of the device from an Internet page of a provider into the internal memory of the device. The web app is, for example, stored only temporarily (for example only for a predefined period of time or for a predefined number of executions of the method) in the internal memory and is then automatically deleted from the internal memory of the device. In all cases, however, the computer program product is preferably executable on the device directly after loading into the internal memory of the device and usable by a user for carrying out the method.
The device typically includes one or more wired or, preferably, wireless data interfaces, such as at least one radio interface, to connect the device, for example, to the Internet or to other possible components of the system, such as one or more computer servers, for example via the Internet.
The mobile (portable) electronic device is preferably as light as possible, so that it can easily be aligned with the measurement object and held in a suitable position by a user, with two hands or preferably with just one hand (in particular during the above method steps, i.e. during the display of the illumination images and the capture of the measurement images). The device therefore preferably weighs less than 3 kg, less than 2 kg or less than 1 kg. A maximum edge length of the housing is typically no more than 30 cm, typically less than 25 cm or less than 20 cm. For example, the housing may be configured substantially cuboidal. A minimum edge length is typically less than 5 cm, preferably less than 2 cm.
In general, the camera includes a lens that is arranged on a front side of the housing and defines the observation region of the camera. The screen is then typically also arranged on the front side of the housing. The camera (at least the lens of the camera) and the screen are typically located on the same side of the housing and are visible from the same side of the housing. The camera also typically comprises an image sensor, for example a photosensitive semiconductor chip, such as a CCD or CMOS sensor or an InGaAs sensor.
Further, the device may include a speaker and a microphone in order, for example, to enable telephone calls over a mobile radio telephone network or over the Internet by means of a telephony application installed in the internal memory of the device. Furthermore, the device may comprise a (rechargeable) energy store for supplying the device with electrical energy, in particular the screen, the camera and the control unit of the device.
When the method is carried out by means of the system, the screen of the device emits light while displaying the illumination images. Because the screen faces the observation region of the camera, a measurement object arranged in the observation region of the camera can thus be illuminated by means of the screen. The light emitted by the screen when displaying the illumination images is reflected at the measurement object and captured by the camera. The reflected light typically enters the camera through the lens of the camera and is imaged onto the image sensor of the camera.
The image sensor of the camera typically has a plurality of sensor units arranged in an overall grid. Each of the sensor units may comprise one or more sensor elements of the image sensor. For example, each sensor unit corresponds to an image point (pixel) of a measurement image captured by the camera. The positions of the sensor units and of their sensor elements within the image sensor are defined by two sensor coordinates (XY) of the respective sensor unit.
Each of the measurement images thus likewise includes a plurality of image points (pixels) arranged in an overall raster, which are assigned to the sensor units of the image sensor and whose positions within the respective measurement image are defined by two image coordinates (XY), which typically correspond to the sensor coordinates of the respective sensor units. The measurement images also include image data in which image information is encoded. In the image data, for example, brightness values of the respective pixels of the measurement images are encoded. The brightness values of the pixels of the measurement images are typically dependent on the charge or discharge state of the light-sensitive sensor elements of the sensor units while the respective measurement image is captured.
Owing to the differences between the illumination images, different measurement images contain different information about the measurement object. For example, the illumination images can differ from each other in the spectral composition of the light emitted by the screen when they are displayed. Alternatively or additionally, it is possible for the illumination images to be arranged in different regions of the screen, so that the measurement object is illuminated, from the perspective of the camera, from different directions. It is thus advantageously possible to obtain different information about the reflection properties or other properties of the measurement object from each of the captured measurement images. In addition, the information content of the measurement images can easily be influenced by changing the illumination image sequence.
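Both kinds of variation, spectral composition and screen position, can be illustrated by constructing illumination images as RGB pixel arrays (a sketch assuming an RGB raster screen; the array layout is an assumption for illustration):

```python
import numpy as np

def make_illumination_image(width: int, height: int, color,
                            region=None) -> np.ndarray:
    """Build one illumination image as an RGB array: either a full-screen
    color field (varying the spectral composition of the emitted light),
    or a colored patch at a given screen region (x, y, w, h), so that the
    measurement object is lit from a different direction while the rest
    of the screen stays dark."""
    img = np.zeros((height, width, 3), dtype=np.uint8)
    if region is None:
        img[:, :] = color                  # full-screen color field
    else:
        x, y, w, h = region
        img[y:y + h, x:x + w] = color      # patch -> direction of illumination
    return img
```

A sequence might then combine, for example, full-screen red, green and blue fields with white patches at the four screen corners.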
Another important advantage is that the mobile electronic device may be, for example, a smartphone, a tablet computer ("tablet"), a laptop or a similarly widespread mobile electronic device. Advantageously, it is very simple for the user/consumer to configure such a commercially available device for carrying out the proposed method, namely simply by loading the proposed computer program product onto the device, for example from an app store or from a website of a provider of the computer program product, as described above. The system and the method are thus very inexpensive compared with many conventional measuring devices, very variably configurable via the illumination image sequence and, for example by means of an evaluation unit for data evaluation integrated in the mobile device as described below, also intuitively usable by many users. Another advantage over prior-art systems is that the mobile electronic device does not need to be retrofitted with additional (external) optical hardware, neither for producing a dispersive optical effect nor for the control/monitoring of specific parameters of the illumination and/or of the image recording. The method described here can thus advantageously be carried out without having to retrofit the device with further optical or electronic components. In particular, it is not necessary for the method to retrofit the mobile device with additional components such as filters, lenses, mirrors, visors, shields, light sources, sensors, etc., or to arrange such components between the mobile device and the measurement object during the execution of the method.
Before the measurement images are captured, it can be provided that preprocessing steps which can be performed automatically by the camera are switched off or disabled. For example, it can be provided that an automatic setting of a color temperature of the recorded images by the camera is switched off, or that the color temperature is set, for example, to a fixed value and is then taken into account in the evaluation of the measurement images. The same applies to automatic settings of other exposure parameters of the camera, such as sensitivity, exposure time and white balance.
Correspondingly, it can be provided that an automatic brightness control of the screen is switched off (by the control unit) and the screen brightness is set, for example, to the highest possible value.
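Such a fixed acquisition configuration can be sketched as follows (the setting names and values are assumptions; the actual keys depend on the platform's camera and display APIs, which this document does not specify):

```python
# Hypothetical fixed acquisition settings: automatic camera preprocessing
# is disabled, exposure parameters are pinned to known values, and the
# screen brightness is set to its maximum.
FIXED_SETTINGS = {
    "auto_white_balance": False,
    "auto_exposure": False,
    "color_temperature_K": 6500,   # fixed value, accounted for in evaluation
    "iso": 100,                    # fixed sensitivity
    "exposure_time_ms": 33,        # fixed exposure time
    "screen_auto_brightness": False,
    "screen_brightness": 1.0,      # highest possible brightness
}

def apply_fixed_settings(device_settings: dict) -> dict:
    """Overwrite the device's current (possibly automatic) settings with
    the fixed values, leaving unrelated settings untouched."""
    updated = dict(device_settings)
    updated.update(FIXED_SETTINGS)
    return updated
```

Fixing these parameters keeps the measurement images comparable across the illumination image sequence, since no automatic control changes the imaging conditions between frames.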
The screen of the mobile electronic device generally emits light mainly or exclusively in the visible range, i.e. light having wavelengths between about 400 nm and about 800 nm. Typically, the screen is a color screen and is thus set up to display color images. For example, the screen includes a plurality of color channels. In each of the color channels, the screen has a channel-specific spectral emission characteristic, which will be referred to hereinafter as D_d(λ). The light emitted in a color channel thus has a spectral intensity distribution predefined for that color channel and corresponds to a color of the screen that can be displayed with the screen. For example, the screen can have a red color channel, a blue color channel and a green color channel. The colors of the color channels, i.e. for example red, green and blue, then represent the primary colors of the screen. The screen and the camera are typically adapted to the human visual system. Visible light having wavelengths up to about 485 nm is perceived as blue, from about 500 nm to about 550 nm as green, and from about 630 nm as red. Correspondingly, the red color channel of the screen emits light (mainly) in a red wavelength range, the green color channel (mainly) in a green wavelength range, and the blue color channel (mainly) in a blue wavelength range.
The screen typically has a plurality of light-emitting elements arranged in an overall grid of the screen, which form the image points (pixels) of the screen and together fill a total image area of the screen. Each of the color channels is then formed by a subset of the light-emitting elements of the screen whose spectral emission characteristics coincide with the channel-specific spectral emission characteristic of the respective color channel. Each pixel of the screen is formed, for example, by a group of adjacent light-emitting elements that belong to the different color channels. Light-emitting elements of different color channels belonging to a common pixel are referred to as subpixels of the screen. The light-emitting elements of each color channel are respectively arranged in a partial grid.
Typically, the camera of the mobile electronic device is a color camera, that is, it is sensitive to light having wavelengths between about 400 nm and about 800 nm and has several different color channels. For each of the color channels, the camera typically has a channel-specific spectral sensitivity, which is referred to in the following as C_c(λ). For example, the camera may include a red color channel, a blue color channel and a green color channel. In many cases, the wavelength ranges of the color channels of the camera largely (typically, however, not completely) coincide pairwise with those of the color channels of the screen.
Each of the color channels of the camera is formed by a subset of sensor elements of the image sensor whose spectral sensitivities correspond to the channel-specific spectral sensitivity of the respective color channel of the camera. Each sensor unit of the image sensor of the camera is formed, for example, by a group of adjacent sensor elements of the image sensor which belong to the different color channels of the camera. The sensor elements of each color channel are thus arranged in a sub-grid that extends over the image sensor. The sub-grids of the sensor elements of the different color channels overlap spatially with each other and thus together form the totality of the sensor units of the image sensor. For example, the sensor elements of the red color channel are most sensitive to red light, the sensor elements of the green color channel most sensitive to green light and the sensor elements of the blue color channel most sensitive to blue light. Red light has, for example, a wavelength of about 605 nm or more, green light a wavelength of about 555 nm and blue light a wavelength of about 450 nm. Other examples of wavelength ranges for the different colors are given above.
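The interplay of the channel-specific emission characteristic D_d(λ) of the screen, the reflectance of the measurement object and the channel-specific sensitivity C_c(λ) of the camera can be sketched as a discrete overlap integral. The following is an illustrative sketch only; the Gaussian channel curves, the function names and the example reflectance are assumptions, not part of the specification:

```python
import math

# Illustrative sketch (all curves hypothetical): the value measured in camera
# channel c under illumination by screen channel d is roughly proportional to
# the overlap integral of the screen emission D_d(l), the object reflectance
# R(l) and the camera sensitivity C_c(l).
def gaussian(lam, center, width):
    return math.exp(-((lam - center) / width) ** 2)

WAVELENGTHS = range(400, 801, 5)  # nm, the visible range stated above
STEP = 5                          # nm, integration step

# Hypothetical channel curves, centered near the wavelengths named in the text.
D = {"red": lambda l: gaussian(l, 630, 40),
     "green": lambda l: gaussian(l, 530, 40),
     "blue": lambda l: gaussian(l, 450, 40)}
C = D  # assumption: the camera channels roughly match the screen channels

def measured_value(d, c, reflectance):
    """Discrete overlap integral: sum of D_d(l) * R(l) * C_c(l) * dl."""
    return sum(D[d](l) * reflectance(l) * C[c](l) * STEP for l in WAVELENGTHS)

# A reddish object reflects long wavelengths more strongly, so the red/red
# combination of illumination and camera channel dominates the blue/blue one.
reddish = lambda l: 0.1 + 0.8 * (l > 600)
assert measured_value("red", "red", reddish) > measured_value("blue", "blue", reddish)
```

Each combination of an illumination channel d and a camera channel c thus yields one scalar value, which is the basis of the M × N measurement data sets described further below.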
For example, the control unit of the mobile electronic device is adapted to control the screen of the mobile electronic device to display one, several or each of the illumination images of the predefined illumination image sequence
- by activating the light-emitting elements of only one color channel of the screen and by driving all the activated light-emitting elements of this color channel with a uniform brightness value predefined for that color channel, or
- by activating the light-emitting elements of multiple color channels and by driving all the activated light-emitting elements with a uniform brightness value predefined for each color channel, or
- by replacing the above-mentioned uniform brightness values by a gradient. Instead of with a uniform brightness value, the activated light-emitting elements of a given color channel can then be driven, for example, with different brightness values that differ from each other according to a gradient predefined for this color channel. The gradient of each color channel can be defined, for example, by a predefined gradient vector and can be uniform (that is, constant), for example, over the entire screen. The brightness values of the light-emitting elements of this color channel then increase uniformly along the direction of the gradient vector in accordance with the magnitude of the gradient vector (or, alternatively, decrease uniformly).
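The two driving modes above can be sketched as follows; this is a minimal illustration with assumed function names, an assumed 8-bit brightness range of 0 to 255 and a gradient running along the horizontal axis:

```python
# Illustrative sketch (names and value range assumed): build the per-channel
# drive values for one illumination image, either with one uniform brightness
# value per activated color channel or with a linear gradient.
def uniform_frame(width, height, channel_values):
    """channel_values: dict channel -> uniform brightness (0..255); channels
    not listed stay at the smallest possible brightness value 0."""
    return {ch: [[channel_values.get(ch, 0)] * width for _ in range(height)]
            for ch in ("red", "green", "blue")}

def gradient_frame(width, height, channel, v_min=0, v_max=255):
    """The brightness of the activated channel increases uniformly along x,
    corresponding to a constant gradient vector over the whole screen."""
    row = [round(v_min + (v_max - v_min) * x / (width - 1)) for x in range(width)]
    frame = uniform_frame(width, height, {})
    frame[channel] = [list(row) for _ in range(height)]
    return frame

red_image = uniform_frame(4, 3, {"red": 255})  # a uniformly red illumination image
ramp = gradient_frame(4, 3, "green")           # green brightness ramps left to right
assert red_image["red"][0][0] == 255 and red_image["green"][0][0] == 0
assert ramp["green"][0] == [0, 85, 170, 255]
```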
The activation of the light-emitting elements of a color channel may, for example, take place by switching the light-emitting elements on or by driving them with a uniform brightness value that is greater than a smallest possible brightness value of the light-emitting elements. In order to achieve the brightest possible illumination of the measurement object by means of the activated light-emitting elements, the respective uniform brightness value preferably corresponds to a maximum brightness value of the light-emitting elements.
Accordingly, the respective non-activated light-emitting elements of the remaining color channels can be switched off or remain switched off, or are each driven with a smallest possible brightness value.
By driving with a uniform brightness value it is achieved that the respective illumination image has a uniform color, so that each pixel of the screen lights up in this uniform color or, if the illumination image does not fill the entire screen, i.e. the total image area of the screen, is switched off or lights up only with the smallest possible brightness. In this manner, the measurement object can be illuminated spatially homogeneously with the screen using light of a defined spectral intensity distribution.
For example, if only one color channel of the screen is activated, the screen lights up uniformly in the respective primary color of the screen, for example in red, green or blue. For example, the illumination image sequence may include a red illumination image, a green illumination image and a blue illumination image, or only one or only two of these illumination images. The control unit is, for example, adapted to control the screen to
- display the red illumination image by activating only the light-emitting elements of the red color channel of the screen and by driving all the activated light-emitting elements of the red color channel with a uniform brightness value predefined for the red color channel,
- display the green illumination image by activating only the light-emitting elements of the green color channel of the screen and by driving all the activated light-emitting elements of the green color channel with a uniform brightness value predefined for the green color channel, and/or
- display the blue illumination image by activating only the light-emitting elements of the blue color channel of the screen and by driving all the activated light-emitting elements of the blue color channel with a uniform brightness value predefined for the blue color channel. The order of the illumination images can be set arbitrarily.
By simultaneously activating a plurality of color channels, uniform mixtures of the primary colors of the screen can be generated. One of the illumination images may be, for example, a white illumination image (hereinafter also referred to as white image), in which all light-emitting elements of the screen are activated and driven with the maximum brightness value. Another illumination image can be, for example, a black illumination image (hereinafter also referred to as black image), in which all light-emitting elements of the screen are switched off or deactivated or driven with the smallest possible brightness value. The white illumination image and the black illumination image can be used, for example, for calibrating the other measurement images and for estimating ambient light influences. Based on the maximum and minimum brightness determined in this way, a calibration to account for ambient light influences can be achieved, for example, by a linear function (shift and scaling). It can also be achieved via a nonlinear function, for example in order to brighten dark areas of the image or to darken light areas of the image.
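The linear shift-and-scale calibration mentioned above can be sketched per pixel as follows; the clamping to [0, 1] and the handling of a degenerate pixel are assumptions of this sketch, not requirements of the specification:

```python
# Sketch of the linear calibration (shift and scaling, assumed behavior): the
# black image estimates the ambient light plus sensor offset, the white image
# the maximum achievable response; a measured value is then normalized.
def calibrate(measured, black, white):
    span = white - black
    if span <= 0:            # degenerate pixel, e.g. saturated or defective
        return 0.0
    v = (measured - black) / span
    return min(max(v, 0.0), 1.0)  # clamp into [0, 1]

assert calibrate(60, 10, 110) == 0.5
assert calibrate(5, 10, 110) == 0.0   # values below the black level are clamped
```

A nonlinear variant would, for example, apply a gamma-like curve to `v` instead of returning it directly, in order to brighten dark or darken light image areas.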
The illumination images can be defined, for example, by one or more of the following lighting parameters:
- the spectral composition of the light emitted by the screen when displaying the respective illumination image, and/or
- a uniform brightness value for each color channel of the screen, and/or
- a screen area that is filled by the respective illumination image, in particular the size and shape of the screen area, and/or
- an arrangement of the screen area that is filled by the respective illumination image within the total image area of the screen.
Each of the illumination images is typically contiguous. For example, one, several or each of the illumination images can completely fill the total image area of the screen. It is also possible that one, several or each of the illumination images fills only a portion of the total image area of the screen, wherein the screen outside of the portion filled by the illumination image is typically black (i.e. the light-emitting elements there are switched off or not activated and thus do not light up, or light up only with the smallest possible brightness). The portion of the screen filled by each illumination image corresponds, for example, to at least 1/6, 1/5, 1/4, 1/3, 1/2 or more of the total image area of the screen. For example, the illumination image sequence may include R illumination images, each filling only about 1/R of the total image area of the screen, where R is, for example, a natural number greater than 2 and, for example, smaller than 20. It is typically between 3 and 10; for example, R = 3, 4, 5 or 6. Typically, the filled partial areas of the illumination images do not overlap each other on the screen.
The filled portions of the illumination images can be arranged at the same location within the total image area of the screen. Typically, however, the illumination images then differ at least in color from each other. Alternatively, it is possible that the illumination images differ not only in color but also in their arrangement on the screen. It is also possible that the illumination images do not differ in color but only in their arrangement on the screen.
For example, the image content of each illumination image may in each case be a monochrome filled area (which typically completely fills the respective partial region), the color of which may be, for example, one of the primary colors of the screen (for example red, green or blue) or white (all color channels with the same, preferably maximum, brightness), as described above.
If the illumination images have the same color and differ only in their position on the screen, the illumination images are typically each monochrome filled areas (which completely fill the respective partial areas), each having, for example, the same color, for example one of the primary colors of the screen (for example red, green or blue) or white (all color channels with the same, preferably maximum, brightness).
For example, the total image area of the screen may have an upper edge, a lower edge, a left edge and a right edge, wherein the filled portions of the illumination images differ from each other in their distance from the upper edge of the total image area of the screen, above which the lens of the camera is arranged.
For example, the illumination image sequence may be defined by one or more of the following further lighting parameters:
- the total number of illumination images,
- the order of the illumination images,
- the display duration of the individual illumination images,
- the temporal distance between the display of the individual illumination images.
The total number of illumination images results, for example, from the number of color channels of the camera and of the screen. If both have, for example, three mutually corresponding color channels (e.g. red, green and blue), the illumination image sequence may include at least three illumination images, one for each color channel (red, green and blue). In addition, the illumination image sequence may include the white image and the black image described above, so that the illumination image sequence then includes, for example, (at least) five illumination images. The order may, for example, be set arbitrarily. The display duration is chosen at least long enough that the image sensor can be sufficiently exposed while recording the measurement images. The display duration is typically in a range between 10 ms and 500 ms, preferably in a range between 100 ms and 200 ms. The illumination images are typically displayed one after the other and not at the same time. The temporal distance between the display of the individual illumination images is typically in a range between 1 ms and 20 ms, preferably in a range between 5 ms and 10 ms. A total duration for capturing the measurement images is therefore typically in a range between 60 ms and 3000 ms.
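A predefined five-image sequence with the timing ranges named above can be sketched as a simple data structure; the concrete millisecond values chosen here are assumptions within the stated ranges:

```python
# Hedged sketch: a predefined illumination image sequence (red, green, blue,
# white, black) with assumed display durations and inter-image gaps in ms,
# chosen from within the ranges given in the text.
illumination_sequence = [
    {"image": "red",   "display_ms": 150},
    {"image": "green", "display_ms": 150},
    {"image": "blue",  "display_ms": 150},
    {"image": "white", "display_ms": 150},
    {"image": "black", "display_ms": 150},
]
GAP_MS = 8  # assumed pause between consecutive illumination images

def total_duration_ms(sequence, gap_ms=GAP_MS):
    """Sum of display durations plus the gaps between consecutive images."""
    return sum(s["display_ms"] for s in sequence) + gap_ms * (len(sequence) - 1)

assert total_duration_ms(illumination_sequence) == 5 * 150 + 4 * 8  # 782 ms
```

The resulting 782 ms lies within the typical total duration of 60 ms to 3000 ms stated above.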
Each of the recorded measurement images includes a plurality of pixels and image data associated with the respective pixels. As described above, the system may comprise an evaluation unit which can be part of the device (for example, as a logical or integrated unit of the processor of the device) or part of another component of the system (for example, as a logical or integrated unit of the processor of the respective component), which can be, for example, a computer server.
The evaluation unit is, for example, set up, for example by means of the computer program product, to merge the pixels of the measurement images and to combine the image data of merged pixels into measurement data sets of the respective merged pixels. Typically, the merging of the image points is effected by means of an image registration of the measurement images. The merged image points then form a single registered measurement image, the pixels of the registered measurement image comprising the respectively associated measurement data sets.
Further processing of the captured image data is preferably performed using these measurement data sets. Instead of a separate, sequential evaluation of the individual measurement images, the evaluation can thus be carried out simultaneously for all measurement images by using the measurement data sets, which describe all measurement images at once. The measurement data sets obtained from the image data of the measurement images represent, for example, (hyper)spectral data sets (below also referred to as spectral fingerprints) that are each associated with a common spatial position in the measurement images and that include the measurement data of several or all measurement images which have been recorded during an illumination image sequence. By using the proposed measurement data sets, it is possible to process the measurement data with spatial resolution (in that the measurement data sets of the merged image points are processed individually). Each measurement data set can thereby be considered as a measurement independent of the other measurement data sets, which depends on the local characteristics of the object in the object region imaged by the respective measurement data set. Depending on the resolution of the camera, a large number of independent measurements, each represented by one of the measurement data sets, can thus be generated by a single execution of the proposed measurement method. Due to the plurality of measurement data sets generated at each measurement, the measurement data sets are particularly suitable as training data for training machine learning algorithms, such as classification methods, for example artificial neural networks. Accordingly, the data sets are just as well suited for evaluation by such algorithms. Preferably, the evaluation unit is set up to evaluate the measurement data sets by means of an algorithm trained or trainable by a method of machine learning, such as a classification method, for example an artificial neural network. Because the illumination image sequence is predefined, the data format of the measurement data sets is also predefined in a corresponding manner. In particular, the definition of the illumination image sequence makes it possible to define in advance which components of the measurement data sets belong to which illumination image (and thus, for example, to which wavelength range). Such a fixed assignment facilitates the further processing of the measurement data sets by means of predetermined evaluation algorithms or calibrated models, which typically require a specific data format or are programmed to process a particular data format.
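The fixed per-pixel data format described above can be sketched as follows; the helper names, the toy image data and the flattening order (illumination-major) are assumptions of this illustration:

```python
# Illustrative sketch (all data hypothetical): after registration, every pixel
# yields one measurement data set - the M x N brightness values recorded over
# the illumination image sequence - which can be treated as an independent
# spectral fingerprint, e.g. as one training sample for a classifier.
M, N = 3, 3  # screen color channels (illuminations) x camera color channels

def fingerprint(measurement_images, x, y):
    """Flatten the M x N values of pixel (x, y) into one feature vector,
    ordered illumination-first so the format is fixed in advance."""
    return [measurement_images[d][c][y][x] for d in range(M) for c in range(N)]

# Toy registered measurement images of size 1 x 1, one per combination of
# illumination d and camera channel c, holding the value 10 * d + c.
images = [[[[10 * d + c]] for c in range(N)] for d in range(M)]
fp = fingerprint(images, 0, 0)
assert len(fp) == M * N
assert fp == [0, 1, 2, 10, 11, 12, 20, 21, 22]
```

Because the illumination image sequence is predefined, component `d * N + c` of every such vector always belongs to the same illumination/camera-channel combination, which is exactly the fixed assignment that downstream models rely on.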
Image transformations of the measurement images are typically required for the image registration of the measurement images, for example (local) coordinate transformations (rotation, translation, tilt and/or (local) rescaling, sub-pixel interpolation), in order to compensate for or calculate out relative movements between the device and the measurement object during the acquisition of the measurement images. Ideally, there is a 1:1 correspondence between the pixels of the measurement images, but typically a 1:X correspondence with X ≠ 1. For X ≠ 1, the measured values of the combined pixels are typically interpolated or averaged to determine the measurement data sets.
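The sub-pixel interpolation mentioned above can be sketched with a bilinear scheme; bilinear is an assumed choice (averaging or other kernels would be alternatives), and the helper assumes the sampling position lies strictly inside the image:

```python
# Sketch of sub-pixel interpolation (bilinear, an assumed choice): when the
# registration maps a pixel to a non-integer position (x, y), the value is
# interpolated from the four surrounding pixels. Assumes 0 <= x < width - 1
# and 0 <= y < height - 1.
def bilinear(img, x, y):
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0  # fractional offsets within the pixel cell
    return (img[y0][x0] * (1 - fx) * (1 - fy)
            + img[y0][x0 + 1] * fx * (1 - fy)
            + img[y0 + 1][x0] * (1 - fx) * fy
            + img[y0 + 1][x0 + 1] * fx * fy)

img = [[0, 10],
       [20, 30]]
assert bilinear(img, 0.5, 0.5) == 15.0  # center of the four pixels
```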
For example, on the basis of the measurement images, preferably on the basis of the registered measurement image, an object recognition algorithm can be performed to identify those pixels in the measurement images or in the registered measurement image which image the object to be measured. These pixels are referred to as object pixels. Each of these recognized object pixels depicts a portion of the surface of the object in the measurement image or in the registered measurement image. These portions are referred to as object points. For example, the object detection algorithm may include a "region growing" algorithm. At the beginning of this algorithm, a first image point is defined which is believed to be an object pixel. As the first pixel, a pixel in the middle of one of the measurement images or of the registered measurement image may, for example, be defined. Alternatively, the first pixel may also be defined by the user via the user interface, especially when the screen is configured as a touch screen, for example by marking an area on the displayed or recorded measurement image shown on the screen. It is then checked how much the measurement data sets of adjacent pixels differ from the measurement data set of the first pixel. Only at a sufficiently small deviation is a neighboring pixel also classified as an object pixel. This algorithm is continued or iterated (starting in each case from the newly classified object pixels) as long as further object pixels are found.
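The region-growing idea described above can be sketched as follows; the scalar grid, the 4-neighborhood, the absolute-difference metric and the threshold are assumptions of this sketch (the specification compares whole measurement data sets, not single values):

```python
from collections import deque

# Sketch of the "region growing" step: starting from a seed pixel believed to
# lie on the object, neighbors whose values deviate little from the pixel they
# were reached from are added, and the frontier is iterated until no further
# pixel qualifies.
def region_grow(data, seed, max_dev):
    """data: 2D grid of scalar measurement values; returns the set of (y, x)
    object pixels grown from the seed."""
    h, w = len(data), len(data[0])
    region, frontier = {seed}, deque([seed])
    while frontier:
        y, x = frontier.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in region
                    and abs(data[ny][nx] - data[y][x]) <= max_dev):
                region.add((ny, nx))
                frontier.append((ny, nx))
    return region

grid = [[1, 1, 9],
        [1, 2, 9],
        [9, 9, 9]]
assert region_grow(grid, (0, 0), max_dev=1) == {(0, 0), (0, 1), (1, 0), (1, 1)}
```

With a vector-valued measurement data set per pixel, `abs(...)` would be replaced by a distance between data sets, for example a Euclidean distance over the spectral fingerprint.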
If the screen and the camera have a plurality of color channels and illumination images differing in color, as described above, each of the measurement data sets may, for example, be a so-called "spectral fingerprint" of the measurement object at the corresponding object point of the measurement object. If the screen includes, for example, M color channels and the camera, for example, N color channels, each of the measurement data sets may, for example, comprise M × N measured values or more. For example, one illumination image can be used for each color channel of the screen and one measurement image can be recorded for each of these illumination images, wherein the brightness values measured in the individual color channels of the camera are contained as individual measured values in the measurement data sets. The (first) M × N measured values of the measurement data set of an object point correspond, for example, to the various possible combinations of the color channels of the screen with the color channels of the camera. For example, M = 3 and N = 3 when the camera and the screen each have the color channels red, green and blue. If, in addition, the white image and the black image described above are displayed and a measurement image is recorded for each, each measurement data set can comprise (M + 2) × N measured values.
The measurement data set relating to an object point of the measurement object or to its object pixel is referred to hereinafter as F(d, c) when the camera and the screen each have a plurality of color channels. The index d indicates the color of the illumination image (or the color channel of the screen) and the index c the color channel of the camera; both may, for example, be defined numerically. In accordance with the above examples, for example, M > 1 and N > 1, wherein each respective data set comprises at least M × N measured values F(d, c) with 1 ≤ d ≤ M and 1 ≤ c ≤ N.
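The notation F(d, c), extended by the white and black images to (M + 2) × N values, can be sketched as a small accessor; the class name, the channel labels and the row-major storage order are assumptions of this illustration:

```python
# Hedged sketch of the notation F(d, c): a per-object-point measurement data
# set indexed by the illumination d (the M screen channels plus the white and
# black images) and the camera channel c (1..N). All values are hypothetical.
M, N = 3, 3
ILLUMINATIONS = ["red", "green", "blue", "white", "black"]  # M + 2 entries
CAMERA_CHANNELS = ["red", "green", "blue"]                  # N entries

class MeasurementDataSet:
    def __init__(self, values):
        # (M + 2) x N measured values, stored illumination-major.
        assert len(values) == len(ILLUMINATIONS) * N
        self._v = values

    def F(self, d, c):
        """F(d, c): brightness measured in camera channel c while the
        illumination image d was displayed."""
        return self._v[ILLUMINATIONS.index(d) * N + CAMERA_CHANNELS.index(c)]

ds = MeasurementDataSet(list(range(15)))  # 15 = (M + 2) * N placeholder values
assert ds.F("red", "red") == 0
assert ds.F("green", "blue") == 5
assert ds.F("black", "red") == 12
```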
Documents

Application Documents

| # | Name | Date |
| 1 | 201917026615.pdf | 2019-07-03 |
| 2 | 201917026615-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [03-07-2019(online)].pdf | 2019-07-03 |
| 3 | 201917026615-STATEMENT OF UNDERTAKING (FORM 3) [03-07-2019(online)].pdf | 2019-07-03 |
| 4 | 201917026615-PRIORITY DOCUMENTS [03-07-2019(online)].pdf | 2019-07-03 |
| 5 | 201917026615-FORM 1 [03-07-2019(online)].pdf | 2019-07-03 |
| 6 | 201917026615-DRAWINGS [03-07-2019(online)].pdf | 2019-07-03 |
| 7 | 201917026615-DECLARATION OF INVENTORSHIP (FORM 5) [03-07-2019(online)].pdf | 2019-07-03 |
| 8 | 201917026615-COMPLETE SPECIFICATION [03-07-2019(online)].pdf | 2019-07-03 |
| 9 | abstract.jpg | 2019-08-09 |
| 10 | 201917026615-FORM 3 [22-01-2020(online)].pdf | 2020-01-22 |
| 11 | 201917026615-FORM-26 [28-04-2020(online)].pdf | 2020-04-28 |
| 12 | 201917026615-Verified English translation [21-09-2020(online)].pdf | 2020-09-21 |
| 13 | 201917026615-FORM 18 [10-11-2020(online)].pdf | 2020-11-10 |
| 14 | 201917026615-Proof of Right [16-09-2021(online)].pdf | 2021-09-16 |
| 15 | 201917026615-PETITION UNDER RULE 137 [16-09-2021(online)].pdf | 2021-09-16 |
| 16 | 201917026615-FER.pdf | 2021-10-18 |
| 17 | 201917026615-OTHERS [15-02-2022(online)].pdf | 2022-02-15 |
| 18 | 201917026615-Information under section 8(2) [15-02-2022(online)].pdf | 2022-02-15 |
| 19 | 201917026615-FORM 3 [15-02-2022(online)].pdf | 2022-02-15 |
| 20 | 201917026615-FER_SER_REPLY [15-02-2022(online)].pdf | 2022-02-15 |
| 21 | 201917026615-DRAWING [15-02-2022(online)].pdf | 2022-02-15 |
| 22 | 201917026615-COMPLETE SPECIFICATION [15-02-2022(online)].pdf | 2022-02-15 |
| 23 | 201917026615-CLAIMS [15-02-2022(online)].pdf | 2022-02-15 |
| 24 | 201917026615-ABSTRACT [15-02-2022(online)].pdf | 2022-02-15 |
| 25 | 201917026615-PETITION UNDER RULE 137 [16-02-2022(online)].pdf | 2022-02-16 |
| 26 | 201917026615-US(14)-HearingNotice-(HearingDate-18-11-2025).pdf | 2025-11-03 |
| 27 | 201917026615-FORM-26 [14-11-2025(online)].pdf | 2025-11-14 |
| 28 | 201917026615-Correspondence to notify the Controller [14-11-2025(online)].pdf | 2025-11-14 |

Search Strategy

| 1 | 201917026615E_02-07-2021.pdf |