
Method For Controlling User Interface And Program And Device

Abstract: The embodiments disclose a method for controlling a user interface that gives instructions to an application on the basis of an operation made by a user on a display on a screen. The method has: a step in which a device to which the screen belongs acquires the display; a step in which at least one feature of the acquired display is extracted; a step in which a user action is received; a step in which a prescribed operation is retrieved from a database by means of the received action and the at least one extracted feature; and a step in which the retrieved prescribed operation which is different from the action is applied in accordance with the display and instructions are thereby given to the application.


Patent Information

Application #
Filing Date
08 April 2019
Publication Number
27/2019
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
mehta@mehtaip.com
Parent Application

Applicants

HI CORPORATION
4-15-7, Nishi-shinjuku, Shinjuku-ku, Tokyo 1600023

Inventors

1. AOYAMA, Tomonobu
c/o HI CORPORATION, 4-15-7, Nishi-shinjuku, Shinjuku-ku, Tokyo 1600023
2. SASAKI, Tatsuo
c/o HI CORPORATION, 4-15-7, Nishi-shinjuku, Shinjuku-ku, Tokyo 1600023
3. KATAOKA, Seiichi
c/o HI CORPORATION, 4-15-7, Nishi-shinjuku, Shinjuku-ku, Tokyo 1600023

Specification

[0001] The present invention relates to a method for controlling a user interface, a program, and an apparatus.
BACKGROUND
[0002] A wide variety of applications run on devices having a CPU, such as mobile phones, portable terminals, music playback devices such as tablets, and laptop computers. These applications are provided with a user interface for accepting instructions from a user and for providing information to the user.
[0003] Each application running on these devices has its own uniquely designed user interface and realizes the interaction between the user and the device in its own manner. Of course, the basic user interface provided by the operating system is often unified across applications running on the same operating system. Likewise, within a group of applications developed by the same company under the same design concept, the same user interface is often employed.
[0004] However, since multiple operating systems exist in the world, and various peripheral devices are supported at the operating system level, even this basic user interface may differ from one operating system to another. Furthermore, even among a group of applications created under the same design concept within one company, parts of their user interfaces may differ. Moreover, even for the same application, the user interface may change between versions. Thus, the interaction between the user and the device often differs for each application.
[0005] In addition, in situations such as driving a car, the user may be unable to operate the mobile terminal, which can interfere with operating a running application.
[0006] For example, taking the case of enlarging the display screen, various instructions exist depending on the application: some applications expect a pinch-out, some a double click of the mouse, and some a rotation of the mouse wheel.
[0007] Accordingly, even to give the same instruction, each individual application often requires a different operation from the user. Furthermore, circumstances such as driving may prevent the user from performing the operation at all.
[0008] Under such circumstances, there exist techniques that replace a user operation made via an interface with another operation.
[0009] For example, there is a technique in which certain voice commands, called voice shortcuts, allow branches of a menu hierarchy to be skipped in an audio and/or manual user interface for accessing functions of a device. The device has means for storing a sequence of interactions by the user, means for finding a voice shortcut corresponding to the stored sequence of interactions, and means for delivering a message to the user notifying the user of the existence of the voice shortcut if one is found (see, for example, Patent Document 1).
[0010] Further, in a user interface for querying and displaying records from a database, a given user profile is operated on and cooperates with the query in the same way as other criteria. For example, there exists a technique in which the user incorporates an explicit selection into the user profile in a rule format, and an "implicit" profile can be added to the query in the same manner as a typical selection profile (see, for example, Patent Document 2).
[0011] However, such operation-replacement techniques are directed at saving the operator labor by realizing shortcuts and the like for a particular application. With such prior art, therefore, each application must be designed individually so that it performs the operation corresponding to each user operation.
[0012] Moreover, the application developer must build a user interface for each application, and the user must learn the operations required by each user interface.
[0013] Therefore, in various usage situations, further improvements in ease of access by the user, ease of use, and the like (accessibility) are desired for various applications.
CITATION LIST
Patent Document
[0014]
Patent Document 1: Japanese Patent Publication No. 2000-231398
Patent Document 2: Japanese Patent Publication No. JP-T-2003-529154
Summary of the Invention
Problems to be Solved by the Invention
[0015] The disclosed technology aims to improve ease of access by the user, ease of use, and the like (accessibility) for various applications.
Means for Solving the Problems
[0016] One embodiment discloses a method for controlling a user interface that gives instructions to an application on the basis of a user operation on a display on a screen, the method comprising: a step in which a device having the screen acquires the display; a step of extracting one or more features present in the acquired display; a step of receiving an operation of a user; a step of retrieving a predetermined operation from a database using the received operation and the extracted one or more features; and a step of giving an instruction to the application by applying the retrieved predetermined operation, which is different from the received operation, in accordance with the display.
Effect of the Invention
[0017] The disclosed technology can further improve ease of access by the user, ease of use, and the like (accessibility) for various applications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018]
FIG. 1 is a diagram illustrating an example of a device configuration in one embodiment.
FIG. 2 is a diagram illustrating an outline of the functions of an embodiment.
FIG. 3 is a diagram illustrating an example of operations defined in one embodiment.
FIG. 4 is a diagram showing concrete display examples of applications.
FIG. 5 is a diagram showing an example of designating enlargement and reduction of the screen.
FIG. 6 is a flowchart illustrating the operation of one embodiment.
FIG. 7 is a functional block diagram of an embodiment.
FIG. 8 is a diagram illustrating an example of an operating environment of a user interface control program in which an embodiment is implemented.
FIG. 9 is a diagram illustrating an example of creating a database for extracting appropriate instructions and operations from a user operation and the feature extraction results of a screen display.
FIG. 10 is a diagram illustrating an example of a hardware configuration.
DESCRIPTION OF THE INVENTION
[0019] FIG. 1 shows an example of a device configuration of an embodiment. This configuration includes a mobile terminal 100, a server device 110, and an interface device 160. The server device 110 and the interface device 160 need not necessarily be present; embodiments may be implemented with the mobile terminal 100 alone.
[0020] An embodiment may be implemented, for example, as a program running on the mobile terminal 100. Embodiments may also be implemented by the mobile terminal 100 operating in conjunction with another device (for example, the server device 110 or the interface device 160). Embodiments may be implemented as a method invention. Furthermore, embodiments may be implemented as a product invention such as a device.
[0021] The interface device 160 receives, for example, information such as voice 140 emitted from a user 142, a gesture 152 or a gesture 154 made with a human finger or hand, and a signal from a switch SW 130. The interface device 160 converts the received information into an electrical signal and may send the converted information to the mobile terminal via communication 145, either wirelessly or by wire.
[0022] The mobile terminal 100 is equipped with communication functions such as a mobile telephone network, a wireless LAN (WiFi), Bluetooth (registered trademark), and near field communication (NFC). Using these communication functions, the mobile terminal 100 can exchange various information by communicating with various computers, such as other mobile terminals or the server device 110, via a network system such as the Internet or via P2P communication.
[0023] The mobile terminal 100 may also receive information such as the voice 140, the finger gesture 152, the gesture 154, and the signal from the switch SW 130 directly, without passing through the interface device 160. The mobile terminal 100 includes one or more cameras 102 and can, for example, capture the finger gesture 152 or the gesture 154 as a moving image or a still image. The voice 140 emitted from the user 142 can be captured by the mobile terminal 100 through a microphone 106. Further, an instruction input by the user 142 using the switch SW 130 connected to the mobile terminal may be taken into the mobile terminal 100. The mobile terminal 100 may be provided with a terminal (not shown) for connecting the switch SW 130. Alternatively, the mobile terminal 100 may be connected to the switch SW 130 wirelessly. The mobile terminal includes a display 104. In this embodiment, pattern recognition of the screen display of the display 104 is utilized.
[0024] In one embodiment, as will be described later, update information for the database may be provided to the mobile terminal from the server device 110 via a network NW 120. Alternatively, the mobile terminal 100 may send a search command via the network NW 120 to a database residing on the server device 110, and the mobile terminal 100 may acquire the search result from the server device 110. The database described later may reside on any hardware connected to the network NW 120.
[0025] Although the mobile terminal 100 is shown in FIG. 1 as having the shape of a mobile phone, the mobile terminal 100 is not limited to a mobile phone; it may, for example, be hardware fixedly embedded in the front panel of an automobile. Alternatively, the mobile terminal 100 may be a watch having a communication function or any other wearable device. The interface device 160 may be hardware fixed to the steering wheel of a vehicle.
[0026] FIG. 2 is a diagram illustrating an outline of the functions of an embodiment.
[0027] More particularly, FIG. 2 shows an embodiment for controlling the user interface of a map display application running on the mobile terminal.
[0028] First, the operation of the map application in FIG. 2 is described. An embodiment is then described.
[0029] FIG. 2(A) shows a screen display 210 of the mobile terminal 100. An arrow mark 211 is a symbol indicating the destination on the map. A star mark 212 is a symbol indicating the current location on the map (the position where the mobile terminal 100 currently exists).
[0030] The display "destination" 214 surrounded by a square frame is a button used to instruct the application to display the destination in the center of the screen. When the user taps the position of the button "destination" 214 with a finger, an instruction to display the destination in the center of the screen can be given to the map application. The screen display 210 in FIG. 2(A) shows the state immediately after the button "destination" 214 has been tapped. Accordingly, the arrow mark 211 indicating the destination is displayed in the center of the screen display 210.
[0031] The display "current position" 216 surrounded by a square frame is a button used to instruct the application to display the current position in the center of the screen. When the user taps the position of the button "current position" 216 with a finger, an instruction to display the current position in the center of the screen can be given to the map application.
[0032] The screen display 220 in FIG. 2(B) shows the state immediately after the button "current position" 216 has been tapped. Accordingly, the star mark 212 indicating the current position is displayed in the center of the screen display 220.
[0033] In the present embodiment, an example of giving instructions to the map application by the user's voice is described. The details of the operation are discussed later; an outline of the operation of the embodiment is given below with reference to FIG. 2.
[0034] The map application has a function of accepting taps on the screen by the user. For this map application, the present embodiment can give the map application an instruction equivalent to a tap not by the tap operation itself but, for example, by the user's voice.
[0035] To this end, in the embodiment, the display screen 210 is analyzed, and it is recognized that the button "destination" 214 and the button "current position" 216 are present on the screen display of the map application. This recognition is made possible in advance for the feature extraction of the present embodiment by using existing pattern recognition techniques well known to those skilled in the art. Further, since the button "destination" 214 and the button "current position" 216 are present in the lower left portion of the screen display, the map application can also be identified. The map application can also be identified from the feature that the button "destination" 214 and the button "current position" 216 have a square shape. By identifying the map application, its behavior can be predicted more precisely. It is therefore desirable to identify the type of the map application.
[0036] Alternatively, if the type of the map application can be obtained from the operating system, the screen display layout of the map application can be inferred from it; for example, pattern recognition for the button "destination" 214 and the button "current position" 216 can be started from predetermined positions on the screen display, which makes it possible to increase the speed of pattern recognition.
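As an illustration of the feature extraction described in paragraphs [0035] and [0036], the following is a minimal Python sketch that locates the buttons by template matching with OpenCV. The library choice, the template file names, and the match threshold are illustrative assumptions; the specification does not prescribe a particular pattern recognition technique.

# Minimal sketch: locating known buttons in a captured screen image by
# template matching. Template paths and the threshold are assumptions.
import cv2

def find_button(screen_bgr, template_path, threshold=0.85):
    """Return the (x, y) center of the best template match, or None."""
    template = cv2.imread(template_path)              # e.g. a cropped button image
    result = cv2.matchTemplate(screen_bgr, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None                                   # button not on this screen
    h, w = template.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)

screen = cv2.imread("screen_capture.png")
features = {
    "destination_button": find_button(screen, "templates/destination.png"),
    "current_position_button": find_button(screen, "templates/current_position.png"),
}
# If both buttons are found in the lower-left area, the map application in
# question can be assumed to be running (cf. paragraph [0035]).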
[0037] In the present embodiment, when the user's voice says "current position", the operation of tapping the button "current position" 216 is given to the map application. That is, for example, when the user pronounces "current position", the program that implements the present embodiment recognizes the user's pronunciation of "current position". The program that implements the present embodiment then gives the operating system of the mobile terminal 100 an event indicating that a tap has been made at the screen position where the button "current position" 216 exists. The operating system then notifies the map application that a tap has been made at the screen position of the button "current position" 216. The map application therefore recognizes that an instruction by tapping the button "current position" 216 has been given and, as shown in the screen display 220 in FIG. 2(B), displays the map on the screen display 220 so that the star mark 212 indicating the current position is positioned at the center of the screen display.
[0038] If the user pronounces "mokutekichi" (destination), the program of the present embodiment informs the operating system that the button "destination" 214 has been tapped. The operating system then notifies the map application that a tap has been made at the screen position of the button "destination" 214. The map application recognizes this as an instruction by tapping the button "destination" 214 and, as shown in the screen display 210 in FIG. 2(A), displays the map on the screen display 210 so that the arrow mark 211 indicating the destination is positioned at the center of the screen display. Details of these operations will be described later with reference to FIGS. 7 and 8 and the like.
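The substitution described in paragraphs [0037] and [0038] can be sketched as follows. The helpers recognize_speech and inject_tap are hypothetical placeholders: how the tap event is actually handed to the operating system is platform-specific and is not detailed in the specification.

# Minimal sketch of substituting a recognized voice command for a tap on the
# corresponding button. All helper functions are hypothetical placeholders.

VOICE_TO_BUTTON = {
    "current position": "current_position_button",
    "mokutekichi": "destination_button",   # "destination" in Japanese
}

def handle_voice_command(audio, features, recognize_speech, inject_tap):
    phrase = recognize_speech(audio)               # e.g. "current position"
    button_key = VOICE_TO_BUTTON.get(phrase)
    if button_key is None or features.get(button_key) is None:
        return False                               # no matching button on screen
    x, y = features[button_key]                    # center found by feature extraction
    inject_tap(x, y)                               # OS delivers the tap to the map app
    return True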
[0039] FIG. 3 is a diagram illustrating an example of operations defined in one embodiment.
[0040] The table 300 of FIG. 3 defines how a user operation, which may be either a gesture or speech, causes a predetermined instruction to be executed for each of a plurality of applications (application X, application Y, application Z). The table 300 is described in detail below.
[0041] Column 310 indicates which gesture is recognized as the user operation in order to give the corresponding instruction to the application.
[0042] Column 312 indicates which voice input is recognized as the user operation in order to give the corresponding instruction to the application.
[0043] Column 314 defines the instruction to be given to the application.
[0044] Column 316 shows by what user operation application X originally executes the corresponding instruction.
[0045] Column 318 shows by what user operation application Y originally executes the corresponding instruction.
[0046] Column 319 shows by what user operation application Z originally executes the corresponding instruction.
[0047] Take, for example, the case where the mobile terminal 100 recognizes, via the interface device 160 or the camera 102 of the mobile terminal 100, the gesture "hand with the index finger pointing down" described in row 340 of column 310 as the user operation. In this case, the instruction becomes "display the current position in the center of the screen", as shown in row 340 of column 314 (the same applies if the user's pronunciation of "current position" is recognized).
[0048] In the above case, if application X is running, the program that implements the present embodiment gives application X the operation of tapping the display portion of "current position" in application X. As a result, application X displays the current location in the center of the screen. The operating system may be involved in giving this operation to application X.
[0049] Similarly, when application Y is running, the program that implements the present embodiment gives application Y the operation of clicking the display area of the button "current position" in application Y. As a result, application Y displays the current position in the center of the screen. The operating system may be involved in giving this operation to application Y.
[0050] Further, when application Z is running, the program that implements the present embodiment gives application Z the operation of tapping the symbol B. As a result, application Z displays the current position in the center of the screen.
[0051] When another user operation (gesture or voice) is recognized, the corresponding operation can likewise be given to the currently running application so that the corresponding instruction is executed, as shown in the table 300.
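By way of illustration, the mapping defined by table 300 can be represented as a small lookup structure. The following Python sketch reproduces only the rows actually described in the text (rows 340 and 330); the field names and data layout are illustrative assumptions, not the format prescribed by the specification.

# Minimal sketch of table 300: a user operation (gesture or voice) maps to an
# instruction, and each application has its own original operation for it.
TABLE_300 = [
    {
        "gesture": "index finger pointing down",            # row 340, column 310
        "voice": "current position",                          # row 340, column 312
        "instruction": "display current position at screen center",
        "operation": {
            "application X": ("tap", "current position display portion"),
            "application Y": ("click", "current position button"),
            "application Z": ("tap", "symbol B"),
        },
    },
    {
        "gesture": "index finger stretched out",             # row 330 (cf. FIG. 4(A))
        "voice": "shuppatsuchi",                              # "departure point"
        "instruction": "display departure point at screen center",
        "operation": {
            "application Y": ("click", "departure point button"),
        },
    },
]

def lookup_operation(app_name, gesture=None, voice=None):
    """Return the per-application operation for a recognized gesture or voice."""
    for row in TABLE_300:
        if (gesture and row["gesture"] == gesture) or (voice and row["voice"] == voice):
            return row["operation"].get(app_name)
    return None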
[0052] In the table 300, both a gesture and a voice are shown as the user operation corresponding to each instruction, but only one of them may correspond to a particular instruction.
[0053] Note that the user's gesture is recognized, for example, from video captured by the camera 102 of the mobile terminal 100, using pattern recognition techniques well known to those skilled in the art. The user's voice is recognized, for example, from a sound signal obtained by the microphone 106 of the mobile terminal 100 or by the interface device 160, using pattern recognition techniques well known to those skilled in the art.
[0054] The user operation may also be recognized using the switch SW 130 of FIG. 1, for example by recognizing how many times the user presses the switch within a predetermined time (not shown). In addition, the user operation may be recognized by the various sensors 1162 provided in the mobile terminal 100 (not shown).
[0055] With the present embodiment, an instruction that originally could not be given to an application without a predetermined, cumbersome operation such as tapping the screen or clicking can now be given to the application by a simple user operation such as a simple gesture or a spoken utterance.
[0056] Moreover, for a plurality of applications that require different operations to give a similar instruction, the same user operation can be used to give the same instruction.
[0057] In addition, the user interface can be changed freely without modifying each application.
[0058] Further, if a device that automatically generates predetermined switch operations is connected instead of the SW 130, the present embodiment can also be used to check the behavior of an application automatically, without going through user operations.
[0059] Further, in a case where the user cannot operate the mobile terminal 100, such as while driving a car, a desired instruction can be given to the application via a gesture or the user's voice.
[0060]
 Figure 4 is a diagram showing a specific display example of a plurality of application shown in FIG.
[0061]
 FIG. 4 (A) shows an example of a screen of an application Y shown in column 318 of FIG. Below the map 410, in order from the left, departure point 414, the current location 416, there is displayed the destination 418. Then, the map 410, a cross mark 411 showing the departure, black dot mark 412 indicating the current location is shown marked 413 of star indicating the destination, connecting these routes are indicated by black line ing. For example, by clicking the departure point 414, a cross mark 411 showing the departure point is displayed in the center of the map 410. Further, as shown on line 330 of FIG. 3, the user gesture stretch the index finger, or by speaking a voice "Shuppatsuchi", the program that implements the present embodiment, the display of the button "departure" give the operation of clicking a part in the application Y. Thus, similarly to the original operation of the mark 411 of the cross indicating the departure point, and is displayed in the center of the map 410.
[0062]
 Further, below the map 410, in order from the left, departure point 414, the current location 416, a feature that the display of the destination 418 is present, the program that implements the present embodiment, it is displayed on the screen by the application Y it can be recognized. By which application to recognize whether the working, so that more operations for proper instructions can be given to the application.
[0063]
 FIG. 4 (B) shows an example of a screen of an application Z shown in column 319 of FIG. The lower left portion of the screen display 420, a symbol A426 inverted triangular, and double circle symbol B428 is displayed. These two symbols are present at the bottom left of the screen, it can also be recognized that the application Z is running.
[0064]
 Black circles mark 422 represents the current location. Arrow tip 423 indicates the destination. For example, by tapping the symbols A426 inverted triangle, an arrow tip 423 indicating the destination is displayed on the center of the screen display 420. As shown on line 350 of FIG. 3, the user gesture extend the thumb upward, or by speaking a voice "mokutekichi" program implementing the present embodiment, tap the symbol A give operation to the application Z. Thus, similarly to the original operation of the, arrow tip 423 indicating the destination, and is displayed in the center of the screen display 420.
[0065]
 Figure 5 is a diagram showing an example for designating an enlargement and reduction of the screen.
[0066]
 FIG. 5 (B) shows a screen display 512 of an enlarged screen display 510 of FIG. 5 (A). FIG. 5 (C) shows a screen display 514 further enlarged screen display 512 in FIG. 5 (B).
[0067]
 Gesture 520 of fist in FIG. 5 (D) is the same as the gesture shown in the row 370 of FIG. For example, when the screen display 514 shown in FIG. 5 (C) is displayed, if the gesture 520 is recognized, the display is reduced to the screen display 512 in FIG. 5 (B). Furthermore, for example, when the screen display 512 shown in FIG. 5 (B) is displayed, if the gesture 520 is recognized, the display is reduced to the screen display 510 of FIG. 5 (A).
[0068]
 Par gesture 530 shown in FIG. 5 (E) is the same as the gesture shown in the row 360 of FIG. For example, when the screen display 510 shown in FIG. 5 (A) is displayed, if the gesture 530 is recognized, the display is enlarged on the screen display 512 in FIG. 5 (B). Furthermore, for example, when the screen display 512 shown in FIG. 5 (B) is displayed, if the gesture 530 is recognized, the display is enlarged on the screen display 514 in FIG. 5 (C).
[0069]
 Arrows 522 and arrows 532 indicate the direction of the screen transition.
[0070]
 Thus, without touching the screen, it can be a predetermined gesture of the user, realizing the operation of the more easily screen enlargement or reduction. Above it, the user is the same when performing the operation of the utterance of "reduction" operation and the "expansion" of speech.
[0071]
 Also, other, the operation of the user, by recognizing the various sensors 1162, may be configured to perform an operation of enlargement or reduction.
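The gesture-to-zoom mapping of FIG. 5 can be sketched as a small state transition, where each screen display corresponds to one discrete zoom level. This is purely illustrative; the gesture names and level labels are assumptions based on the figure description, and the actual substituted operation (for example, a pinch or a zoom button press) depends on the application as in table 300.

# Minimal sketch of the transitions 510 <-> 512 <-> 514 driven by the
# fist (reduce) and open-hand (enlarge) gestures of FIG. 5.
ZOOM_LEVELS = ["510", "512", "514"]   # widest view first, most enlarged last

def next_zoom_level(current, gesture):
    i = ZOOM_LEVELS.index(current)
    if gesture == "open hand" and i < len(ZOOM_LEVELS) - 1:
        return ZOOM_LEVELS[i + 1]     # enlarge (cf. arrow 532)
    if gesture == "fist" and i > 0:
        return ZOOM_LEVELS[i - 1]     # reduce (cf. arrow 522)
    return current                    # already at the end of the range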
[0072] FIG. 6 is a flowchart illustrating the operation of one embodiment.
[0073] In step S602, the program of the embodiment acquires the screen display.
[0074] In step S604, the program of the embodiment performs, for example, the following operations.
1. It analyzes the acquired screen display and recognizes the running application.
2. Based on the recognized application, it identifies each object present in the acquired screen display and identifies its position.
[0075] Note that in this step the running application is recognized by analyzing the acquired screen display, but which application is running may instead be obtained from the operating system.
[0076] In step S606, the program of the embodiment acquires the user operation (speech, gesture, etc.).
[0077] In step S608, the program of the embodiment recognizes the acquired user operation (speech, gesture, etc.).
[0078] In step S610, the program of the embodiment searches the database using the operating system information about which application is running, the recognized running application, the recognized objects and their positions, and the recognized user operation, and extracts the operation to be applied on behalf of the user to the running application. As for the operating system information, the program that implements the present embodiment may acquire information directly identifying the operating system from the operating system. Likewise, as described above, information identifying the running application may be obtained directly from the operating system.
[0079] In step S612, the program of the embodiment applies the extracted operation to the running application on behalf of the user.
[0080] In step S614, the running application thereby performs the desired operation.
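The flow of steps S602 through S614 can be summarized in the following minimal sketch. Every helper passed in (capture_screen, extract_features, acquire_user_input, recognize_operation, search_database, apply_operation) is a hypothetical placeholder for a functional unit described above, not an API defined by the specification.

# Minimal sketch of the control flow of FIG. 6 (steps S602-S614).
def control_loop(os_info, capture_screen, extract_features, acquire_user_input,
                 recognize_operation, search_database, apply_operation):
    while True:
        screen = capture_screen()                     # S602: acquire the screen display
        app, objects = extract_features(screen)       # S604: recognize app and objects
        raw_input = acquire_user_input()              # S606: speech, gesture, etc.
        user_op = recognize_operation(raw_input)      # S608: recognize the operation
        operation = search_database(os_info, app, objects, user_op)  # S610
        if operation is not None:
            apply_operation(app, operation)           # S612: apply on behalf of the user
            # S614: the running application then performs the desired operation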
[0081] Thus, by recognizing the screen display, this embodiment can flexibly specify the operation corresponding to the appropriate instruction for the displayed screen. Thereby, even when the user cannot directly operate the screen of the mobile terminal, such as while driving a car, the appropriate instruction that the user wants can be given to the mobile terminal.
[0082] Moreover, this can be achieved without tampering with the application program itself.
[0083] FIG. 7 is a functional block diagram of an embodiment.
[0084] A screen display acquisition unit 702 acquires the screen display of the mobile terminal 100. The screen display may be image data. The screen display may be obtained, for example, from the operating system. The screen display may also be obtained from the drawing signal sent to the screen. Alternatively, the screen display may be obtained by imaging the screen with a camera or the like.
[0085] A screen display feature extraction unit 704 extracts one or more features from the acquired screen. The features include symbol displays to be tapped, button displays, the positions at which these are present, and features that can identify the application.
[0086] A user operation acquisition unit 706 acquires, for example, images of the movement of the user's hand, speech voice information, presses of the switch, and the like.
[0087] Information obtained from a user operation recognition unit 708 includes, for example, the recognition result of the state of the user's hand, the speech recognition result, the user's facial expression, the user authentication result, and the number of switch presses within a given time.
[0088] An operation search unit 710 searches a database 712 using the information from the screen display feature extraction unit 704 and the user operation recognition unit 708 together with other information (such as information identifying the operating system), and extracts the operation corresponding to the requested instruction. As already mentioned, the operation search unit 710 may acquire from the operating system information on the type of application currently displayed in the foreground of the display 104 of the mobile terminal 100, and may use this information to search the database 712. Alternatively, the information on the type of application currently displayed in the foreground of the display 104 of the mobile terminal 100 may be extracted by the screen display feature extraction unit 704 from the features of the acquired screen.
[0089] An application instruction unit 714 gives the extracted operation to the running application on behalf of the user. As a result, the intended instruction to the application is made, and the application performs processing based on the instruction.
[0090] FIG. 8 is a diagram illustrating an example of an operating environment of a user interface control program in which an embodiment is implemented.
[0091] An application 806 is the application to which the present embodiment gives operations on behalf of the user.
[0092] A user interface control program 804 is a program in which an embodiment is implemented. The application 806 and the user interface control program 804 operate on an operating system 802 present in the mobile terminal 100. A user operation 810 (gesture, voice, etc.) is acquired by hardware 800 and passed to the operating system 802.
[0093] The above operating environment is an example, and embodiments are not limited to this operating environment.
[0094] FIG. 9 is a diagram illustrating an example of creating a database for extracting appropriate instructions and operations from a user operation and the feature extraction results of a screen display.
[0095] As shown in FIG. 9, a database creation unit 920 can store entry information 930 in the database when an operator 910 inputs the entry information 930 to be stored.
[0096] The entry information 930, as shown in the column "instruction to the application", is an entry whose content is "display the current position in the center of the screen" for the application.
[0097] That is, when the "screen display features" are "application Z", "presence of symbol B", and "symbol B at a predetermined position", and either the user operation (a) or the user voice (b) below is recognized, the operation "tap symbol B" to be applied to the screen display can be extracted.
(a) the "user operation" is "index finger pointing down";
(b) the "user's voice" is "current position".
[0098] An example of searching a database in which such entry information 930 is stored is as follows.
[0099] By searching the database with the recognized "screen display features" and "user operation" as search keys, the "operation to be applied to the screen display" can be extracted. Alternatively, by searching the database with the recognized "screen display features" and "user's voice" as search keys, the "operation to be applied to the screen display" can be extracted.
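The entry of FIG. 9 and the search described in paragraph [0099] can be sketched as follows. The entry is keyed on the screen display features together with either the recognized gesture or the recognized voice; the field names and record layout are illustrative assumptions rather than a format defined by the specification.

# Minimal sketch of the entry information 930 and the two search-key variants.
DATABASE = [
    {
        "screen_features": ("application Z", "symbol B present",
                            "symbol B at predetermined position"),
        "user_operation": "index finger pointing down",   # condition (a)
        "user_voice": "current position",                  # condition (b)
        "instruction": "display the current position in the center of the screen",
        "operation": ("tap", "symbol B"),
    },
]

def search(screen_features, user_operation=None, user_voice=None):
    """Return the operation to apply to the screen display, or None."""
    for entry in DATABASE:
        if entry["screen_features"] != tuple(screen_features):
            continue
        if user_operation == entry["user_operation"] or user_voice == entry["user_voice"]:
            return entry["operation"]
    return None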
[0100] As described above, the database entry for the current-location display when application Z is operating is thus completed.
[0101] By repeating the above processing and creating an entry for each application so as to cover each operation, the database is completed.
[0102] In the above embodiments, a user operation is received and the received user operation is recognized, but the embodiments are not limited to being directed at user operations. For example, imaging information of an object such as an image drawn on paper, a painting, characters, the display of a screen, a photograph, or scenery may be received and its contents recognized.
[0103] The present embodiments may be implemented as one or more programs running on one or more pieces of hardware. Further, the embodiments may be constituted by a plurality of devices and implemented by the cooperation of the plurality of devices through a communication line or the like.
[0104] In one embodiment, when the screen display of the mobile terminal is received, the screen display may be encrypted.
[0105] The screen display to be acquired may be, for example, a part of the screen display of the mobile terminal 100.
[0106] Further, a portion of the screen display, such as a predetermined mark, may be displayed based on the program implementing the present embodiment. In this way, it can be confirmed that the screen display is the subject of user interface control by the program implementing the present embodiment.
[0107] FIG. 10 is a diagram showing the hardware configuration of the mobile terminal 100 and the server device 110. FIG. 10 describes the hardware configurations of the mobile terminal 100 and the server device 110 collectively.
[0108] The hardware configuration includes a display control unit 1110, a display portion 1112, a CPU 1120, a memory 1130, a communication control unit 1140, an external memory control unit 1150, a storage medium 1152, an input interface 1160, various sensors 1162, a camera 1164, a microphone 1166, an output interface 1170, a speaker 1172, a display 1174, a vibrator 1176, a touch panel control unit 1180, a touch panel 1182, and the like. Although a wireless network 1142 is shown communicatively coupled to the communication control unit 1140, a wired network may be communicatively coupled instead. The components may be interconnected by a bus 1190.
[0109] The mobile terminal 100 and the server device 110 may lack some of this hardware, and other hardware may be present.
[0110] All or part of this embodiment may be implemented by a program. The program may be stored in the storage medium 1152. The storage medium 1152 refers to one or more non-transitory storage media having a structure. Examples of the storage medium 1152 include magnetic recording media, optical discs, magneto-optical recording media, and nonvolatile semiconductor memories. Semiconductor memories include RAM, ROM, SD memory, and the like. Magnetic recording media include HDDs, flexible disks (FD), and magnetic tape (MT). Optical discs include DVD (Digital Versatile Disc), DVD-RAM, CD-ROM (Compact Disc - Read Only Memory), and CD-R (Recordable)/RW (ReWritable). Magneto-optical recording media include MO (Magneto-Optical disk). All or part of the embodiments may be implemented by the CPU reading and executing the program stored in the recording medium.
[0111] Each embodiment is intended to aid understanding of the present invention and is not intended to limit the scope of the present invention. Further, the plurality of embodiments described above are not mutually exclusive. Thus, combining elements of different embodiments is also contemplated unless a contradiction arises. In the inventions according to the method and program claims, the order of processing may be changed unless a contradiction arises, and a plurality of processes may be performed simultaneously. It goes without saying that such embodiments are also included in the technical scope of the invention as claimed.
[0112] Further, the present invention also includes cases where, when a computer executes the read program, the functions of the above embodiments are realized not only on the basis of the instructions of the program code, but also by another program, such as the operating system running on the computer, performing part or all of the actual processing.
[0113] Further, each of the components of the various embodiments may be implemented on a plurality of physically separate pieces of hardware. Further, each of the components of the various embodiments may be implemented in a distributed manner across a plurality of virtual machines running on a single computer.
[0114] This application claims priority based on Japanese Patent Application No. 2016-201171 filed with the Japan Patent Office on October 12, 2016, the entirety of which is incorporated herein by reference.
DESCRIPTION OF SYMBOLS
[0115]
100 mobile terminal
102 camera
104 display
106 microphone
110 server device
120 network NW
130 switch SW
140 voice
142 user
152 gesture
154 gesture
160 interface device

WE CLAIM

1. A method for controlling a user interface that gives instructions to an application on the basis of a user operation on a display on a screen, the method comprising:
 a step in which a device having the screen acquires the display;
 a step of extracting one or more features present in the acquired display;
 a step of receiving an operation of a user;
 a step of retrieving a predetermined operation from a database using the received operation and the extracted one or more features; and
 a step of giving an instruction to the application by applying the retrieved predetermined operation, which is different from the received operation, in accordance with the display.
2. The method according to claim 1, wherein the extracting step extracts the one or more features using a pattern recognition technique on the display.
3. The method according to claim 1 or 2, wherein the operation is a body motion of the user captured by a camera or a voice of the user acquired from a microphone.
4. The method according to any one of claims 1 to 3, wherein the device having the screen is a device different from the device on which the application runs.
5. A program for causing the device having the screen to execute the method according to any one of claims 1 to 4.
6. A device for controlling a user interface that gives instructions to a running application on the basis of a user operation on a display on a screen, the device comprising:
 a display acquisition unit that acquires the display;
 a feature extraction unit that extracts one or more features present in the acquired display;
 an operation reception unit that receives an operation of a user;
 a search unit that searches a database for a predetermined operation using the received operation and the extracted one or more features; and
 an instruction unit that gives an instruction to the application by applying the retrieved predetermined operation, which is different from the received operation, in accordance with the display.

Documents

Application Documents

# Name Date
1 201917014164.pdf 2019-04-08
2 201917014164-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [08-04-2019(online)].pdf 2019-04-08
3 201917014164-STATEMENT OF UNDERTAKING (FORM 3) [08-04-2019(online)].pdf 2019-04-08
4 201917014164-PRIORITY DOCUMENTS [08-04-2019(online)].pdf 2019-04-08
5 201917014164-FORM 1 [08-04-2019(online)].pdf 2019-04-08
6 201917014164-DRAWINGS [08-04-2019(online)].pdf 2019-04-08
7 201917014164-DECLARATION OF INVENTORSHIP (FORM 5) [08-04-2019(online)].pdf 2019-04-08
8 201917014164-COMPLETE SPECIFICATION [08-04-2019(online)].pdf 2019-04-08
9 201917014164-certified copy of translation (MANDATORY) [16-04-2019(online)].pdf 2019-04-16
10 201917014164-OTHERS-120419.pdf 2019-04-22
11 201917014164-OTHERS-120419-.pdf 2019-04-22
12 201917014164-Form 5-120419.pdf 2019-04-22
13 201917014164-Correspondence-120419.pdf 2019-04-22
14 201917014164-OTHERS-220419.pdf 2019-04-26
15 201917014164-Correspondence-220419.pdf 2019-04-26
16 abstract.jpg 2019-05-16
17 201917014164-Proof of Right (MANDATORY) [29-06-2019(online)].pdf 2019-06-29
18 201917014164-FORM-26 [29-06-2019(online)].pdf 2019-06-29
19 201917014164-Power of Attorney-040719.pdf 2019-07-16
20 201917014164-OTHERS-040719.pdf 2019-07-16
21 201917014164-Correspondence-040719.pdf 2019-07-16
22 201917014164-FORM 3 [05-09-2019(online)].pdf 2019-09-05
23 201917014164-FORM 18 [15-07-2020(online)].pdf 2020-07-15
24 201917014164-FER.pdf 2021-10-18

Search Strategy

1 14164E_30-07-2021.pdf