Programmable Display Device And Screen Operation Processing Program Therefor
Abstract:
A programmable display device is provided with a display unit (101), a coordinate input unit (102), a display processing unit (107), an operation processing unit (108), a control unit (110), and a switching unit that switches so as to enable or disable operation using screen gestures. A display screen comprises a screen-gesture application area in which it is possible to change display content and a screen-gesture non-application area in which it is not possible to change display content. The switching unit enables operation of operation objects in the screen-gesture application area and the screen-gesture non-application area when operation by screen gestures is disabled, and disables operation of operation objects in the screen-gesture application area and enables screen gestures when operation by screen gestures is enabled. The display processing unit (107) performs predetermined display in the screen-gesture application area when operation by screen gestures is enabled.
c/o Mitsubishi Electric Corporation, 7-3, Marunouchi 2-chome, Chiyoda-ku, Tokyo 100-8310
2. KAWAI Hidenori, c/o Mitsubishi Electric Corporation, 7-3, Marunouchi 2-chome, Chiyoda-ku, Tokyo 100-8310
Specification
DESCRIPTION
Field
[0001] The present invention relates to a programmable display device and a screen-operation processing program therefor.
Background
[0002] Computer devices including a display device and a coordinate input unit capable of detecting one or more touch-operation coordinates, such as tablet computers, smartphones, and programmable display devices used as display/operation terminals in industry, have come into wide use. With the advanced features and high screen resolution of such computer devices, the definition of display content has become higher. Meanwhile, when many pieces of information are displayed (arranged) on a screen having a limited size, the size of each element (object) becomes small, and this reduces visibility and operability. Particularly in touch operations, it is difficult to precisely indicate the coordinates of an object that has an insufficient size with respect to human fingers, and this tends to cause incorrect operations. The term "object" here refers to a virtual component on a computer, such as a component having a switch function that is operated by touch and a component having an information presenting function such as a graph or a lamp.
[0003] In response to such a problem, there are some computer devices that include a pen-shaped attachment (a stylus) for pointing at small objects. The stylus is
effective for operations on small objects and for operations requiring precision, such as handwritten character input. However, the stylus has other problems: it can be lost or damaged, and it is unsuitable for simultaneous operations at two or more points because, when operating a portable computer device, one hand holds the device while the other holds the stylus.
[0004] As disclosed in Non Patent Literature 1, for example, there has been a product that provides a system for enlarging or reducing part of a display screen in response to the operation of touching two points simultaneously with two fingers and spreading the two points apart (referred to as "pinch open" or "pinch-out") and the operation of narrowing the distance between the two points (referred to as "pinch close" or "pinch-in"), respectively. With this system, in a dynamic manner, it is possible to switch operations between an enlarged display of a target to be browsed or operated that is performed as needed to improve the visibility and operability and a reduced display of the target that is performed to overview and check many pieces of information. Further, display content can be scrolled (change of display positions) by moving a touch position while keeping touch on one point and then releasing the touch (referred to as "dragging", or as "flicking" when hitting the point quickly). These operations including such movement of touch positions and change in the touch state are referred to as "gesture" (a gesture operation).
[0005] While there is such an operation of moving a touch position while maintaining the touch as described above, there are various types of gestures such as a tapping operation that is an operation of touching and
releasing a point quickly, and a double-tapping operation that is an operation of consecutively tapping a point twice. In order to correctly determine the gesture type, the determination must be made on the basis of the number of touch points, the length of time the touch is maintained, and the change in the touch coordinates. For example, for an object representing a virtual switch arranged on a screen with the characteristics described above, it is possible to visually present a touch operation acting on the switch by switching the image of the switch to an image of the pressed state at the moment the switch is touched. However, it is not preferable to activate a function assigned to the switch (for example, a function of switching screens) simultaneously with the touch. This is because, at the moment of the touch, it has not yet been distinguished whether the touch is an operation of the switch, a dragging or flicking operation, or the first touch of a pinch-open or pinch-close operation that accidentally falls in the area of the switch; activating the switch at that moment may cause an operation unintended by the user. In view of this, in a general computer device supporting gesture operations, the type of operation is determined when the touch on an object requiring an operation, such as a switch, is released. For convenience, the state where an object operates at the time the touch is released is referred to as "OFF synchronization" in the present specification, whereas the state where an object operates at the moment it is touched is referred to as "ON synchronization".
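The OFF-synchronization determination described above can be illustrated by a small sketch that classifies a completed touch from the number of touch points, the touch duration, and the displacement of the touch coordinates. This is a hypothetical example: the function name and all thresholds are assumptions, not values taken from the specification.

```python
# Hypothetical thresholds; the specification does not give concrete values.
TAP_MAX_SECONDS = 0.3   # longest touch still counted as a tap
DRAG_MIN_PIXELS = 10    # smallest movement treated as a drag or flick
DOUBLE_TAP_GAP = 0.4    # largest pause between the two taps of a double tap

def classify_on_release(touch_points, duration, displacement, gap_since_last_tap=None):
    """Decide the gesture type only when the touch is released
    ("OFF synchronization"), so that a switch is not activated before
    it is known whether the touch was really meant for it."""
    if touch_points == 2:
        return "pinch"  # pinch open/close is resolved from the distance change
    if displacement >= DRAG_MIN_PIXELS:
        return "flick" if duration < TAP_MAX_SECONDS else "drag"
    if duration <= TAP_MAX_SECONDS:
        if gap_since_last_tap is not None and gap_since_last_tap <= DOUBLE_TAP_GAP:
            return "double_tap"
        return "tap"
    return "long_press"
```

An ON-synchronization object, by contrast, would be acted on at the first touch event, without waiting for this release-time classification.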
[0006] Patent Literature 1 discloses a technique related to a touch-panel processing device that switches between a
scroll mode for scrolling a screen and a flick mode for performing an assigned process according to a touch operation. With this technique, a long press on an arbitrary position is performed to switch to the flick mode, and after a menu (an operation guide including multiple sections) is displayed, a touch position is shifted for menu selection, and switching to the scroll mode is performed.
[0007] Furthermore, Patent Literature 2 discloses a technique of performing predetermined processing operations by setting a main input mode and an auxiliary input mode: in the main input mode, a predetermined processing operation is performed according to the detection result of a position detecting unit for each touch operation of a touch switch, and in the auxiliary input mode, a predetermined processing operation is performed by treating a plurality of touch operations of the touch switch as a series of related operation inputs.
[0008] Further, Patent Literature 3 discloses a technique related to multiple touch operations, which are touch operations performed by an operator on device display areas on a monitor screen, of pinching a displayed target symbol of a plant device with two or more fingers, and spreading or narrowing touched parts or twisting and rotating these parts.
Citation List
Patent Literatures
[0009] Patent Literature 1: International Publication No. WO2012/060352
Patent Literature 2: Japanese Patent Application Laid-open No. 2010-204964
Patent Literature 3: Japanese Patent Application
Non Patent Literature
[0010] Non Patent Literature 1: "iOS Human Interface Guidelines", [online], December 17, 2012, Apple Japan GK, retrieved February 5, 2013, Internet (URL: https://developer.apple.com/jp/devcenter/ios/library/documentation/MobileHIG.pdf)
Summary
Technical Problem
[0011] For example, in a programmable display device for industrial use, objects including switches are arranged and displayed in a display screen of the programmable display device. By operating these objects, operations are performed or an instruction is issued for writing a value into a device of a control device, such as a PLC (Programmable Logic Controller), that is connected to the programmable display device. In a case where such a programmable display device is used for a manufacturing apparatus, it is necessary for a switch and the like to be operated at the moment of touch, that is, for an object to function in ON synchronization, in order to improve responsiveness as much as possible, because the response speed of the switch affects productivity. Further, such a programmable display device also requires a switch that switches the value written into a device of the control device depending on whether the switch is being pressed. Such a switch is referred to as a "momentary switch" and can be regarded as one of the ON-synchronization objects in the sense that it operates at the moment of touch.
[0012] In such a programmable display device, similarly to general computer devices, in some cases it is necessary to arrange many objects on a screen for simultaneously viewing the conditions of a device. Meanwhile, in such a programmable display device, a system that can correctly operate an object such as a switch by a touch operation is required more than in general computer devices. In order to satisfy these requirements simultaneously, the programmable display device is required to have a function of zooming in on a part to be operated (or, in some cases, information that is desired to be viewed) as needed while still operating objects in ON synchronization.
[0013] With the technique of gesture operations used in general tablet computers or smartphones disclosed in Non Patent Literature 1, an operation of a switch in ON synchronization and a gesture operation such as zoom or scroll cannot be performed at the same time, as described above. Therefore, in some applications that require operation of a switch in ON synchronization, enlarged/reduced display by pinch open/pinch close or scrolling by dragging/flicking is invalidated, or a gesture operation on the switch is invalidated by limiting the area in which gesture operations are applicable. Accordingly, enlargement of a display including a switch that operates in ON synchronization cannot be performed. This is because, when enlargement of the screen is performed by a gesture in an area excluding the switch, if the display area becomes filled with the switch as a result of the enlargement, subsequent gesture operations cannot be performed.
[0014] In Patent Literature 3, it is assumed that the target symbol (an object) is the target of multiple touch operations of pinching it with two or more fingers and spreading, narrowing, or twisting and rotating the touched parts, and that the target symbol itself can handle multiple touch operations. However, Patent Literature 3 does not disclose any method of enlarging/reducing or scrolling displayed content by multiple touch operations.
[0015] Meanwhile, in Patent Literature 1, a flick mode and a scroll mode are set, and it is assumed that a long press on the screen is made as the operation of switching modes in order to utilize the flick mode effectively. Patent Literature 1 also describes that the mode can be changed to the flick mode by a short press; however, this method has a high possibility of causing an incorrect operation of a switch that operates in ON synchronization. Therefore, this technique cannot be applied directly to programmable display devices that require ON synchronization.
[0016] Further, in Patent Literature 2, a main input mode and an auxiliary input mode are set, and switching between these modes is performed by pressing a shift button. However, Patent Literature 2 does not disclose any method of enlarging/reducing or scrolling a screen. Therefore, Patent Literature 2 does not contemplate a case where, for example, as a result of enlargement and scrolling of the screen, the shift button moves outside the display area and can no longer be operated.
[0017] Furthermore, in application software or basic software for personal computers, there is a certain type of software in which a left-click of a mouse operates objects while a right-click causes special menus (context menus) to be displayed, and mode switching is performed via the context menus. However, in touch operations, it is not possible to determine whether an input is a left-click or a right-click. Therefore, this technique cannot be applied to devices operated mainly by touch operations.
[0018] The present invention has been achieved in view of the above problems, and an object of the present invention is to provide a programmable display device that monitors and operates a control device and can be operated by touch, in which an operation on the control device can be performed as an ON-synchronization operation while an enlargement/reduction or scroll operation of the display screen can be performed by a screen gesture operation at an arbitrary position on the display screen, and to provide a screen-operation processing program therefor.
Solution to Problem
[0019] In order to achieve the above object, a programmable display device according to an aspect of the present invention is a programmable display device that monitors and operates a control device connected to the programmable display device via a communication line, including: a display unit; a coordinate input unit that detects at least one operation coordinate of an input indicator that is in contact with the display unit; a display processing unit that displays, in a display screen displayed in the display unit, a plurality of objects including a display object displaying only information or an operation object that is operable; an operation processing unit that extracts change of the input indicator from the operation coordinate of the input indicator obtained by the coordinate input unit; a control unit that performs a predetermined operation according to change of the input indicator; and a switching unit that switches between validation and invalidation of an operation by a
screen gesture, wherein the display screen includes a screen-gesture applicable area in which display content is capable of being changed and a screen-gesture non-applicable area in which display content is not capable of being changed, when an operation by the screen gesture is invalid, the switching unit validates an operation on the operation object in the screen-gesture applicable area and the screen-gesture non-applicable area, and, when an operation by the screen gesture is valid, the switching unit invalidates an operation on the operation object in the screen-gesture applicable area and validates the screen gesture, and when an operation by the screen gesture is valid, the display processing unit performs predetermined display in the screen-gesture applicable area.
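The claimed switching behaviour can be summarised in a small sketch: when operations by screen gestures are invalid, operation objects everywhere are operable; when they are valid, only operation objects in the non-applicable area remain operable. The class and method names below are illustrative assumptions, not terms from the claim.

```python
class SwitchingUnit:
    """Minimal sketch of the claimed switching unit."""

    def __init__(self):
        self.gesture_valid = False  # operation by screen gesture starts invalid

    def toggle(self):
        """Switch between validation and invalidation of screen gestures."""
        self.gesture_valid = not self.gesture_valid

    def object_operable(self, in_applicable_area):
        """An operation object is operable unless screen gestures are valid
        and the object lies inside the screen-gesture applicable area."""
        return not (self.gesture_valid and in_applicable_area)
```

Note that an object in the non-applicable area stays operable in both states, which is what keeps a mode switching switch placed there reachable at all times.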
Advantageous Effects of Invention
[0020] According to the present invention, because validation and invalidation of operations by screen gestures can be switched, when operations by screen gestures are invalidated, an operation on an operation object is performed in ON synchronization, and when operations by screen gestures are validated, an operation on an operation object arranged in a screen-gesture applicable area is invalidated. As a result, when display content in the screen-gesture applicable area is changed, even if an operation including touch on the screen is performed, an effect is obtained where an operation object included in the display content is not operated by mistake.
Brief Description of Drawings
[0021] FIG. 1 is a diagram illustrating an example of
objects used in a programmable display device.
FIG. 2 is a diagram illustrating an example of a
display screen of the programmable display device.
FIG. 3 is a diagram illustrating an example of a display area of the programmable display device according to an embodiment.
FIG. 4 is an explanatory diagram of two modes that are switched by the programmable display device according to the embodiment.
FIG. 5 is a block diagram schematically illustrating a configuration of the programmable display device according to the embodiment.
FIG. 6 is a flowchart illustrating an example of procedures of a mode switching process in the embodiment.
FIG. 7 is a diagram illustrating an example of mode switching and a gesture operation when a base screen and a window screen exist in a mixed manner.
FIG. 8 is a flowchart illustrating an example of process procedures when a gesture operation is performed in a screen gesture mode in the embodiment.
FIG. 9 is a diagram illustrating an example of a scroll process in the screen gesture mode.
FIG. 10 is a diagram illustrating an example of behaviors when a zoom operation is performed in the screen gesture mode.
FIG. 11 is a diagram illustrating an example of an enlargement/reduction process in the screen gesture mode.
FIG. 12 is a diagram illustrating a relation between coordinate positions before and after zoom and/or scroll is applied.
Description of Embodiments
[0022] Exemplary embodiments of a programmable display device and a screen-operation processing program therefor according to the present invention will be explained below
in detail with reference to the accompanying drawings. The present invention is not limited to the embodiments.
[0023] FIG. 1 is a diagram illustrating an example of objects used in the programmable display device. Objects used in a touch-panel programmable display device include display objects that only display information and operation objects that respond to operations, such as a touch switch.
[0024] The display objects include, for example, a lamp 501 that switches its display according to the device value of an external connection device, a trend graph 502 that collects device values on a regular basis and displays a line graph of the stored time-series information, and a numerical value display 503 that displays a device value in numerical form as it is.
[0025] The operation objects include, for example, a switch 511 that rewrites a device value upon a touch operation, a numerical value input 513 that normally displays a device value in numerical form and sets a numerical value in the device when input is made from a ten-key pad 512 for changing numerical values, and a slider control 514 that changes a value continuously as a touch position is moved while a "knob" arranged in a predetermined area is touched, and sets the value at the point when the touch is released in the device. A window frame for adjusting the position or size of a window screen can also be regarded as a kind of operation object.
[0026] FIG. 2 is a diagram illustrating an example of a display screen of the programmable display device. The programmable display device can display a base screen 610, which is a screen displayed on the entire display unit, and a window screen 620, which is a screen displayed on part or the entirety of the display unit so as to cover the base screen 610. A plurality of window screens 620 can be displayed on the display unit. At least one of a display object and an operation object is arranged as appropriate on the base screen 610 and the window screen 620. The arrangement of the display objects and the operation objects is defined by project data.
[0027] FIG. 3 is a diagram illustrating an example of a display area of the programmable display device according to the present embodiment. In the present embodiment, areas on a display screen are distinguished in such a way that part of a display area 700 of the programmable display device is set as a screen-gesture applicable area 701, where screen gestures by a user are valid, while the remaining area is set as a screen-gesture non-applicable area 702, where screen gestures by the user are invalid. That is, when an instruction to change display content is issued by a screen gesture, the display content is changed according to the instruction in the screen-gesture applicable area 701, while the display content is not changed in the screen-gesture non-applicable area 702.
[0028] In this example, the screen-gesture non-applicable area 702 is provided in a belt shape in the uppermost part of the display area 700. In the screen-gesture non-applicable area 702, a mode switching switch 710 is arranged as an operation object for switching between a normal operation mode and a screen gesture mode, which are described later. Each time the mode switching switch 710 is pressed, switching is made between the normal operation mode and the screen gesture mode.
[0029] The normal operation mode is a mode in which operations are performed on the basis of the setting in project data set in the programmable display device. The screen gesture mode is a mode for invalidating operations on operation objects arranged in the screen-gesture applicable area 701, while changing the display content in
the screen-gesture applicable area 701 on the basis of a predetermined operation performed in the screen-gesture applicable area 701. Here, changing the display content means, for example, enlarging or reducing the display content or changing (scrolling) the display position. That is, in the normal operation mode, operations on the operation objects displayed on the display screen are validated, and when an operation object is operated, it operates in ON synchronization. In contrast, in the screen gesture mode, because operations on the operation objects are invalidated, no operation object operates in ON synchronization even when an input indicator, such as a finger, comes into contact with it.
[0030] In the screen gesture mode, while operations on the operation objects in the screen-gesture applicable area 701 are invalidated, operations on the operation objects arranged in the screen-gesture non-applicable area 702 can be performed. Accordingly, even in the screen gesture mode, switching to the normal operation mode can be performed by operating the mode switching switch 710 arranged in the screen-gesture non-applicable area 702. Each screen (for example, each of the base screen 610 and the window screen 620) can be set as the screen-gesture applicable area 701 or the screen-gesture non-applicable area 702 in the project data.
[0031] FIG. 3 illustrates the screen-gesture applicable area 701 having a rectangular shape as an example; however, the shape is not limited to a rectangular shape and can be an ellipse or arbitrary polygon. As described above, it is desirable that the screen-gesture applicable area 701 can be set to each screen. The reason therefor is that it is generally preferable to distinguish whether a screen
gesture function is applied for each displayed screen. However, to simplify the setting, the screen-gesture applicable area 701 may instead be set for each set of project data.
[0032] FIG. 4 is an explanatory diagram of the two modes that are switched by the programmable display device according to the present embodiment, where FIG. 4(a) illustrates an example of a screen state in the normal operation mode and FIG. 4(b) illustrates an example of a screen state in the screen gesture mode. In the normal operation mode, as illustrated in FIG. 4(a), it is possible to operate operation objects (not illustrated) arranged in the screen-gesture applicable area 701 and the screen-gesture non-applicable area 702. Meanwhile, in the screen gesture mode, as illustrated in FIG. 4(b), the displaying manner is made different from that in the normal operation mode so as to allow a user to visually recognize that the display screen is in the screen gesture mode. In this example, in the screen gesture mode, the outer peripheral portion of the screen-gesture applicable area 701 is surrounded by a thick line 703. The thick line 703 may be colored in a visually distinctive color, such as red, or may be made to blink.
[0033] The thick line 703 is used as a line surrounding the screen-gesture applicable area 701 in order to affect the display content in the screen-gesture applicable area 701 as little as possible, even in the screen gesture mode. That is, using the thick line 703 for the outer peripheral portion of the screen-gesture applicable area 701 does not significantly reduce the area of the screen-gesture applicable area 701. Therefore, it is not necessary to zoom out the content displayed in the screen-gesture applicable area 701. However, depending on the screen content to be displayed,
for example, an icon indicating that the display screen is in the screen gesture mode can be displayed on the display screen in a superimposed manner, instead of using such a thick line surrounding the screen-gesture applicable area 701.
[0034] On the other hand, in the normal operation mode, in consideration of using the display area 700 effectively by displaying as little extra information as possible, FIG. 4(a) illustrates a case where no special display indicating that the display screen is in the normal operation mode is performed. However, such a special display for the normal operation mode may also be performed.
[0035] In this way, in the present embodiment, the screen-gesture applicable area 701 and the screen-gesture non-applicable area 702 are provided in the display screen, and the mode switching switch 710 for switching between the normal operation mode and the screen gesture mode is provided in the screen-gesture non-applicable area 702. In the normal operation mode, when an operation object is touched, the operation object operates in ON synchronization, whereas in the screen gesture mode, even when an operation object is touched, it does not operate in ON synchronization; instead, an operation instruction of a screen gesture is determined on the basis of the trajectory of the screen gesture and then acted upon. A programmable display device achieving such functions is described below.
[0036] FIG. 5 is a block diagram schematically illustrating a configuration of the programmable display device according to the present embodiment. A programmable display device 100 includes a display unit 101, a coordinate input unit 102, a communication interface unit
103 (denoted as "communication I/F" in FIG. 5), an external-storage interface unit 104 (denoted as "external-storage I/F" in FIG. 5), an internal storage unit 105, a file-system processing unit 106, a display processing unit 107, an operation processing unit 108, a communication processing unit 109, and a control unit 110.
[0037] The display unit 101 is constituted by, for example, a liquid crystal display or an organic EL (Electroluminescence) display.
[0038] The coordinate input unit 102 is a touch panel that is arranged such that it overlaps with the display unit 101, for example, and detects coordinates (touch coordinates) of a contact position with an input indicator, such as a finger. The coordinate input unit 102 can detect a plurality of touch coordinates simultaneously. Examples of a touch panel that can detect a plurality of touch coordinates simultaneously include a variety of products such as that of a resistance film type, an electrostatic capacity type, and an optical type, and any of these types can be used.
[0039] The communication interface unit 103 serves as an interface for communication with an external connection device 120, such as a control device, or with a personal computer 130 or the like.
[0040] The external-storage interface unit 104 serves as an interface for communication with a portable external storage medium 150, such as a memory card or a USB (Universal Serial Bus) memory.
[0041] The internal storage unit 105 is constituted by a nonvolatile storage medium, such as a NAND or NOR flash memory or a hard disk device.
[0042] In the internal storage unit 105 or the external storage medium 150, for example, project data 180 for
operating the programmable display device 100 is stored. The project data 180, for example, includes screen data to be displayed on the display unit 101. The screen data includes arrangement positions of operation objects or display objects. In the internal storage unit 105 or the external storage medium 150, a mode-status storage area in which mode status information indicating the current mode of a mode switching switch is stored is provided. For example, the mode status information stores therein a mode status for each display screen.
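Paragraph [0042] states that the mode status information holds a mode status for each display screen. A minimal sketch of such a per-screen mode-status store follows; the function names, screen identifiers, and mode strings are assumptions for illustration only.

```python
# Hypothetical per-screen mode-status store; screen ids and mode names are
# illustrative, not taken from the specification.
mode_status = {}

def get_mode(screen_id):
    # A screen that has never been switched starts in the normal operation mode.
    return mode_status.get(screen_id, "normal")

def set_mode(screen_id, mode):
    if mode not in ("normal", "gesture"):
        raise ValueError("unknown mode: " + mode)
    mode_status[screen_id] = mode
```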
[0043] The file-system processing unit 106 performs reading and writing of the project data 180, which is stored in the internal storage unit 105 or the external storage medium 150.
[0044] The display processing unit 107 causes the display unit 101 to display a predetermined screen on the basis of the project data 180. The display processing unit 107 combines display content on the display unit 101 while taking overlapping of the base screen 610 and the window screen 620 into consideration.
[0045] The operation processing unit 108 extracts change of an input indicator from the touch coordinates of the input indicator from the coordinate input unit 102. For example, in the normal operation mode, the operation processing unit 108 detects a pressing operation when the operation object is a button or detects a touch position and a release position of slider control. In the screen gesture mode, the operation processing unit 108 obtains, from the coordinate input unit 102, the number of touch points of an input indicator, touch coordinates, and the distance between two touch positions when the number of touch points is two, and then calculates change in the number of touch points, a reference point of touch
coordinates, a displacement of the touch coordinates from the reference point, a reference distance as the distance between two reference touch positions, or a change in distance from the reference distance.
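The quantities that paragraph [0045] attributes to the operation processing unit — a displacement from a reference point and a change of the two-point distance from a reference distance — can be sketched as follows. The function name and the data layout (lists of (x, y) tuples) are assumptions for illustration.

```python
import math

def touch_metrics(reference, current):
    """Given reference and current touch positions as lists of (x, y)
    tuples, return the displacement of the first touch from its
    reference point and, for two-point touches, the change of the
    inter-point distance from the reference distance (used to detect
    pinch open/close)."""
    rx, ry = reference[0]
    cx, cy = current[0]
    displacement = (cx - rx, cy - ry)  # drag/scroll vector
    distance_change = None
    if len(reference) == 2 and len(current) == 2:
        reference_distance = math.dist(reference[0], reference[1])
        current_distance = math.dist(current[0], current[1])
        distance_change = current_distance - reference_distance
    return displacement, distance_change
```

A positive distance change would correspond to pinch open (enlargement) and a negative one to pinch close (reduction).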
[0046] The communication processing unit 109 has a function of communicating with the external connection device 120. The communication processing unit 109 holds a communication protocol for each type of external connection device 120 and thereby accommodates differences in the communication method among external connection devices 120.
[0047] The control unit 110 reads the project data 180 via the file-system processing unit 106 and interprets the content of the project data 180 in order to perform processes with respect to the objects to be displayed on the display unit 101, or processes with respect to non-object functions that do not involve any object, such as a logging function of collecting data.
[0048] The control unit 110 reads a device value from the external connection device 120 connected via the communication interface unit 103 on the basis of the setting in the project data 180, and writes a device value into the external connection device 120. Further, the control unit 110 operates in response to the device value of the external connection device 120 obtained via the communication processing unit 109.
[0049] The control unit 110 identifies an operation by an input indicator on the basis of the output result from the operation processing unit 108 and performs a process corresponding to the operation. For example, when an operation object indicating writing of a device value into the external connection device 120 is pressed in the normal operation mode, the control unit 110 instructs the display processing unit 107 to change the display of the operation object to the pressed state and instructs the communication processing unit 109 to write the device value into the corresponding external connection device 120. Meanwhile, when an input indicator issues an instruction to change display content in the screen gesture mode, the control unit 110 calculates the change instruction on the basis of the output result from the operation processing unit 108 and issues the change instruction to the display processing unit 107.
[0050] The project data 180 is created or edited by drawing software 131, which is one of applications in the personal computer 130. The project data 180 can be copied to the external storage medium 150 by the personal computer 130, and the project data 180 can be transferred via the communication interface unit 103 to the internal storage unit 105 of the programmable display device 100 or to the external storage medium 150 incorporated in the external-storage interface unit 104.
[0051] The programmable display device 100 having such a configuration is used as a display and input device that is a replacement product for a display device, such as an electrical or mechanical lamp or meter, or an operation inputting device, such as a switch and a volume control, for example, in a manufacturing apparatus. [0052] In order to cause a control target device to perform a desired operation in real time, a control device such as a PLC generally performs a series of processes including receiving predetermined data from the control target device, performing predetermined calculations on the basis of the received data, and transmitting the calculation result to the control target device, at a predetermined cycle. In the present embodiment, an
instruction or an operation command from the programmable display device 100 to the control device in the normal operation mode is transmitted in real time, that is, in synchronization with the ON operation.
[0053] Next, processing in the programmable display device having such a configuration is described. FIG. 6 is a flowchart illustrating an example of procedures of the mode switching process in the present embodiment. [0054] First, the coordinate input unit 102 detects touch coordinates of an input indicator of a user at a predetermined time interval, and outputs the detection result to the operation processing unit 108. The operation processing unit 108 obtains the input state of the user from the touch coordinates of the input indicator and the change in the touch coordinates over time. The control unit 110 determines whether a mode switching instruction has been issued on the basis of the input state (Step S11). For example, as described above, in a case where the mode switching instruction is issued by pressing the mode switching switch 710 arranged in the screen-gesture non-applicable area 702, when a tapping operation on the mode switching switch 710 in the screen-gesture non-applicable area 702 is performed, it is determined that a mode switching instruction has been issued, and when no tapping operation on the mode switching switch 710 is performed, it is determined that no mode switching instruction has been issued.
[0055] When there is no mode switching instruction (NO at Step S11), the current status is maintained (Step S12), and the process ends. When there is a mode switching instruction (YES at Step S11), the control unit 110 obtains the current mode from the mode-status-information storage area provided in the internal storage unit 105 or in the
external storage medium 150 (Step S13). When the current mode is the normal operation mode (a case of the normal operation mode at Step S13), the control unit 110 switches the mode to the screen gesture mode (Step S14), and the process ends. For example, the mode is switched from the normal operation mode illustrated in FIG. 4(a) to the screen gesture mode illustrated in FIG. 4(b). [0056] Meanwhile, when the current mode is the screen gesture mode (a case of the screen gesture mode at Step S13), the control unit 110 switches the mode to the normal operation mode (Step S15), and the process ends. In this case, for example, the mode is switched from the screen gesture mode illustrated in FIG. 4(b) to the normal operation mode illustrated in FIG. 4(a). [0057] In the above example, a case where switching between the normal operation mode and the screen gesture mode is performed each time the mode switching switch 710 is pressed has been described. Similarly to the arrangement of other objects, the arrangement of the mode switching switch 710 is included in the project data 180 and is set by the drawing software 131.
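The toggle of Steps S11 to S15 can be sketched as follows; the mode constants and the dict standing in for the mode-status-information storage area are illustrative assumptions:

```python
# Sketch of the mode switching procedure of FIG. 6 (Steps S11 to S15).
# NORMAL/GESTURE and the mode_storage dict are illustrative names, not
# identifiers from the specification.
NORMAL, GESTURE = "normal_operation", "screen_gesture"

def handle_tap(mode_storage, switch_tapped):
    """mode_storage: dict standing in for the mode-status-information
    storage area; switch_tapped: True if the mode switching switch 710
    was tapped (the check at Step S11)."""
    if not switch_tapped:
        # NO at Step S11: maintain the current status (Step S12)
        return mode_storage["mode"]
    if mode_storage["mode"] == NORMAL:
        mode_storage["mode"] = GESTURE   # Step S14
    else:
        mode_storage["mode"] = NORMAL    # Step S15
    return mode_storage["mode"]
```

Each tap of the switch flips the stored mode, so repeated taps alternate between the two modes as described in paragraph [0057].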
[0058] As another method for switching the mode, it is also possible to define a long-press operation, which is an operation of continuously touching a position other than operation objects in the display area 700 for a predetermined time, as a screen gesture. Specifically, in the normal operation mode, switching to the screen gesture mode is performed by a long-press operation on an arbitrary position other than operation objects in the display area 700 for a predetermined time, while in the screen gesture mode, switching to the normal operation mode is performed by a long-press operation on an arbitrary area in the screen-gesture applicable area 701 or an area other than the operation objects in the screen-gesture non-applicable area 702.
[0059] As still another method, it is also possible to define a double-tapping operation, which is an operation of consecutively touching a position other than the operation objects in the display area 700 twice within a predetermined time, as a screen gesture for switching the mode. Specifically, in the normal operation mode, switching to the screen gesture mode is performed by a double-tapping operation on an arbitrary position other than the operation objects in the display area 700, while in the screen gesture mode, switching to the normal operation mode is performed by a double-tapping operation on an arbitrary area in the screen-gesture applicable area 701 or an area other than the operation objects in the screen-gesture non-applicable area 702.
[0060] Furthermore, in the screen gesture mode, when a no-operation state continues for a predetermined time, it is also possible to switch from the screen gesture mode to the normal operation mode automatically (hereinafter, "screen-gesture-mode automatic releasing function"). The term "no-operation state" refers to a state where no touch is performed after detection of release of the last touch operation. When the screen-gesture-mode automatic releasing function is used, it suffices that the time to automatic release is also set in the project data 180. [0061] Methods of using a switch, a long press, and double-tapping have been described above as examples of means for switching the mode. However, any method, including a method using a simultaneous touch on a plurality of points or a slide operation of moving the touched position in a predetermined direction, can also be used, as long as the method can be distinguished from a simple touch (tap) operation. Note that, when there is a switch that operates on an ON operation (touch-down), it is desirable to provide an explicit switch for switching the mode (the mode switching switch 710) from the viewpoint of preventing incorrect operations. [0062] The method to be used as means for switching the mode can be freely selected by a creator (a screen designer) of the project data 180 of the programmable display device 100 with the drawing software 131, depending on the device or system to which the programmable display device 100 is applied, and can be set in the project data 180.
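The automatic releasing function of paragraph [0060] amounts to a simple timeout check; a minimal sketch, with illustrative names for the mode strings and the timeout parameter:

```python
# Sketch of the screen-gesture-mode automatic releasing function:
# when no touch has occurred for release_timeout_s seconds after the
# release of the last touch operation, revert to the normal operation
# mode. All identifiers here are illustrative assumptions.

def auto_release(mode, last_release_time, now, release_timeout_s):
    """Return the mode after applying the automatic release check."""
    no_op_duration = now - last_release_time  # length of the no-operation state
    if mode == "screen_gesture" and no_op_duration >= release_timeout_s:
        return "normal_operation"
    return mode
```

Per paragraph [0060], the timeout value itself would come from the project data 180 rather than being hard-coded.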
[0063] It is also possible to switch the display content in the screen-gesture non-applicable area 702 to other display content prepared in advance only in the screen gesture mode. [0064] Next, processes performed when the base screen 610 and the window screen 620 exist in a mixed manner are described. FIG. 7 is a diagram illustrating an example of mode switching and a gesture operation when a base screen and a window screen exist in a mixed manner. As illustrated in FIG. 7(a), the window screen 620 is displayed on the base screen 610. In this case, it is assumed that the target to which the screen gesture function is applied is the base screen 610 and the display screen is in the normal operation mode.
[0065] When a user presses the mode switching switch 710 in the screen-gesture non-applicable area 702 in the base screen 610 in this state, the mode shifts to the screen gesture mode as illustrated in FIG. 7(b). At this point, only the base screen 610 shifts to the screen gesture mode, and the screen gesture mode is not applied to the window screen 620 superimposed on the base screen 610. That is, in the base screen 610, enlargement/reduction and scroll of the display content in the screen-gesture applicable area 701 are performed, but display of the window screen 620 is not influenced. Further, in the screen gesture mode, the window screen 620 to which the screen gesture mode is not applied is hidden.
[0066] Subsequently, for example, when a predetermined operation for enlarging display content such as pinch-out is performed at a predetermined position in the screen-gesture applicable area 701 in the screen gesture mode, the base screen 610 is zoomed in as illustrated in FIG. 7(c). When a predetermined operation for moving a display area such as a dragging operation is performed in this state, the display area moves as illustrated in FIG. 7(d). Even when these operations are performed on an operation object, the operation object does not operate because these operations are performed in the screen gesture mode. In the example in FIG. 7, even when an operation is performed on a button-shaped operation object 720, the button-shaped operation object 720 is not pressed.
[0067] Subsequently, when the mode switching switch 710 in the screen-gesture non-applicable area 702 in the base screen 610 is pressed, the mode shifts to the normal operation mode as illustrated in FIG. 7(e). At this point, the window screen 620 is displayed again in a superimposed manner on the base screen 610 that has been enlarged and scrolled.
[0068] When a plurality of window screens are displayed and a user wishes to cause one of the window screens to enter the screen gesture mode, mode switching is performed on the one window screen to shift it to the screen gesture mode. At this point, window screens other than the window screen to which the screen gesture function is applied are hidden. While display of the base screen may be continued,
in order to inform a user that operations to the base screen are invalidated, for example, the base screen may be filled with a predetermined pattern so as to be hidden in appearance, or the saturation of the base screen may be lowered.
[0069] The reasons why the window screen 620 other than the screen to which the screen gesture function is applied is temporarily hidden in the screen gesture mode as described above are explained. One reason is that, if the display of the window screen 620 other than the screen to which the screen gesture function is applied is maintained, it is difficult in some cases to perform an operation by a screen gesture when the window screen 620 is superimposed on the screen-gesture applicable area. Another reason is the possibility of an incorrect operation on the window screen 620 to which the screen gesture function is not to be applied. On the other hand, non-display of the base screen 610, or display equivalent thereto, provides an effect of clearly showing the target of the screen gesture function. When the base screen 610 remains displayed and operable while a screen to which the screen gesture function is applied is displayed, it is possible to use the base screen 610 for the same purpose as the screen-gesture non-applicable area 702 by arranging the mode switching switch 710 in the base screen 610.
[0070] Next, process procedures for determination of a gesture operation in the screen gesture mode and behaviors when a gesture operation is performed are described. FIG. 8 is a flowchart illustrating an example of process procedures when a gesture operation is performed in a screen gesture mode in the present embodiment, FIG. 9 is a diagram illustrating an example of a scroll process in the
screen gesture mode, FIG. 10 is a diagram illustrating an example of behaviors when a zoom operation is performed in the screen gesture mode, and FIG. 11 is a diagram illustrating an example of an enlargement/reduction process in the screen gesture mode. In this case, it is assumed that the display screen is in the screen gesture mode. [0071] The control unit 110 periodically causes the operation processing unit 108 to check the number of touch points and touch coordinates that are detected by the touch panel (the coordinate input unit 102), and the control unit 110 processes the results of the check.
[0072] First, the operation processing unit 108 checks the current number of touch points (Step S31). When the number of touch points is one, whether the number of touch points has changed from the preceding state is determined (Step S32). When the number of touch points has changed (YES at Step S32), that is, when no point was touched or two or more points were touched in the preceding state and one point is touched this time, the operation processing unit 108 holds the current touch coordinates as a reference point of touch coordinates (Step S33). The process then returns to Step S31.
[0073] Meanwhile, when the number of touch points has not changed from the preceding state (NO at Step S32), the operation processing unit 108 calculates the displacement between the current touch coordinates and the reference point of touch coordinates (Step S34). Subsequently, the control unit 110 calculates a scroll amount corresponding to the displacement from the reference point of touch coordinates that is obtained from the operation processing unit 108 (Step S35). Change in the scroll amount here is made to match physical change in the touch coordinates on the display unit 101. FIG. 9(a) illustrates that the
content displayed on the display unit 101 after zooming is applied is the display area 801, which is a part of the entire display content 800. It is assumed that the display area 801 has been scrolled in the direction of the arrow 802.
[0074] Next, the control unit 110 checks whether the display area includes an area outside the display content 800 (protrudes from the display content 800) as a result of the scroll (Step S36). When the display area includes an area outside the display content 800 (NO at Step S36), the display position is corrected so as not to cause any protrusion to occur (Step S37). When the display area 801 is scrolled along the arrow 802 as it is as illustrated in FIG. 9(b), the display content 800 and an area (a protruding area) 811 outside the display content 800 are included in a display area 810 after the scroll. Therefore, when the display position is corrected so as not to include the protruding area 811 in the display area 810, the corrected area becomes a corrected display area 812. [0075] Thereafter, or when an area outside the display content 800 is not included in the display area at Step S36 (YES at Step S36, a case where the display area 801 is included in the display content 800 as in FIG. 9(a)), display on the display unit 101 is updated by applying the scroll in a state with no protrusion (Step S38). The process then returns to Step S31.
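The protrusion correction of Steps S36 and S37 is a clamp of the display area into the bounds of the display content; a sketch with rectangles as (x, y, width, height) tuples, names illustrative:

```python
# Sketch of the protrusion correction of Steps S36 and S37: after a
# scroll, the display area (e.g. 810 in FIG. 9(b)) is shifted back so
# that it contains no area outside the display content 800, yielding
# the corrected display area (812). Identifiers are illustrative.

def clamp_display_area(area, content):
    """area, content: rectangles as (x, y, width, height) tuples.
    Assumes the area is no larger than the content."""
    ax, ay, aw, ah = area
    cx, cy, cw, ch = content
    ax = min(max(ax, cx), cx + cw - aw)   # correct horizontal protrusion
    ay = min(max(ay, cy), cy + ch - ah)   # correct vertical protrusion
    return (ax, ay, aw, ah)
```

An area already inside the content passes through unchanged (the YES branch at Step S36), while a protruding area is shifted by just enough to eliminate the protruding part 811.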
[0076] When the current number of touch points is two at Step S31, the operation processing unit 108 checks whether the number of touch points has changed from the preceding state (Step S39). When the number of touch points has changed from the preceding state (YES at Step S39), that is, when the number of touch points at the preceding state is one or less or three or more, the operation processing unit
108 holds the distance between the current two touch positions as a reference distance (Step S40). The process then returns to Step S31.
[0077] Meanwhile, when the number of touch points has not changed from the preceding state (NO at Step S39), the operation processing unit 108 calculates the amount of change between the reference distance obtained at Step S40 and the distance between the current two touch positions (Step S41). Subsequently, the control unit 110 calculates a zoom amount (an amount of enlargement or reduction of display content) on the basis of the amount of change in the distance between the two touch positions obtained from the operation processing unit 108 (Step S42). In this case, when the reference distance is D0, the distance between the current two points is D1, the zoom amount before the start of a zoom operation is Z0, a minimum zoom amount is Zmin, a maximum zoom amount is Zmax, and k is an appropriate coefficient, a new zoom amount Z is calculated by the following expression (1). Note that Zmin ≤ Z ≤ Zmax.
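Expression (1) itself does not survive in this text, so the following sketch assumes a plausible linear form built from the quantities the paragraph defines (D0, D1, Z0, k, Zmin, Zmax); it is an assumption, not the patent's actual formula:

```python
# Assumed form of expression (1): the zoom amount grows linearly with
# the change in the two-touch distance and is clamped to the range
# [Zmin, Zmax]. This is a sketch, not the formula from the patent.

def zoom_amount(z0, d0, d1, k, zmin, zmax):
    """z0: zoom before the zoom operation started, d0: reference
    distance, d1: current distance between the two touch points,
    k: coefficient, zmin/zmax: zoom limits."""
    z = z0 + k * (d1 - d0)            # assumed linear update
    return max(zmin, min(z, zmax))    # enforce Zmin <= Z <= Zmax
```

Pinching out (d1 > d0) increases the zoom amount and pinching in decreases it, with the limits preventing the display content from being enlarged or reduced beyond the configured range.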