
A Method For Controlling Apparatus And An Apparatus Thereof

Abstract: The present disclosure provides an apparatus (100) and a method (400) for controlling an operation of the apparatus (100). The apparatus (100) comprises a processor (310) and a memory (320). The memory (320) is communicatively coupled to the processor (310) and stores processor instructions which, on execution, cause the processor (310) to receive sensed input from a plurality of sensors (210). The sensed input corresponds to one or more patterns associated with a grid (110), and the plurality of sensors (210) is disposed in the grid (110). The method (400) comprises three steps: a first step (410) of receiving inputs sensed by the plurality of sensors (210); a second step (420) of identifying pre-defined actions associated with the inputs corresponding to the one or more patterns; and a third step (430) of executing, by the processor (310), the pre-defined actions associated with the sensed inputs. To be published with Fig. 2


Patent Information

Application #
Filing Date
25 January 2024
Publication Number
31/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

TVS Motor Company Limited
Jayalakshmi Estate, No 29 (Old No 8), Haddows Road
TVS MOTOR COMPANY LTD
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006

Inventors

1. RAGHAVENDRA PRASAD
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006
2. HARENI ESWARI SURENDRAN NAGHARAJAN
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006

Specification

Description:A METHOD FOR CONTROLLING APPARATUS AND AN APPARATUS THEREOF
TECHNICAL FIELD
[0001] The present subject matter generally relates to an apparatus and a method for controlling an operation of an apparatus thereof.
BACKGROUND
[0002] A conventional apparatus is configured with a plurality of buttons which require a user to apply pressure on the buttons to perform a certain function. Providing such pressure-based inputs causes user fatigue and may distract the user while riding a vehicle. Buttons causing user fatigue lead to ergonomic problems and severely impact the user experience. The buttons further cause safety risks, as in hazardous situations fumbling with buttons can delay a user's ability to take necessary action. Additionally, small buttons may be difficult for users with dexterity issues to press, and buttons without clear labels can be confusing for all users. There are incidents where buttons introduce lag in operation, which also causes delays, safety risks, frustration amongst users and dull performance. Buttons are also easy to break or tear.
[0003] Buttons often look bulky and fail to provide an aesthetic appeal, as users feel overwhelmed and find it difficult to press the right button when a lot of buttons are provided on the apparatus. A plurality of buttons can cause confusion, as a user may press a wrong button unintentionally, which consequently triggers a wrong action. Buttons on an apparatus, where the apparatus can be a wearable device such as a helmet, have various other drawbacks such as accidental activation, limited access, discomfort, distraction, complexity, customization limitations, and aesthetics. For example, buttons exposed to external elements like wind, or bumping against objects, can be accidentally pressed, potentially triggering unintended functions like visor
opening or communication toggles. This can be especially
dangerous in high-speed situations. Reaching and actuating buttons with precision can be challenging. This can be particularly problematic for users with limited dexterity or in cold weather when wearing bulky gloves. Buttons are mechanical components susceptible to wear and tear, especially in harsh 5 environments. Dust, moisture, and even sweat can compromise their functionality, leading to stuck buttons or delayed responses. Buttons can create pressure points on the head, leading to discomfort or even headaches during extended wear. Focusing on finding and actuating buttons can divert attention from the road or surroundings, potentially impacting riding safety. A plethora 10 of buttons can overwhelm users, making helmet operation confusing and time-consuming. This can be detrimental in emergencies or time-sensitive situations. Bulky or poorly designed buttons can detract from the overall aesthetic of the helmet. [0004] Thus, there arises a need for an apparatus and a method for
15 controlling an operation of the apparatus that overcomes the other disadvantages as mentioned previously.
SUMMARY
[0005] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
[0006] According to embodiments illustrated herein, the present disclosure provides an apparatus that comprises a processor and a memory. The memory is communicatively coupled to the processor. The memory stores processor instructions, which, on execution, cause the processor to receive sensed input from a plurality of sensors. Herein, the sensed input corresponds to one or more patterns associated with a grid. The plurality of sensors is disposed in the grid. The grid is defined based on a location of disposition of the plurality of sensors.
The processor identifies actions associated with each of the sensed inputs and executes the identified actions. In an embodiment, the grid is located on at least one side of the apparatus.
[0007] According to embodiments illustrated herein, the present disclosure provides a method for controlling an operation of an apparatus. The method comprises various steps. A first step is receiving inputs sensed by a plurality of sensors, wherein the inputs correspond to one or more patterns. A second step is identifying, by a processor, pre-defined actions associated with the inputs corresponding to the one or more patterns. A third step is executing, by the processor, the pre-defined actions associated with the sensed inputs. Herein, the plurality of sensors is disposed in the grid. The sensed input corresponds to the one or more patterns associated with the grid.
BRIEF DESCRIPTION OF DRAWINGS
[0008] The present invention will become more fully understood from the detailed description given herein below and the accompanying drawings which are given by way of illustration only, and thus are not limitative of the present invention.
[0009] The detailed description is described with reference to the accompanying figures, which relate to an apparatus, which may be a wearable device such as a helmet, being one embodiment of the present subject matter. However, the present subject matter is not limited to the depicted embodiment(s). In the figures, the same or similar numbers are used throughout to reference features and components.
[00010] Fig. 1 illustrates a perspective view of an apparatus, herein a helmet, in which a grid is disposed, in accordance with an embodiment of the present subject matter.
[00011] Fig. 2 illustrates a grid having a plurality of sensors, in accordance with an embodiment of the present subject matter.
[00012] Fig. 3 illustrates a block diagram depicting components of the apparatus, in accordance with an embodiment of the present subject matter.
[00013] Fig. 4 illustrates a flowchart of a method for controlling the operation of the apparatus, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[00014] The present disclosure may be best understood with reference to the detailed figures and description set forth herein. Various embodiments are discussed below with reference to the figures. However, those skilled in the art will readily appreciate that the detailed descriptions given herein with respect to the figures are simply for explanatory purposes as the methods and systems may extend beyond the described embodiments. For example, the teachings presented and the needs of a particular application may yield multiple alternative and suitable approaches to implement the functionality of any detail described herein. Therefore, any approach may extend beyond the particular implementation choices in the following embodiments described and shown.
[00015] References to “one embodiment,” “at least one embodiment,” “an embodiment,” “one example,” “an example,” “for example,” and so on indicate that the embodiment(s) or example(s) may include a particular feature, structure, characteristic, property, element, or limitation but that not every embodiment or example necessarily includes that particular feature, structure, characteristic, property, element, or limitation. Further, repeated use of the phrase “in an embodiment” does not necessarily refer to the same embodiment.
[00016] The present invention now will be described more fully hereinafter with different embodiments. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather those embodiments are provided so that this disclosure
will be thorough and complete, and fully convey the scope of the invention to those skilled in the art.
[00017] The present subject matter is further described with reference to accompanying figures. It should be noted that the description and figures merely illustrate principles of the present subject matter. Various arrangements may be devised that, although not explicitly described or shown herein, encompass the principles of the present subject matter. Moreover, all statements herein reciting principles, aspects, and examples of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[00018] Various features and embodiments of the present subject matter will be discernible from the following further description thereof, set out hereunder. It is contemplated that the concepts of the present subject matter may be applied to any kind of vehicle within the spirit and scope of this subject matter. The detailed explanation of the constitution of parts other than the present subject matter which constitutes an essential part has been omitted at suitable places.
[00019] It is an object of the present invention to provide an apparatus having a plurality of sensors which provides an improved user experience by eliminating user fatigue. It is an object of the present invention to solve ergonomic problems by providing easy operation of the plurality of sensors by the user. It is also an object of the present invention to increase safety of the users, as easy operation of the apparatus helps the rider to focus and avoid distraction while riding a vehicle. It is another object of the present invention to provide prompt responses of actions once the user has provided required inputs to the plurality of sensors. This avoids lag between input and output. It is also an object of the present invention to provide an easy way to operate the apparatus by eliminating buttons and providing a plurality of sensors to perform certain functions. It is also an object of the present invention to provide flexibility to the users, as the apparatus having a grid made up of the plurality of sensors offers more freedom as compared
to physical buttons. This further provides a seamless user experience while interacting with the apparatus.
[00020] It is also an object of the present invention to eliminate the need for bulky buttons or any visible protrusions which look clumsy and may confuse the user. It is an additional object of the present invention to make an apparatus having aesthetic appeal and easy to use by the user. It is also an object of the present invention to provide a wider range of inputs which was difficult to provide with manual buttons. The present invention also aims to differentiate between various types of interaction, which can be tap, swipe, hold, or multi-touch gestures, and thereby aims to provide possibilities for richer user interactions and more intuitive operation. It is another object of the present invention to provide enhanced durability and reliability. Unlike physical buttons, the sensors have no moving parts, making them less susceptible to wear and tear, breakage, or jamming. This is especially advantageous in harsh environments or for apparatus used intensively. It is also an object of the present invention to provide easy cleaning and maintenance, as the smooth surface of the sensors makes them easier to clean and maintain compared to buttons with crevices or intricate mechanisms. This is beneficial when the apparatus is used in hygienic environments or exposed to dirt and debris.
[00021] The present subject matter, along with all the accompanying embodiments and their other advantages, will be described in greater detail in conjunction with the figures in the following paragraphs.
[00022] The present subject matter is further described with reference to accompanying figures. It should be noted that the description and figures merely illustrate principles of the present subject matter. Various arrangements may be devised that, although not explicitly described or shown herein, encompass the principles of the present subject matter. Moreover, all statements herein reciting principles, aspects, and examples of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[00023] The present subject matter may be implemented in any apparatus. The apparatus includes, but is not limited to, wearable devices such as helmets. However, for the purpose of explanation and by no limitation, the present invention, and corresponding additional advantages and features, are described through the following embodiments depicting an apparatus such as a helmet.
[00024] Fig. 1 illustrates a perspective view of an apparatus (100), herein a helmet, in which a grid (110) is disposed, in accordance with an embodiment of the present subject matter.
[00025] In an embodiment, the apparatus includes, but is not limited to, wearable devices such as a helmet as depicted in Fig. 1. However, the apparatus can also mean an instrument cluster, a wearable device such as a smart watch or smart helmet, a handheld device, a visor and the like. Fig. 1, for illustration purposes, depicts a helmet having the grid (110) located on at least one side of the helmet. In a preferred embodiment, the grid (110) is located on a left side of the helmet. The left side is easy to access while riding a vehicle.
[00026] Fig. 2 illustrates a grid (110) having a plurality of sensors (210), in accordance with an embodiment of the present subject matter. In an embodiment, the grid (110) comprises a first portion (210F), a second portion (210S) and a third portion (210T). The second portion (210S) further comprises an upper zone (210SU), an intermediate zone (210SI), and a lower zone (210SL). The first portion (210F) is provided with a first sensor D, the second portion (210S) is provided with three sensors placed in three different zones, and the third portion (210T) has a fifth sensor B. The upper zone (210SU), the intermediate zone (210SI) and the lower zone (210SL) are provided with a second sensor A, a third sensor E and a fourth sensor C respectively. Herein, the first portion (210F) is located at one edge of the grid (110) and the third portion (210T) is located at another edge of the grid (110) opposite to the first portion (210F). The second portion (210S) is located in between the first portion (210F) and the third portion (210T).
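By way of a non-limiting illustration, the grid (110) arrangement described above can be sketched as a simple data structure. This is an editorial aid only, not part of the specification; the key names and the helper `sensors_in_grid` are assumptions.

```python
# Illustrative sketch of the grid (110) of Fig. 2: portions and zones
# mapped to their sensor labels (D, A, E, C, B as described above).
GRID_110 = {
    "first_portion_210F": ["D"],          # one edge of the grid
    "second_portion_210S": {              # three zones in the middle portion
        "upper_zone_210SU": "A",
        "intermediate_zone_210SI": "E",
        "lower_zone_210SL": "C",
    },
    "third_portion_210T": ["B"],          # opposite edge of the grid
}

def sensors_in_grid(grid):
    """Flatten the grid description into a sorted list of sensor labels."""
    labels = []
    for value in grid.values():
        if isinstance(value, dict):
            labels.extend(value.values())
        else:
            labels.extend(value)
    return sorted(labels)
```

Flattening the structure recovers the five sensors of the grid.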
[00027] In an embodiment, the grid (110) comprises a plurality of regions corresponding to each of the plurality of sensors (210), wherein each of the plurality of regions is defined using a pre-defined periphery around each of the plurality of sensors (210). For example, the first sensor D is located at a specific location within the first portion (210F). However, the specific location is to be understood as the pre-defined periphery surrounding the first sensor D. Thus, the first sensor D will receive inputs from the user even if the user has interacted within the pre-defined periphery surrounding the corresponding sensor.
[00028] In an embodiment, the pre-defined periphery comprises at least one of a circular region, a rectangular region, a quadrilateral region, or a polygonal region. For example, the pre-defined periphery can be of various shapes which include, but are not limited to, a circular region, a rectangular region, a polygonal region, a quadrilateral region, an elliptical region and the like. The pre-defined periphery offers an additional area which is susceptible to receiving inputs from the user.
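One possible realization of the circular pre-defined periphery is a hit test around each sensor's centre, so that a touch anywhere inside the periphery is attributed to that sensor. The sketch below is a hedged illustration; the coordinates and radius are invented for the example and do not come from the specification.

```python
import math

# Hypothetical sensor centres on the grid surface (illustrative coordinates
# only) and a circular pre-defined periphery of radius RADIUS around each.
SENSOR_CENTRES = {"D": (0.0, 0.0), "A": (2.0, 1.0), "E": (2.0, 0.0),
                  "C": (2.0, -1.0), "B": (4.0, 0.0)}
RADIUS = 0.6  # illustrative periphery radius

def sensor_for_touch(x, y, centres=SENSOR_CENTRES, r=RADIUS):
    """Return the sensor whose circular periphery contains the touch, else None."""
    for label, (cx, cy) in centres.items():
        if math.hypot(x - cx, y - cy) <= r:
            return label
    return None
```

A touch slightly off the sensor centre is still attributed to that sensor, matching the "additional area" behaviour described above.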
[00029] In an embodiment, the sensed input is defined by the user, by an interaction from at least one of the plurality of regions to at least a remaining one of the plurality of regions. For example, a user can provide various inputs to the apparatus to perform certain functions. Providing inputs to the apparatus is the interaction which the user performs with the apparatus to control the apparatus for operating different functions.
[00030] In an embodiment, the interaction comprises at least one of a touch input, a swipe input, a gesture-based input, a pressure-based input, or a combination thereof. The user can interact with the apparatus in various ways which include, but are not limited to, a swipe input, a gesture-based input, a pressure-based input, a single tap or double tap input, or a combination of these as mentioned.
[00031] In an embodiment, the grid (110) is configured to have a plurality of interactions. The plurality of interactions comprises around ten interactions. Herein, a first interaction comprises interacting from the first sensor D to the fifth sensor B via one of the second sensor A, or the third sensor E or the fourth
sensor C or a combination thereof. For example, swiping from the first sensor D to the fifth sensor B via any of the second sensor A, the third sensor E, or the fourth sensor C to accept an incoming call.
[00032] In an embodiment, a second interaction comprises interacting from the fifth sensor B to the first sensor D via one of the second sensor A, or the third sensor E or the fourth sensor C or a combination thereof. For example, swiping from the fifth sensor B to the first sensor D via any of the second sensor A, the third sensor E, or the fourth sensor C to reject the incoming call.
[00033] In an embodiment, a third interaction comprises interacting from the third sensor E to any of the first sensor D, the second sensor A, the fourth sensor C, the fifth sensor B or a combination thereof. In another embodiment, the third interaction comprises interacting from the second sensor A to any of the first sensor D, the third sensor E, the fifth sensor B or a combination thereof. In an embodiment, the third interaction comprises interacting from the fourth sensor C to any of the first sensor D, the third sensor E, the fifth sensor B or a combination thereof.
[00034] In an embodiment, a fourth interaction comprises interacting from the second sensor A to the fourth sensor C via one of the first sensor D, or the third sensor E or the fifth sensor B or a combination thereof. For example, swiping from the second sensor A to the fourth sensor C via the third sensor E to merge multiple calls.
[00035] In another embodiment, a fifth interaction comprises interacting from the fourth sensor C to the second sensor A via one of the first sensor D, or the third sensor E or the fifth sensor B or a combination thereof. In a working example, swiping up from the fourth sensor C to the second sensor A via the third sensor E to send a template text while rejecting the incoming call. The template text can be customized as per user requirements, such as “I am busy, will call you back”.
[00036] In an embodiment, a sixth interaction comprises interacting with any of the plurality of the sensors (210) individually. For example, tapping the fifth sensor B to play the succeeding music option, or tapping the first sensor D to play the preceding music option. Another working example is tapping the third sensor E to play or pause the ongoing music option.
[00037] In another embodiment, a seventh interaction comprises interacting from the second sensor A to the first sensor D via the fifth sensor B and the fourth sensor C, in a clockwise direction in a predefined motion. In a working example, swiping from the second sensor A to the first sensor D via the fifth sensor B and the fourth sensor C in a clockwise direction in a circular motion to increase audio intensity. Herein the predefined motion is the circular motion or arched motion. In another working example, swiping from the second sensor A to the first sensor D via the fifth sensor B and the fourth sensor C in a clockwise direction in a linear motion to enable recording. Herein, the predefined motion is the linear motion.
[00038] In an embodiment, an eighth interaction comprises interacting from the second sensor A to the fifth sensor B via the first sensor D and the fourth sensor C, in an anticlockwise direction in the predefined motion. In a working example, swiping from the second sensor A to the fifth sensor B via the first sensor D and the fourth sensor C in an anticlockwise direction in a circular motion to decrease audio intensity. Herein the predefined motion is the circular motion or arched motion. In another working example, swiping from the second sensor A to the fifth sensor B via the first sensor D and the fourth sensor C in an anticlockwise direction in a linear motion to disable recording. Herein, the predefined motion is the linear motion.
[00039] In an embodiment, a ninth interaction comprises interacting from the first portion (210F) to the third portion (210T) via the second portion (210S). For example, swiping from the first portion (210F) to the third portion (210T) via the second portion (210S) to enable location recording.
[00040] In another embodiment, a tenth interaction comprises interacting from the upper zone (210SU) to the lower zone (210SL) via the intermediate zone (210SI). For example, swiping from the upper zone (210SU) to the lower zone (210SL) via the intermediate zone (210SI) to disable location recording.
[00041] In an embodiment, each of the plurality of interactions results in one of the identified actions. The identified actions comprise accepting an incoming call, rejecting the incoming call, rejecting the incoming call while sending a template text, and merging multiple calls. For example, swiping from the first sensor D to the fifth sensor B via any of the second sensor A, the third sensor E, or the fourth sensor C to accept an incoming call.
[00042] In another embodiment, the plurality of interactions comprises playing a succeeding music option, playing a preceding music option, playing or pausing an ongoing music option, increasing audio intensity and decreasing the audio intensity. For example, swiping from the second sensor A to the fifth sensor B via the first sensor D and the fourth sensor C in an anticlockwise direction in a circular motion to decrease audio intensity.
[00043] In an embodiment, the plurality of interactions comprises entering a ride link mode, exiting the ride link mode, muting the ride link mode, entering a ride grid mode, exiting the ride grid mode, and merging the ride grid mode. For example, a single tap on the second sensor A to enter the ride link mode, a single tap on the fourth sensor C to exit the ride link mode, and a single tap on the third sensor E to mute the ride link mode. In another working example, a double tap on the fifth sensor B to enter the ride grid mode, a double tap on the first sensor D to exit the ride grid mode and a double tap on the third sensor E to merge the ride grid mode.
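The interactions described above amount to a lookup from a sensed pattern to a pre-defined action. A minimal sketch, assuming each pattern is reduced to an ordered sensor sequence (via-sensors collapsed to the endpoints for brevity); the dictionary structure and action strings are illustrative, not the patent's implementation.

```python
# Illustrative mapping of sensed sensor sequences to pre-defined actions,
# drawn from the examples above. All names are editorial assumptions.
PATTERN_ACTIONS = {
    ("D", "B"): "accept incoming call",            # first interaction
    ("B", "D"): "reject incoming call",            # second interaction
    ("A", "C"): "merge multiple calls",            # fourth interaction
    ("C", "A"): "reject call with template text",  # fifth interaction
    ("B",): "play succeeding music option",        # sixth interaction (taps)
    ("D",): "play preceding music option",
    ("E",): "play or pause ongoing music option",
}

def identify_action(sequence):
    """Map a sensed sensor sequence to its pre-defined action, if any."""
    return PATTERN_ACTIONS.get(tuple(sequence))
```

An unrecognized sequence simply maps to no action.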
[00044] In an embodiment, the plurality of interactions comprises enabling a voice assist option, disabling the voice assist option, enabling audio recording, disabling audio recording, enabling recording of location and disabling recording of location. For example, swiping from the second sensor A to the fifth sensor B
via the first sensor D and the fourth sensor C in an anticlockwise direction in a linear motion to disable recording.
[00045] Fig. 3 illustrates a block diagram (300) depicting components of the apparatus (100), in accordance with an embodiment of the present subject matter. The apparatus (100) comprises a plurality of sensors (210) arranged in the grid (110). The memory (320) is communicatively coupled to the processor (310). Herein, the memory (320) stores processor instructions, which, on execution, cause the processor (310) to receive sensed input from the plurality of sensors (210). The sensed input corresponds to one or more patterns associated with the grid (110). The processor (310) identifies actions associated with each of the sensed inputs and executes the identified actions.
[00046] The processor (310) may include suitable logic, circuitry, interfaces, and/or code that may be configured to execute a set of instructions stored in the memory (320). The processor (310) may be implemented based on a number of processor technologies known in the art. Examples of the processor (310) include, but are not limited to, an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors.
[00047] The memory (320) comprises suitable logic, circuitry, interfaces, and/or code that is configured to store the set of instructions, which may be executed. In an embodiment, the memory (320) may be configured to store one or more programs, routines, or scripts that may be executed in coordination with the processor (310). The memory (320) may be implemented based on a Random Access Memory (RAM), a Read-Only Memory (ROM), a Hard Disk Drive (HDD), a storage server, and/or a Secure Digital (SD) card.
[00048] Fig. 4 illustrates a flowchart of a method (400) for controlling the operation of the apparatus (100), in accordance with an embodiment of the present subject matter. The flowchart depicts the various steps of the method (400) that are performed for controlling the operation of the apparatus (100).
[00049] The method (400) comprises various steps. Herein, a first step (410) is to receive inputs sensed by a plurality of sensors (210). The inputs correspond to one or more patterns. A second step (420) is to identify, by a processor, pre-defined actions associated with the inputs corresponding to the one or more patterns. A third step (430) is to execute, by the processor (310), the pre-defined actions associated with the sensed inputs. In an embodiment, the plurality of sensors (210) is disposed in the grid (110). In another embodiment, the sensed input corresponds to the one or more patterns associated with the grid (110).
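The three steps of the method (400) can be sketched as a receive-identify-execute routine. All function and parameter names below are hypothetical illustrations of steps (410), (420) and (430), not the specification's code.

```python
# Hedged sketch of method (400): receive sensed inputs, identify the
# pre-defined action for the pattern, and execute it via a callback.
def method_400(sensed_inputs, pattern_actions, execute):
    """Run the receive (410), identify (420), execute (430) steps once."""
    pattern = tuple(sensed_inputs)          # step 410: receive sensed inputs
    action = pattern_actions.get(pattern)   # step 420: identify pre-defined action
    if action is not None:
        execute(action)                     # step 430: execute the action
    return action
```

The `execute` callback stands in for whatever actuation the apparatus performs (answering a call, adjusting audio, and so on).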
[00050] In an embodiment, the plurality of sensors (210) is arranged in the grid (110). The grid (110) comprises a first portion (210F), a second portion (210S), and a third portion (210T). The first portion (210F) is provided with a first sensor D. The second portion (210S) is divided into an upper zone (210SU), an intermediate zone (210SI), and a lower zone (210SL). The upper zone (210SU) is provided with a second sensor A. The intermediate zone (210SI) is provided with a third sensor E and the lower zone (210SL) is provided with a fourth sensor C. The third portion (210T) is configured to have a fifth sensor B. In an embodiment, the grid (110) comprises a plurality of regions corresponding to each of the plurality of sensors (210), wherein each of the plurality of regions is defined using a pre-defined periphery around each of the plurality of sensors (210).
[00051] In an embodiment, the pre-defined periphery comprises at least one of a circular region, a rectangular region, a quadrilateral region, or a polygonal region.
[00052] In an embodiment, the sensed input is defined, by a user, by an interaction from at least one of the plurality of regions to at least a remaining one of the plurality of regions.
[00053] In an embodiment, the interaction comprises at least one of a touch input, a swipe input, a gesture-based input, a pressure-based input, or a combination thereof.
[00054] In an embodiment, the grid (110) is configured to have a plurality of interactions. The plurality of interactions comprises ten interactions. A first interaction comprises interacting from the first sensor D to the fifth sensor B via one of the second sensor A, or the third sensor E or the fourth sensor C or a combination thereof. A second interaction comprises interacting from the fifth sensor B to the first sensor D via one of the second sensor A, or the third sensor E or the fourth sensor C or a combination thereof. A third interaction comprises interacting from the third sensor E to any of the first sensor D, the second sensor A, the fourth sensor C, the fifth sensor B or a combination thereof. The third interaction comprises interacting from the second sensor A to any of the first sensor D, the third sensor E, the fourth sensor C, the fifth sensor B or a combination thereof. The third interaction also comprises interacting from the fourth sensor C to any of the first sensor D, the second sensor A, the third sensor E, the fifth sensor B or a combination thereof. A fourth interaction comprises interacting from the second sensor A to the fourth sensor C via one of the first sensor D, or the third sensor E or the fifth sensor B or a combination thereof. A fifth interaction comprises interacting from the fourth sensor C to the second sensor A via one of the first sensor D, or the third sensor E or the fifth sensor B or a combination thereof. A sixth interaction comprises interacting with any of the plurality of the sensors (210) individually. A seventh interaction comprises interacting from the second sensor A to the first sensor D via the fifth sensor B and the fourth sensor C, in a clockwise direction in a predefined motion. An eighth interaction comprises interacting from the second sensor A to the fifth sensor B via the first sensor D and the fourth sensor C, in an anticlockwise direction in the predefined motion. A ninth interaction comprises interacting from the first portion (210F) to the third portion (210T) via the second portion (210S). A tenth interaction comprises interacting from the upper zone (210SU) to the lower zone (210SL) via the intermediate zone (210SI).
[00055] In an embodiment, each of the plurality of interactions results in one of the identified actions. The identified actions comprise accepting an incoming
call, rejecting the incoming call, rejecting the incoming call while sending a template text, and merging multiple calls. The identified actions further comprise playing a succeeding music option, playing a preceding music option, playing or pausing an ongoing music option, increasing audio intensity, and decreasing the audio intensity. The identified actions also comprise entering a ride link mode, exiting the ride link mode, muting the ride link mode, entering a ride grid mode, exiting the ride grid mode, and merging the ride grid mode. The identified actions comprise enabling a voice assist option, enabling audio recording, disabling audio recording, enabling recording of location, and disabling recording of location.
[00056] The present subject matter offers an advantage of providing an apparatus having a plurality of sensors which provides an improved user experience by eliminating user fatigue. Another advantage of the present invention is solving ergonomic problems by providing easy operation of the plurality of sensors by the user. Another advantage of the present invention is increasing safety of the users, as easy operation of the apparatus helps the rider to focus and avoid distraction while riding a vehicle. Another advantage of the present invention is providing prompt responses of actions once the user has provided required inputs to the plurality of sensors. This avoids lag between input and output. Another advantage of the present invention is providing an easy way to operate the apparatus by eliminating buttons and providing a plurality of sensors to perform certain functions. Another advantage of the present invention is providing flexibility to the users, as the apparatus having a grid made up of the plurality of sensors offers more freedom as compared to physical buttons. This further provides a seamless user experience while interacting with the apparatus.
[00057] The present invention also offers the advantage of eliminating the need for bulky buttons or visible protrusions, which look clumsy and may confuse the user. Another advantage of the present invention is providing an apparatus that has aesthetic appeal and is easy to use. Another advantage of the present invention is providing a wider range of inputs. The present invention also offers the advantage of differentiating between various types of interaction, such as tap, swipe, hold, and multi-touch gestures, thereby providing possibilities for richer user interactions and more intuitive operation. Another advantage of the present invention is providing enhanced durability and reliability: unlike physical buttons, the sensors have no moving parts, making them less susceptible to wear and tear, breakage, or jamming. This is especially advantageous in harsh environments or for apparatus used intensively. Another advantage of the present invention is providing easy cleaning and maintenance, as the smooth surface of the sensors makes them easier to clean and maintain than buttons with crevices or intricate mechanisms. This is beneficial when the apparatus is used in hygienic environments or exposed to dirt and debris.
[00058] Thus, by claiming an apparatus and a method to control the operation of the apparatus, the present subject matter offers the numerous advantages mentioned above.
[00059] While certain features of the claimed subject matter have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes that fall within the true spirit of the claimed subject matter.
Claims:
We claim:
1. An apparatus (100), the apparatus (100) comprising:
a processor (310);
a memory (320), the memory (320) being communicatively coupled to the processor (310);
wherein the memory (320) stores processor instructions, which, on execution, cause the processor (310) to:
receive sensed input from a plurality of sensors (210), wherein the sensed input corresponds to one or more patterns associated with a grid (110);
wherein the plurality of sensors (210) being disposed in the grid (110), wherein the grid (110) is defined based on a location of disposition of the plurality of sensors (210);
identify actions associated with each sensed input; and
execute the identified actions;
wherein the grid (110) being located on at least one side of the apparatus (100).
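The receive, identify, and execute steps of claim 1 could be sketched in code as follows. This is an illustrative assumption only: the `PatternController` class, the sensor IDs, and the action names are invented for the example and are not part of the specification.

```python
# Illustrative sketch of claim 1's receive -> identify -> execute flow.
# Sensor IDs, patterns, and actions are hypothetical examples.

class PatternController:
    def __init__(self, action_table):
        # action_table maps a sensed pattern (a tuple of sensor IDs)
        # to a pre-defined action name.
        self.action_table = action_table

    def receive(self, sensed_input):
        # sensed_input: ordered sensor IDs reported by the grid, e.g. ["D", "A", "B"]
        return tuple(sensed_input)

    def identify(self, pattern):
        # Look up the pre-defined action for the sensed pattern, if any.
        return self.action_table.get(pattern)

    def execute(self, action):
        # A real device would trigger hardware or software behaviour here;
        # this sketch simply returns the action name that would be performed.
        return action

    def handle(self, sensed_input):
        pattern = self.receive(sensed_input)
        action = self.identify(pattern)
        return self.execute(action) if action else None


controller = PatternController({("D", "A", "B"): "accept_call"})
result = controller.handle(["D", "A", "B"])  # -> "accept_call"
```

An unrecognised pattern simply yields no action, which mirrors the claim's requirement that only identified actions are executed.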
2. The apparatus (100) as claimed in claim 1, wherein the plurality of sensors (210) being arranged in the grid (110), wherein the grid (110) comprises:
a first portion (210F);
a second portion (210S); and
a third portion (210T);
wherein the first portion (210F) being provided with a first sensor D;
the second portion (210S) being divided into:
an upper zone (210SU), the upper zone (210SU) being provided with a second sensor A;
an intermediate zone (210SI), the intermediate zone (210SI) being provided with a third sensor E;
a lower zone (210SL), the lower zone (210SL) being provided with a fourth sensor C; and
the third portion (210T) being provided with a fifth sensor B.
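One possible in-memory representation of the grid layout of claim 2 is a nested mapping of portions and zones to their sensors. The key names below mirror the reference numerals, but this particular data structure is an illustrative assumption, not something taken from the specification.

```python
# Hypothetical representation of the claim 2 grid:
# first portion (210F) -> sensor D; second portion (210S) split into
# upper (210SU, sensor A), intermediate (210SI, sensor E), lower
# (210SL, sensor C); third portion (210T) -> sensor B.

GRID_LAYOUT = {
    "first_portion_210F": {"sensor": "D"},
    "second_portion_210S": {
        "upper_zone_210SU": {"sensor": "A"},
        "intermediate_zone_210SI": {"sensor": "E"},
        "lower_zone_210SL": {"sensor": "C"},
    },
    "third_portion_210T": {"sensor": "B"},
}

def sensors_in(layout):
    """Collect every sensor ID from the nested layout."""
    found = []
    for value in layout.values():
        if "sensor" in value:
            found.append(value["sensor"])
        else:
            found.extend(sensors_in(value))
    return found

print(sorted(sensors_in(GRID_LAYOUT)))  # ['A', 'B', 'C', 'D', 'E']
```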
3. The apparatus (100) as claimed in claim 1, wherein the grid (110) comprises a plurality of regions corresponding to each of the plurality of sensors (210), wherein each of the plurality of regions being defined using a pre-defined periphery around each of the plurality of sensors (210).
4. The apparatus (100) as claimed in claim 3, wherein the pre-defined periphery comprises at least one of a circular region, a rectangular region, a quadrilateral region, or a polygonal region.
5. The apparatus (100) as claimed in claim 3, wherein the sensed input being defined by a user, by an interaction from at least one of the plurality of regions to at least a remaining of the plurality of regions.
6. The apparatus (100) as claimed in claim 5, wherein the interaction comprises at least one of a touch input, a swipe input, a gesture-based input, a pressure-based input, or a combination thereof.
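The four input types of claim 6 could be distinguished from raw interaction measurements roughly as follows. The thresholds and parameter names here are invented for illustration; the specification does not define how the input types are discriminated.

```python
# Hedged sketch: classify a raw interaction into the input types named
# in claim 6 (touch, swipe, gesture-based, pressure-based).
# All thresholds are hypothetical.

def classify_interaction(duration_ms, distance_px, pressure):
    if pressure > 0.8:                     # firm press on the sensor
        return "pressure_based_input"
    if distance_px > 30:                   # finger moved across the grid
        return "swipe_input"
    if duration_ms > 500:                  # long dwell treated as a gesture
        return "gesture_based_input"
    return "touch_input"                   # short, stationary, light contact

print(classify_interaction(120, 5, 0.2))   # touch_input
print(classify_interaction(200, 80, 0.2))  # swipe_input
```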
7. The apparatus (100) as claimed in claim 2, wherein the grid (110) being configured to have a plurality of interactions, the plurality of interactions comprises:
a first interaction, the first interaction comprises interacting from the first sensor D to the fifth sensor B via one of the second sensor A, the third sensor E, the fourth sensor C, or a combination thereof;
a second interaction, the second interaction comprises interacting from the fifth sensor B to the first sensor D via one of the second sensor A, the third sensor E, the fourth sensor C, or a combination thereof;
a third interaction, the third interaction comprises:
interacting from the third sensor E to any of the first sensor D, the second sensor A, the fourth sensor C, the fifth sensor B, or a combination thereof;
interacting from the second sensor A to any of the first sensor D, the third sensor E, the fifth sensor B, or a combination thereof;
interacting from the fourth sensor C to any of the first sensor D, the third sensor E, the fifth sensor B, or a combination thereof;
a fourth interaction, the fourth interaction comprises interacting from the second sensor A to the fourth sensor C via one of the first sensor D, the third sensor E, the fifth sensor B, or a combination thereof;
a fifth interaction, the fifth interaction comprises interacting from the fourth sensor C to the second sensor A via one of the first sensor D, the third sensor E, the fifth sensor B, or a combination thereof;
a sixth interaction, the sixth interaction comprises interacting with any of the plurality of sensors (210) individually;
a seventh interaction, the seventh interaction comprises interacting from the second sensor A to the first sensor D via the fifth sensor B and the fourth sensor C, in a clockwise direction in a predefined motion;
an eighth interaction, the eighth interaction comprises interacting from the second sensor A to the fifth sensor B via the first sensor D and the fourth sensor C, in an anticlockwise direction in the predefined motion;
a ninth interaction, the ninth interaction comprises interacting from the first portion (210F) to the third portion (210T) via the second portion (210S); and
a tenth interaction, the tenth interaction comprises interacting from the upper zone (210SU) to the lower zone (210SL) via the intermediate zone (210SI).
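Several of the interactions in claim 7 share one shape: a swipe from a start sensor to an end sensor through one or more permitted intermediate sensors. A minimal matcher for that shape could look like the sketch below; the function name and the sequence encoding are assumptions made for illustration.

```python
# Illustrative matcher for the swipe-style interactions of claim 7.
# A sensed interaction is encoded as an ordered list of sensor IDs.

def matches_interaction(sequence, start, end, via):
    """True if `sequence` runs start -> end through at least one sensor in `via`."""
    if len(sequence) < 3 or sequence[0] != start or sequence[-1] != end:
        return False
    # At least one intermediate sensor must belong to the permitted set.
    return any(s in via for s in sequence[1:-1])

# First interaction: D to B via A, E, or C (or a combination thereof).
print(matches_interaction(["D", "E", "B"], "D", "B", {"A", "E", "C"}))  # True
# Second interaction: B back to D through the same permitted sensors.
print(matches_interaction(["B", "A", "D"], "B", "D", {"A", "E", "C"}))  # True
```

The circular seventh and eighth interactions would need an additional ordering check (clockwise versus anticlockwise), which this sketch omits.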
8. The apparatus (100) as claimed in claim 7, wherein each of the plurality of interactions results in one of the identified actions, the identified actions comprising:
accepting an incoming call, rejecting the incoming call, rejecting the incoming call while sending a template text, and merging multiple calls;
playing succeeding music option, playing preceding music option, playing or pausing ongoing music option;
increasing audio intensity, decreasing the audio intensity;
entering a ride link mode, exiting the ride link mode, and muting the ride link mode;
entering a ride grid mode, exiting the ride grid mode, and merging the ride grid mode;
enabling a voice assist option;
enabling audio recording, disabling audio recording; and
enabling recording of location, disabling recording of location.
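Claim 8 maps each interaction to one identified action, which in software is naturally a dispatch table. The claim does not fix which interaction triggers which action, so the particular pairing below is purely illustrative.

```python
# Hypothetical interaction-to-action dispatch table for claim 8.
# The pairings are invented; only the action names come from the claim set.

ACTION_TABLE = {
    "first_interaction": "accept_incoming_call",
    "second_interaction": "reject_incoming_call",
    "third_interaction": "play_or_pause_music",
    "fourth_interaction": "increase_audio_intensity",
    "fifth_interaction": "decrease_audio_intensity",
    "sixth_interaction": "enable_voice_assist",
    "seventh_interaction": "enter_ride_link_mode",
    "eighth_interaction": "exit_ride_link_mode",
    "ninth_interaction": "enable_audio_recording",
    "tenth_interaction": "enable_location_recording",
}

def action_for(interaction):
    # Unknown interactions fall through to a harmless no-op.
    return ACTION_TABLE.get(interaction, "no_action")

print(action_for("first_interaction"))  # accept_incoming_call
```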
9. A method (400) for controlling an operation of an apparatus (100), the method (400) comprising:
receiving (410), inputs sensed by a plurality of sensors (210), wherein the inputs correspond to one or more patterns;
identifying (420), by a processor (310), pre-defined actions associated with the inputs corresponding to the one or more patterns; and
executing (430), by the processor (310), the pre-defined actions associated with the sensed inputs;
wherein the plurality of sensors (210) being disposed in a grid (110);
wherein the sensed input corresponds to the one or more patterns associated with the grid (110).
10. The method (400) as claimed in claim 9, wherein the plurality of sensors (210) being arranged in the grid (110), wherein the grid (110) comprises:
a first portion (210F);
a second portion (210S); and
a third portion (210T);
wherein the first portion (210F) being provided with a first sensor D;
the second portion (210S) being divided into:
an upper zone (210SU), the upper zone (210SU) being provided with a second sensor A;
an intermediate zone (210SI), the intermediate zone (210SI) being provided with a third sensor E;
a lower zone (210SL), the lower zone (210SL) being provided with a fourth sensor C; and
the third portion (210T) being provided with a fifth sensor B.
11. The method (400) as claimed in claim 9, wherein the grid (110) comprises a plurality of regions corresponding to each of the plurality of sensors (210), wherein each of the plurality of regions being defined using a pre-defined periphery around each of the plurality of sensors (210).
12. The method (400) as claimed in claim 11, wherein the pre-defined periphery comprises at least one of a circular region, a rectangular region, a quadrilateral region, or a polygonal region.
13. The method (400) as claimed in claim 11, wherein the sensed input being defined, by a user, by an interaction from at least one of the plurality of regions to at least a remaining of the plurality of regions.
14. The method (400) as claimed in claim 13, wherein the interaction comprises at least one of a touch input, a swipe input, a gesture-based input, a pressure-based input, or a combination thereof.
15. The method (400) as claimed in claim 10, wherein the grid (110) being configured to have a plurality of interactions, the plurality of interactions comprises:
a first interaction, the first interaction comprises interacting from the first sensor D to the fifth sensor B via one of the second sensor A, the third sensor E, the fourth sensor C, or a combination thereof;
a second interaction, the second interaction comprises interacting from the fifth sensor B to the first sensor D via one of the second sensor A, the third sensor E, the fourth sensor C, or a combination thereof;
a third interaction, the third interaction comprises:
interacting from the third sensor E to any of the first sensor D, the second sensor A, the fourth sensor C, the fifth sensor B, or a combination thereof;
interacting from the second sensor A to any of the first sensor D, the third sensor E, the fourth sensor C, the fifth sensor B, or a combination thereof;
interacting from the fourth sensor C to any of the first sensor D, the second sensor A, the third sensor E, the fifth sensor B, or a combination thereof;
a fourth interaction, the fourth interaction comprises interacting from the second sensor A to the fourth sensor C via one of the first sensor D, the third sensor E, the fifth sensor B, or a combination thereof;
a fifth interaction, the fifth interaction comprises interacting from the fourth sensor C to the second sensor A via one of the first sensor D, the third sensor E, the fifth sensor B, or a combination thereof;
a sixth interaction, the sixth interaction comprises interacting with any of the plurality of sensors (210) individually;
a seventh interaction, the seventh interaction comprises interacting from the second sensor A to the first sensor D via the fifth sensor B and the fourth sensor C, in a clockwise direction in a predefined motion;
an eighth interaction, the eighth interaction comprises interacting from the second sensor A to the fifth sensor B via the first sensor D and the fourth sensor C, in an anticlockwise direction in the predefined motion;
a ninth interaction, the ninth interaction comprises interacting from the first portion (210F) to the third portion (210T) via the second portion (210S); and
a tenth interaction, the tenth interaction comprises interacting from the upper zone (210SU) to the lower zone (210SL) via the intermediate zone (210SI).
16. The method (400) as claimed in claim 15, wherein each of the plurality of interactions results in one of the identified actions, the identified actions comprising:
accepting an incoming call, rejecting the incoming call, rejecting the incoming call while sending a template text, and merging multiple calls;
playing succeeding music option, playing preceding music option, playing or pausing ongoing music option;
increasing audio intensity, decreasing the audio intensity;
entering a ride link mode, exiting the ride link mode, and muting the ride link mode;
entering a ride grid mode, exiting the ride grid mode, and merging the ride grid mode;
enabling a voice assist option;
enabling audio recording, disabling audio recording; and
enabling recording of location, disabling recording of location.

Documents

Application Documents

# Name Date
1 202441005582-STATEMENT OF UNDERTAKING (FORM 3) [25-01-2024(online)].pdf 2024-01-25
2 202441005582-REQUEST FOR EXAMINATION (FORM-18) [25-01-2024(online)].pdf 2024-01-25
3 202441005582-FORM 18 [25-01-2024(online)].pdf 2024-01-25
4 202441005582-FORM 1 [25-01-2024(online)].pdf 2024-01-25
5 202441005582-FIGURE OF ABSTRACT [25-01-2024(online)].pdf 2024-01-25
6 202441005582-DRAWINGS [25-01-2024(online)].pdf 2024-01-25
7 202441005582-COMPLETE SPECIFICATION [25-01-2024(online)].pdf 2024-01-25